Determine whether an integer is a palindrome. Do this without extra space.

Some hints:
Could negative integers be palindromes? (e.g., -1)

If you are thinking of converting the integer to a string, note the restriction on using extra space.

You could also try reversing the integer. However, if you have solved the problem “Reverse Integer”, you know that the reversed integer might overflow. How would you handle such a case?

There is a more generic way of solving this problem.

Solution:
The idea is to split the number into two halves: reverse the second half of the digits and compare it with the first half.

bool isPalindrome(int x) {
    if (x < 0) // negative numbers are not palindromes
        return false;
    if (x % 10 == 0 && x != 0) // numbers ending in 0 (10, 200, 1000) must be excluded
        return false;
    int y = 0;
    while (x > y) { // reverse the lower half of the digits into y
        y = y * 10 + x % 10;
        x = x / 10;
    }
    // even digit count: x == y; odd digit count: y holds one extra (middle) digit
    return x == y || x == y / 10;
}

This paper addresses the visualisation of image classification models. We consider two visualisation techniques based on computing the gradient of the class score with respect to the input image. The first generates an image that maximises the class score, thus visualising the notion of the class captured by a ConvNet. The second computes a class saliency map, specific to a given image and class.


DNN framework comparison

李沐: My understanding is that caffe/cxxnet suit people who need to apply DNNs to their own problems: prepare the data, use an existing network configuration (or tweak it slightly), and it runs. torch/minerva/theano suit people who like to tinker with DNNs; it is easy to write a script and get it running. (I haven't had time to look at purine yet.) In terms of implementation techniques, each library has its own characteristics; that could fill a long article.

yangqing: cudaconv is on the hacky side; caffe puts relatively more emphasis on stability; I'm not very familiar with minerva; cxxnet and purine lean toward lightweight + cutting edge; DIGITS is a wrapper, so it's not quite comparable to the others. That said, I really like cxxnet's templates and purine's operators, and my own code uses them too… And I admit caffe's current interface really is hard to put up with :P

Alex Smola: See this short video on YouTube. Key points:

  • Never use Matlab libraries for DNNs.
  • Never start from scratch (unless you want to learn how it is done).
  • Torch and Theano are good but slow; Caffe is decent; Minerva should be one of the best right now.

Define the Network

  1. Give the same name to the bottom and top blobs to perform in-place operations and save memory (e.g., ReLU).
  2. The input (data) layer produces two top blobs: data and label. The loss layer takes two bottom blobs: the label and the output of the fully connected layer.
  3. A layer definition can include rules for whether and when the layer is part of the network, like the one below:
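A sketch of such an include rule in Caffe's prototxt format; the layer name and blob names here are illustrative, not taken from a specific model:

```protobuf
layer {
  name: "accuracy"     # illustrative layer name
  type: "Accuracy"
  bottom: "fc8"        # illustrative blob names: network output and label
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST        # this layer is instantiated only in the TEST phase
  }
}
```

With this rule, the accuracy layer is built into the network only when it is run in the TEST phase and is omitted during training.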
