count_enable
Hi, the problem is:
Created and trained the network with NNToolbox, then exported the weights and biases. The network works perfectly when I use sim(net) or just net(x). But when I try to reproduce the result with pen and paper, using the same weights and biases, the output is completely different.
For testing I created a single-layer net with the PURELIN activation function and set all weights and biases to zero. According to every ANN textbook and Wikipedia article, the output should be zero for all inputs, but Matlab says it's 0.5! If I set the weights to 1 and the biases to 0 (with the PURELIN function the output must be the arithmetic sum of the inputs), the result is even stranger: for the input [0;0] it's -0.0611, for [1;1] it's 1.0605, and for [2;2] it's 2.1821. I'm confused: I've always trusted Matlab and taken its results as-is, but I can't explain this. Any ideas?
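For reference, the pen-and-paper computation being described is just a weighted sum plus a bias passed through the identity (PURELIN) activation. A minimal Python sketch of that hand calculation (function and variable names are mine, not from the toolbox), showing the values one would expect before any of Matlab's input/output preprocessing is applied:

```python
# Hand computation of a single PURELIN (linear-activation) neuron:
# y = W * x + b, i.e. the weighted sum of the inputs plus the bias.

def purelin_layer(W, b, x):
    """One linear neuron: dot(W, x) + b, with no squashing function."""
    return sum(w * xi for w, xi in zip(W, x)) + b

# Weights = 1, bias = 0: the output should be the plain sum of the inputs.
W = [1.0, 1.0]
b = 0.0
print(purelin_layer(W, b, [0.0, 0.0]))  # 0.0, not -0.0611
print(purelin_layer(W, b, [1.0, 1.0]))  # 2.0
print(purelin_layer(W, b, [2.0, 2.0]))  # 4.0

# All-zero weights and bias: the output is 0 for every input.
print(purelin_layer([0.0, 0.0], 0.0, [7.0, -3.0]))  # 0.0, not 0.5
```

Any gap between these values and what sim(net) returns would point to something outside the weighted sum itself, such as preprocessing applied to the inputs or outputs.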