Strange results with Matlab Neural Network Toolbox


count_enable (Newbie level 3, joined Apr 18, 2012)
Hi, the problem is:
I created and trained a network with the NN Toolbox and exported the weights and biases. The network works perfectly when evaluated with sim(net, x) or just net(x). But when I try to reproduce the result with pen and paper, using the same weights and biases, the output is completely different.
For testing I created a single-layer net with the PURELIN activation function. With all weights and biases set to zero, every ANN textbook says the output should be zero for all inputs, but Matlab says it is 0.5! If I set the weights to 1 and the biases to 0 (with the PURELIN function the output should just be the arithmetic sum of the inputs), the result is even stranger: for input [0;0] it is -0.0611, for [1;1] it is 1.0605, and for [2;2] it is 2.1821. I am confused: I have always taken Matlab's results as given, but I can't explain this. Any ideas?
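For reference, a minimal sketch of the pen-and-paper check described above, assuming the trained single-layer PURELIN network is already in the workspace as "net" and takes 2-element column inputs (the variable names are only illustrative). The comments point at the input/output processing functions that toolbox-created networks usually carry by default, since net(x) applies them while a bare W*x + b does not.

Code:
% Minimal sketch, assuming a single-layer PURELIN network "net"
% with 2-element column inputs; names are only for illustration.
x = [2; 2];                        % one of the test inputs from above

W = net.IW{1,1};                   % input weights of layer 1
b = net.b{1};                      % bias of layer 1

y_manual = purelin(W*x + b);       % pen-and-paper value: W*x + b
y_sim    = net(x);                 % same as sim(net, x), what the toolbox returns

% If y_manual and y_sim disagree, inspect the processing functions attached
% to the network: toolbox-created nets typically use removeconstantrows and
% mapminmax on inputs and outputs by default, so net(x) is not a bare W*x + b.
net.inputs{1}.processFcns
net.outputs{end}.processFcns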
 

Hey, I have the same problem with patternnet. Did you solve it?
 
