
Welcome to EDAboard.com


implementation of backprop using tansig doesn't work properly

Status
Not open for further replies.

meto

Junior Member level 1
Joined Apr 9, 2006

Hi, I'm implementing a neural network in Matlab to learn the XOR problem:

in1  in2 | out
 -1   -1 |  -1
  1   -1 |   1
 -1    1 |   1
  1    1 |  -1

It's a multi-layer feedforward network, and the neurons are either unipolar ('logsig') or bipolar ('tansig') neurons.

It learns successfully using logsig neurons, but with tansig the root-mean-square error converges to 0.5 and levels out there.
As far as I can tell, all my calculations are correct: I've checked them by hand using a calculator and created a crude model in Excel, and all the calculations match up.

I cannot for the life of me figure out why it does this with tansig when it trains successfully with logsig.

Can anyone offer any insights?

thanks
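For comparison, here is a minimal sketch of bipolar backprop on XOR in Python/NumPy (not the poster's Matlab code; the 2-4-1 architecture, learning rate, and variable names are my own assumptions). One classic cause of an RMSE plateau at exactly 0.5 after switching to tansig is reusing the logsig derivative y*(1-y), or keeping 0/1 targets, with tanh outputs; the tanh derivative is 1 - y^2, as used below.

```python
# Minimal bipolar (tansig/tanh) backprop for XOR -- an illustrative
# sketch, not the original poster's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Bipolar XOR data: inputs and targets in {-1, +1}
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
T = np.array([[-1], [1], [1], [-1]], dtype=float)

# 2-4-1 network; bias folded in as an extra weight row
W1 = rng.uniform(-0.5, 0.5, (3, 4))   # input(+bias) -> hidden
W2 = rng.uniform(-0.5, 0.5, (5, 1))   # hidden(+bias) -> output

def add_bias(a):
    return np.hstack([a, np.ones((a.shape[0], 1))])

eta = 0.1
for epoch in range(20000):
    # forward pass
    H = np.tanh(add_bias(X) @ W1)      # hidden activations
    Y = np.tanh(add_bias(H) @ W2)      # network output

    # backward pass: the tanh derivative is (1 - y^2), NOT y*(1 - y)
    dY = (T - Y) * (1.0 - Y**2)
    dH = (dY @ W2[:4].T) * (1.0 - H**2)

    # full-batch gradient-descent weight updates
    W2 += eta * add_bias(H).T @ dY
    W1 += eta * add_bias(X).T @ dH

# final forward pass and error
H = np.tanh(add_bias(X) @ W1)
Y = np.tanh(add_bias(H) @ W2)
rmse = np.sqrt(np.mean((T - Y)**2))
print(rmse)   # should end up well below 0.5
```

If the derivative term is swapped for the logsig form `Y * (1 - Y)`, or the targets are left at 0/1 while the output is tanh, training typically stalls, which matches the symptom described above.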
 


My experience is that logsig neurons are generally more useful than tansig; I have encountered many problems that tansig can't fit while logsig is fine.
However, for your problem (XOR) you could use the "hardlim" transfer function.
You should also keep the overfitting problem in mind at all times.
 


Thank you for your reply, but my data set values are between -1 and 1, so I must use tansig. If I use the Matlab toolbox (with tansig chosen), there isn't any problem, so tansig should also work in my own Matlab code.
 


It is quite strange, an XOR problem with non-Boolean data!
The input data range is not important. If your problem has real-valued output data, you can scale it to lie within the 0 to 1 range.
You may also try the "radbas" function; it is quite good from a statistical point of view (if you have enough data).
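The rescaling suggested above is a simple affine map. A quick sketch (plain Python; the function names are my own, and I assume the linear mapping y -> (y+1)/2):

```python
# Rescale bipolar {-1, +1} data to the [0, 1] range expected by a
# logsig/sigmoid output, and map network outputs back afterwards.
def to_unipolar(y):
    """Map a value in [-1, 1] to [0, 1]."""
    return (y + 1.0) / 2.0

def to_bipolar(y):
    """Inverse map: [0, 1] back to [-1, 1]."""
    return 2.0 * y - 1.0

print(to_unipolar(-1))   # 0.0
print(to_unipolar(1))    # 1.0
print(to_bipolar(0.25))  # -0.5
```

With targets rescaled this way, the logsig network that already trains successfully can be used unchanged, and its outputs mapped back to the bipolar range.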
 


Theoretically, in the books, bipolar data gives the best results, especially for easy applications like the XOR problem. In practice I have found that this is not always true, especially when the problem is complicated.

Not every activation function will train the network. In your question you said that the Matlab NN toolbox gets the result, so where exactly does the problem appear? Did you code the NN yourself without using the toolbox?
 

