Bias-Variance Dilemma in Neural Networks


sanjay

Hi all,

I am trying to understand the concept of the bias-variance tradeoff in neural networks. I can follow the maths and the way the equations are derived, but I am unable to follow the literature when it tries to explain the tradeoff itself:

E{MSE} = Var{Noise} + Bias² + Var{Output Value}
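Written out in full (with \hat{f}(x) the network output for one particular training set, f(x) the true function, y = f(x) + \varepsilon the noisy target, and the expectation taken over training sets and noise), this is the decomposition I mean:

E\big[(y - \hat{f}(x))^2\big]
  = \underbrace{\sigma_\varepsilon^2}_{\text{Var\{Noise\}}}
  + \underbrace{\big(E[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
  + \underbrace{E\big[(\hat{f}(x) - E[\hat{f}(x)])^2\big]}_{\text{Var\{Output Value\}}}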

I understand that we want to minimize both the bias and the variance. But when the literature says that increasing the variance alters the bias, and vice versa, I cannot follow the explanation; I personally find it poorly explained (maybe because I don't understand it). Can anyone explain the concept in simple words, or give an example that illustrates it? That would be a great help.
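To make the question concrete, here is the kind of toy experiment I have in mind (just a sketch in Python/numpy, not taken from the attached document; true_f, the polynomial degree used as a stand-in for model complexity, and all the parameter values are my own choices). It fits many independently sampled noisy data sets and estimates Bias² and Var{Output Value} at one test point for a simple and a complex model:

import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # the "true" function the model is supposed to learn
    return np.sin(2 * np.pi * x)

def bias_variance(degree, n_datasets=200, n_points=20, noise_std=0.3, x_test=0.5):
    preds = []
    for _ in range(n_datasets):
        x = rng.uniform(0, 1, n_points)
        y = true_f(x) + rng.normal(0, noise_std, n_points)  # noisy training targets
        coeffs = np.polyfit(x, y, degree)                   # least-squares polynomial fit
        preds.append(np.polyval(coeffs, x_test))            # prediction at the test point
    preds = np.array(preds)
    bias_sq = (preds.mean() - true_f(x_test)) ** 2          # (E[f_hat] - f)^2
    variance = preds.var()                                  # E[(f_hat - E[f_hat])^2]
    return bias_sq, variance

for degree in (1, 3, 9):
    b, v = bias_variance(degree)
    print(f"degree={degree}: bias^2={b:.4f}  variance={v:.4f}")

As far as I can tell, the low-degree fit gives a large bias² and a small variance, the high-degree fit gives the opposite, and that opposing movement is the tradeoff being described. Please correct me if that picture is wrong.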

Also, how does applying regularization techniques (weight decay, etc.) or adding noise to the input data set lead to an "acceptable solution" that keeps both the bias and the variance small?
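Along the same lines, this is how I picture weight decay in the toy setting above (again only a sketch under my own assumptions: ridge regression on polynomial features as a linear-in-parameters stand-in for a network with weight decay, with lam the penalty strength; features and ridge_fit are made-up helper names):

import numpy as np

rng = np.random.default_rng(1)

def true_f(x):
    return np.sin(2 * np.pi * x)

def features(x, degree=9):
    # polynomial feature matrix [1, x, x^2, ..., x^degree]
    return np.vander(x, degree + 1, increasing=True)

def ridge_fit(x, y, lam):
    # weight decay on a linear-in-parameters model:
    # minimize ||y - Xw||^2 + lam * ||w||^2
    X = features(x)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def bias_variance(lam, n_datasets=200, n_points=20, noise_std=0.3, x_test=0.5):
    preds = []
    for _ in range(n_datasets):
        x = rng.uniform(0, 1, n_points)
        y = true_f(x) + rng.normal(0, noise_std, n_points)
        w = ridge_fit(x, y, lam)
        preds.append((features(np.array([x_test])) @ w).item())
    preds = np.array(preds)
    return (preds.mean() - true_f(x_test)) ** 2, preds.var()

for lam in (0.0, 1e-3, 1e-1, 10.0):
    b, v = bias_variance(lam)
    print(f"lambda={lam:g}: bias^2={b:.4f}  variance={v:.4f}")

My (possibly wrong) reading is that increasing lam pushes the variance down while pushing the bias² up, and the "acceptable solution" is the penalty strength where the sum of the two is smallest. Is that the right way to understand the claim in the literature?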

The document attached is the one I found simplest to read out of all the literature I have come across so far.

Regards
 
