sanjay
Full Member level 1
Hi all,
I am trying to understand the concept of the Bias-Variance Tradeoff in neural networks. I can follow the maths and the way the equations are derived, but I am unable to follow the literature when it tries to explain the tradeoff itself.
E{MSE} = Var{Noise} + Bias² + Var{Model Output}
I understand that we want to minimize both bias and variance, but when the literature says that increasing variance alters bias, and vice versa, I cannot follow the explanation; I personally find it poorly explained (maybe because I don't understand it). Can anyone explain the concept in simple words, or give examples that illustrate it? That would be a great help.
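To make the equation concrete for myself, I put together the small toy experiment below (my own sketch in Python/NumPy, not taken from the attached document). It refits polynomials of different degrees to many freshly drawn training sets and then measures the Bias² and Var{Model Output} terms of the decomposition directly. My understanding is that a low-degree fit ends up with high bias but low variance, and a high-degree fit the opposite, which is the tradeoff the literature seems to describe. Is this the right picture?

Code:
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # the "true" underlying function, which we pretend is unknown
    return np.sin(2 * np.pi * x)

def bias_variance(degree, n_train=20, n_trials=500, noise_std=0.3):
    x_test = np.linspace(0.0, 1.0, 100)
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        # a fresh training set each trial, so the fitted model varies from trial to trial
        x = rng.uniform(0.0, 1.0, n_train)
        y = true_f(x) + rng.normal(0.0, noise_std, n_train)
        coeffs = np.polyfit(x, y, degree)
        preds[t] = np.polyval(coeffs, x_test)
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_f(x_test)) ** 2)   # Bias^2 term
    variance = np.mean(preds.var(axis=0))                   # Var{Model Output} term
    return bias_sq, variance

for deg in (1, 3, 9):
    b2, var = bias_variance(deg)
    print(f"degree {deg}: bias^2 = {b2:.4f}  variance = {var:.4f}")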
Also, how does applying regularization techniques (weight decay, etc.) or adding noise to the input data set form an "ACCEPTABLE SOLUTION" that keeps both bias and variance small?
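For the weight-decay part, my current (possibly wrong) understanding is that the penalty shrinks the weights, which lowers the variance term at the cost of some extra bias, and an intermediate penalty gives the "acceptable" compromise. Here is the same kind of toy sketch (again my own hypothetical example, not the attached document's method), using ridge regression on a degree-9 polynomial basis as a stand-in for weight decay:

Code:
import numpy as np

rng = np.random.default_rng(1)

def true_f(x):
    return np.sin(2 * np.pi * x)

def poly_features(x, degree=9):
    # polynomial basis [x^degree, ..., x, 1]
    return np.vander(x, degree + 1)

def ridge_fit(X, y, lam):
    # closed-form weight-decay (ridge) solution: w = (X^T X + lam*I)^-1 X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def bias_variance(lam, n_train=20, n_trials=500, noise_std=0.3):
    x_test = np.linspace(0.0, 1.0, 100)
    X_test = poly_features(x_test)
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        x = rng.uniform(0.0, 1.0, n_train)
        y = true_f(x) + rng.normal(0.0, noise_std, n_train)
        w = ridge_fit(poly_features(x), y, lam)
        preds[t] = X_test @ w
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias_sq, variance

for lam in (1e-8, 1e-4, 1e-2, 1.0):
    b2, var = bias_variance(lam)
    print(f"lambda = {lam:g}: bias^2 = {b2:.4f}  variance = {var:.4f}")

When I sweep lambda, very small values give low bias but large variance, very large values do the reverse, and the total error seems smallest somewhere in between. Is that what is meant by an acceptable solution?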
The attached document is the simplest to read of all the literature I have come across so far.
Regards