sanjay
Full Member level 1

Hi all,
My question concerns how noise should be handled when training a system for signal classification.
I am implementing a system in which the signals may or may not be drowned in noise. With the network designed so far, I get excellent (well, almost excellent) results from both training and simulation when there is no noise on the input signals. However, when I simulate the network with noisy inputs, the classification just falls apart. I have looked into the possibility of overfitting and related issues, but so far without success. Trying different values for the learning rate and momentum, and changing the size of the network, has not helped either.
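To make the setup concrete, here is a rough sketch (Python/NumPy, not my actual code) of the kind of backpropagation training with noise injected into the inputs that I have in mind. The layer sizes, learning rate, momentum, noise level, and the dummy data are all assumed values, just for illustration.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 16, 20, 4         # assumed network dimensions
lr, momentum, noise_std = 0.05, 0.9, 0.1  # assumed hyperparameters

# Random initial weights and momentum buffers
W1 = rng.normal(0, 0.1, (n_in, n_hidden)); dW1 = np.zeros_like(W1)
W2 = rng.normal(0, 0.1, (n_hidden, n_out)); dW2 = np.zeros_like(W2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Dummy training data: X are clean signals, T are one-hot class targets
X = rng.normal(0, 1, (200, n_in))
T = np.eye(n_out)[rng.integers(0, n_out, 200)]

for epoch in range(500):
    # Inject fresh Gaussian noise into the inputs every epoch, so the
    # network sees a different corrupted version of each signal each time.
    Xn = X + rng.normal(0, noise_std, X.shape)

    # Forward pass
    H = sigmoid(Xn @ W1)
    Y = sigmoid(H @ W2)

    # Backward pass (squared-error loss, sigmoid derivatives)
    err_out = (Y - T) * Y * (1 - Y)
    err_hid = (err_out @ W2.T) * H * (1 - H)

    # Gradient descent with momentum, averaged over the batch
    dW2 = momentum * dW2 - lr * (H.T @ err_out) / len(X)
    dW1 = momentum * dW1 - lr * (Xn.T @ err_hid) / len(X)
    W2 += dW2
    W1 += dW1

What I am unsure about is whether this kind of per-epoch noise injection is the right way to make the classifier robust, and how the training noise level should relate to the noise I expect at simulation time.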
Can anyone share their expert comments on adapting a neural network trained with backpropagation to classify signals in the presence of noise?
Regards
Any help would really be appreciated.