Possible Bug with Backprop
Brought to you by:
sevarac
Dear Developers,
I would like to ask you to check the Multi Layer Perceptron that uses a Bias Neuron with the Tanh transfer function and the Backpropagation with Momentum learning rule.
I created and tested such a NNet with 10 different training sets and test sets. The total Mean Square Errors are suspiciously similar. For example, if the first test gives an MSE of 0.111112345, then no matter which training set and test set I use afterwards, the MSE will be something like 0.11111xxxx. I also suspect that the NNet was not updated during the training stage. I can reproduce a similar situation on two PCs and one netbook.
Most likely this happens due to the Flat Spot correction:
http://www.heatonresearch.com/wiki/Flat_Spot_Problem
We needed this in order to make Resilient Propagation (RPROP) work.
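To illustrate what such a correction does, here is a minimal sketch (not Neuroph's actual code; the class and constant names are made up for this example). It shows the common approach from the linked article: adding a small constant to the Tanh derivative so it never reaches zero when the neuron saturates, which is what RPROP-style rules need to keep making progress.

```java
// Illustrative sketch of a flat-spot correction for the tanh derivative.
// FlatSpotDemo and FLAT_SPOT_CONST are hypothetical names, not Neuroph API.
public class FlatSpotDemo {

    // A small offset added to the derivative; the exact value is an
    // assumption here, chosen only to demonstrate the idea.
    static final double FLAT_SPOT_CONST = 0.1;

    // Plain tanh derivative: 1 - tanh(x)^2.
    // Approaches 0 for large |x| (the "flat spot"), stalling backprop.
    static double tanhDerivative(double x) {
        double y = Math.tanh(x);
        return 1.0 - y * y;
    }

    // Corrected derivative: never falls to zero, so weight updates
    // keep moving even when the neuron output saturates.
    static double correctedDerivative(double x) {
        return tanhDerivative(x) + FLAT_SPOT_CONST;
    }

    public static void main(String[] args) {
        // At x = 10 the plain derivative is essentially 0,
        // while the corrected one stays near the constant.
        System.out.printf("plain=%.6f corrected=%.6f%n",
                tanhDerivative(10.0), correctedDerivative(10.0));
    }
}
```

Note that because the correction changes every computed gradient, it can also slightly shift the final MSE of plain backpropagation runs, which may be related to the behaviour reported above.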