<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Recent changes to 25: Possible Bug with Backprop</title><link>https://sourceforge.net/p/neuroph/bugs/25/</link><description>Recent changes to 25: Possible Bug with Backprop</description><language>en</language><lastBuildDate>Sat, 13 Oct 2012 12:37:43 -0000</lastBuildDate><atom:link href="https://sourceforge.net/p/neuroph/bugs/25/feed.rss" rel="self" type="application/rss+xml"/><item><title>Possible Bug with Backprop</title><link>https://sourceforge.net/p/neuroph/bugs/25/</link><description>&lt;div class="markdown_content"&gt;&lt;p&gt;Dear Developers,&lt;/p&gt;
&lt;p&gt;I would like to ask you to check the Multi Layer Perceptron that uses a Bias Neuron with the Tanh transfer function and the Backpropagation with Momentum learning rule. &lt;/p&gt;
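For reference, the weight update that a backpropagation-with-momentum rule is expected to perform on every pattern is dw(t) = lr * delta * input + momentum * dw(t-1). The sketch below is not Neuroph code; the XOR data, layer sizes, and hyperparameters are my own assumptions. It only illustrates the behaviour I would expect from a correct implementation: the weights change during training, and the MSE moves away from its initial value.

```python
# Minimal sketch (assumed setup, not Neuroph's implementation) of an MLP
# with bias inputs, tanh activation, and backpropagation with momentum.
import math, random

random.seed(42)

# XOR training set; targets in (-1, 1) to suit a tanh output unit.
DATA = [([0.0, 0.0], -1.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], -1.0)]

N_HID = 3
LR, MOMENTUM = 0.1, 0.8   # illustrative values, not Neuroph defaults

# w1[j][i]: weight from input i (last index is the bias input) to hidden j
w1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(N_HID)]
w2 = [[random.uniform(-0.5, 0.5) for _ in range(N_HID + 1)]]
dw1 = [[0.0] * 3 for _ in range(N_HID)]
dw2 = [[0.0] * (N_HID + 1)]
w1_init = [row[:] for row in w1]   # snapshot to verify weights change

def forward(x):
    xi = x + [1.0]                 # append bias input
    h = [math.tanh(sum(wj[i] * xi[i] for i in range(3))) for wj in w1]
    hb = h + [1.0]                 # bias unit for the output layer
    y = math.tanh(sum(w2[0][i] * hb[i] for i in range(N_HID + 1)))
    return xi, hb, y

def epoch():
    sse = 0.0
    for x, t in DATA:
        xi, hb, y = forward(x)
        err = t - y
        sse += err * err
        d_out = err * (1 - y * y)            # tanh derivative at the output
        d_hid = [d_out * w2[0][j] * (1 - hb[j] * hb[j])
                 for j in range(N_HID)]      # skip the bias unit
        # momentum update: dw(t) = lr * delta * input + momentum * dw(t-1)
        for i in range(N_HID + 1):
            dw2[0][i] = LR * d_out * hb[i] + MOMENTUM * dw2[0][i]
            w2[0][i] += dw2[0][i]
        for j in range(N_HID):
            for i in range(3):
                dw1[j][i] = LR * d_hid[j] * xi[i] + MOMENTUM * dw1[j][i]
                w1[j][i] += dw1[j][i]
    return sse / len(DATA)

mse_first = epoch()
for _ in range(2000):
    mse_last = epoch()
print(round(mse_first, 4), round(mse_last, 4))
```

If the weights were never updated, w1 would equal its initial snapshot and the MSE would be identical on every run over the same data, which matches the symptom reported below.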
&lt;p&gt;I created and tested such a network with 10 different training and test sets. The total mean square errors are suspiciously similar: if the first test gives an MSE of 0.111112345, then no matter which training and test sets I use later, the MSE will be something like 0.11111xxxx. I therefore suspect that the network's weights are not updated during the training stage. I can reproduce the same behaviour on two PCs and one netbook. &lt;/p&gt;&lt;/div&gt;</description><dc:creator xmlns:dc="http://purl.org/dc/elements/1.1/">Anonymous</dc:creator><pubDate>Sat, 13 Oct 2012 12:37:43 -0000</pubDate><guid>https://sourceforge.net5f0f686d7dd9bf88e6580925178c2d517bdbfa90</guid></item></channel></rss>