I've created an OCR system with MATLAB's Neural Network Toolbox.
I've used traingdx
net.trainParam.epochs = 8000;
net.trainParam.min_grad = 0.0000;
net.trainParam.goal = 10e-6;
I've noticed that when I use different goals I get different results (as expected of course).
The weird thing is that I have to "play" with the goal value to get good results. I expected that the lower the goal, the better the results and recognition. But I found that if I lower the goal to something like 10e-10, I actually get worse recognition results. Any idea why lowering the goal would decrease the correctness of the neural network?
I think it might have something to do with the network trying too hard to get the training data exactly right, so it doesn't cope as well with noise and variation.
My NN knowledge is a little rusty, but yes, training the network too much will overtrain (overfit) it. The network will then perform better on the training vectors you gave it, but worse on inputs it hasn't seen.
This is why you generally train on one set of vectors and then measure quality on a separate set of testing vectors. You can do the training iteratively: train on the training set to a certain goal accuracy, then check the results on your testing set, tighten the goal, and repeat. Stop training when the result on the testing set becomes worse than what you previously had.
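In MATLAB's Neural Network Toolbox this iterative check is essentially built in as validation-based early stopping: `train` splits the data and halts when validation error stops improving, instead of chasing a very low goal. A minimal sketch, assuming `feedforwardnet` is available; the hidden-layer size, split ratios, and `max_fail` value are illustrative assumptions, and `inputs`/`targets` stand for your own OCR data:

```matlab
% Sketch: let validation-based early stopping pick the stopping point,
% rather than driving net.trainParam.goal down to 10e-10.
net = feedforwardnet(20, 'traingdx');   % 20 hidden units (assumed size)
net.divideFcn = 'dividerand';           % randomly split the samples
net.divideParam.trainRatio = 0.70;      % 70% for training
net.divideParam.valRatio   = 0.15;      % 15% validation set drives early stopping
net.divideParam.testRatio  = 0.15;      % 15% held out for the final check
net.trainParam.epochs   = 8000;
net.trainParam.goal     = 1e-6;         % keep a modest goal
net.trainParam.max_fail = 6;            % stop after 6 validation failures in a row
[net, tr] = train(net, inputs, targets);
% tr.best_epoch records the epoch with the lowest validation error
```

With this setup the goal value matters much less, because training stops as soon as generalization (measured on the validation set) starts to degrade.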