Data replication Neural Networks Matlab
Andreas
on 7 Apr 2016
Commented: Andreas on 19 Apr 2016
Hello! I have recently been studying neural networks, so I may be asking something obvious, but I found that when I replicate my inputs and outputs and then train the network for pattern recognition, it is far more accurate than with the original data. I did this in order to replicate some of the extreme values I have. Can that make my network overfit? Thank you, everyone.
0 comments
Accepted Answer
Greg Heath
on 10 Apr 2016
Edited: Greg Heath on 19 Apr 2016
1. I don't understand your question.
2. a. OVERFITTING means there are more unknown weights, Nw, than independent training equations, Ntrneq ( i.e., Nw > Ntrneq).
b. OVERTRAINING an overfit net CAN LEAD to loss of performance on NONTRAINING data.
3. There are several remedies to prevent OVERTRAINING AN OVERFIT NET. So, in general, overfitting need not be disastrous.
4. Methods for preventing loss of generalization via overtraining an overfit net
a. Do not overfit: Nw < Ntrneq. Preferably, Ntrneq >> Nw, which yields design stability and robustness w.r.t. noise and measurement error.
For example:
i. Increase the number of training examples
ii. Reduce the number of hidden nodes
b. Use VALIDATION STOPPING to prevent overtraining
c. Use the BAYESIAN REGULARIZATION training function TRAINBR with MSEREG as a default.
d. Replace the default performance function MSE with the regularized modification MSEREG.
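A sketch of the bookkeeping in 4a and the remedies in 4b-4d, for a single-hidden-layer patternnet. The hidden-node count H and the 0.7/0.15/0.15 data division are illustrative choices, not values from this thread:

```matlab
% Sketch: check Nw vs. Ntrneq before training, then apply remedies.
% Assumes x (I x N input matrix) and t (O x N target matrix) are loaded.
[I, N] = size(x);
[O, ~] = size(t);

H = 10;                          % hidden nodes (illustrative)
Ntrn   = N - 2*round(0.15*N);    % training cases under a 0.7/0.15/0.15 split
Ntrneq = Ntrn*O;                 % independent training equations
Nw     = (I+1)*H + (H+1)*O;     % unknown weights for one hidden layer

if Nw >= Ntrneq                  % 4a: the net is overfit
    warning('Nw = %d >= Ntrneq = %d: reduce H or add training data.', Nw, Ntrneq);
end

% 4b: validation stopping is on by default (dividerand reserves a val set).
net = patternnet(H);

% 4c/4d: Bayesian regularization and a regularized performance function.
net.trainFcn   = 'trainbr';      % TRAINBR
net.performFcn = 'msereg';       % MSEREG
[net, tr] = train(net, x, t);
```

Note that trainbr already penalizes large weights internally, so combining it with msereg (as the answer suggests) doubles up on regularization; either remedy can also be used on its own.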
Hope this helps.
Thank you for formally accepting my answer
Greg
5 comments
Greg Heath
on 19 Apr 2016
No.
You add the random noise to the replicated data. Just make sure that the resulting signal to noise ratio is sufficiently high.
Obviously, one good way to approach the problem is as a function of RNG state (which determines the initial random weights) and SNR, given the number of hidden nodes.
However, I have done the reverse, i.e., trained and validated with noisy duplicated data and then used the original data for testing. Again, results are presented as a function of the SNR parameter.
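A minimal sketch of that scheme: replicate the data, add Gaussian noise at a chosen SNR, train and validate on the noisy copies, and test on the clean originals. The replication factor k, the 20 dB target, and the split ratios are illustrative, not values from this thread:

```matlab
% Sketch: train/validate on noisy replicated data, test on the originals.
% Assumes x (I x N input matrix) and t (O x N target matrix) are loaded.
rng(0);                              % fix RNG state (initial weights, noise)

k     = 3;                           % replication factor (illustrative)
SNRdB = 20;                          % target signal-to-noise ratio (illustrative)
xrep  = repmat(x, 1, k);
trep  = repmat(t, 1, k);

sigpow  = mean(xrep(:).^2);          % average signal power
noispow = sigpow / 10^(SNRdB/10);    % noise power giving the chosen SNR
xnoisy  = xrep + sqrt(noispow)*randn(size(xrep));

net = patternnet(10);
net.divideFcn = 'dividerand';        % split the noisy copies into train/val
net.divideParam.trainRatio = 0.85;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0;      % no test split here ...
net = train(net, xnoisy, trep);

ytest = net(x);                      % ... because we test on the clean originals
```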
Hope this helps.
Greg