reproducibility of results using neural networks

Multi Vac on 12 May 2011
I am using 'newff' to create a neural network, 'trainParam' to set its parameters, and 'train' to train it. The problem is that random initial weight values are used each time, so I get different convergence results on different runs with the same data. How do I get reproducible results?

Answers (3)

Walter Roberson on 12 May 2011
Provide enough training data that the random initial weights have no impact. Or don't use random initial weights.
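
One way to read the second suggestion: save the initial weights once and reuse them on every run, so the starting point is no longer random. This is only a sketch; getwb and setwb are the toolbox helpers that read and write all weights and biases as one vector, and the hidden layer size of 10 is assumed for illustration:

    % first run: create the net and save its initial weights/biases once
    net = newff(inputs, targets, 10);
    initialWB = getwb(net);                  % all weights and biases as a single vector
    save('initialWB.mat', 'initialWB');

    % later runs: restore the same starting point before training
    load('initialWB.mat', 'initialWB');
    net = newff(inputs, targets, 10);
    net = setwb(net, initialWB);             % overwrite the freshly randomized weights
    net = train(net, inputs, targets);

Note that if the network also divides the data randomly (e.g. with 'dividerand'), that division is a separate source of run-to-run variation.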
  1 comment
Greg Heath on 26 Nov 2011
I assume by reproducibility, the OP means exactly the same weights and thresholds. There are many local minima in weight space.
For an I-H-O feedforward MLP, each solution is equivalent to 2^H * H! - 1 other solutions obtained by flipping hidden-node weight signs (2^H sign patterns) and/or reshuffling the order of the hidden nodes (H! orderings).
Therefore, reproducibility requires using the same initial state of rand before creating the net via newff.
Hope this helps.
Greg
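
A minimal sketch of that (the seed, data variables, and hidden layer size are placeholders; rng(0) plays the same role as rand('state',0) in newer releases):

    rand('state', 0);                    % fix the state of rand before creating the net
    net = newff(inputs, targets, 10);    % initial weights are now the same on every run
    net = train(net, inputs, targets);   % training starts from an identical point each time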



Flo Trentini on 23 Nov 2011
I am using 'initzero' for the input, layer, and bias weights, and then I call net = init(net) before training the network. Yet each run gives me different results. How is that possible?
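
For reference, the setup being described presumably looks something like this sketch (old network-object property names; the newff call and layer size are assumed):

    net = newff(inputs, targets, 10);
    net.layers{1}.initFcn = 'initwb';            % let per-weight init functions take effect
    net.layers{2}.initFcn = 'initwb';
    net.inputWeights{1,1}.initFcn = 'initzero';  % zero the input weights
    net.layerWeights{2,1}.initFcn = 'initzero';  % zero the layer weights
    net.biases{1}.initFcn = 'initzero';          % zero the biases
    net.biases{2}.initFcn = 'initzero';
    net = init(net);                             % re-initialize so everything starts at zero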

Greg Heath on 26 Nov 2011
newff automatically uses rand and initnw.
Therefore, all you have to do is initialize rand before calling newff.
Hope this helps.
Greg
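
A quick way to check this (a sketch; the seed, data, and layer size are placeholders):

    rand('state', 123);
    net1 = newff(inputs, targets, 10);
    rand('state', 123);
    net2 = newff(inputs, targets, 10);
    isequal(getwb(net1), getwb(net2))    % true: identical initial weights and biases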
