In a FeedForward NNet, what exactly is one iteration?

Sam Speake on 23 May 2018
Commented: Greg Heath on 25 May 2018
When you train a feedforward neural net with the default settings, the training GUI shows "Epoch: 0 [ x iterations ] 1000". Does the x value represent the number of individual data samples that have been passed through the network (such as 1 image from a data set of images), or does it represent a full pass over the entire data set?

Accepted Answer

Majid Farzaneh on 24 May 2018
Hello. In every neural network there is an optimization algorithm that sets the optimal weights and biases, and optimization algorithms are usually iterative. With the batch training algorithms used here, one iteration is one epoch of the optimization algorithm, that is, one full pass over the entire training data set rather than a single sample.
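For a concrete check, here is a minimal sketch (assuming the toolbox's simplefit_dataset example data and the default batch training algorithm); the iteration counter in the GUI advances once per epoch, i.e. once per full pass over the training set:
[x, t] = simplefit_dataset;      % small example input/target data set
net = feedforwardnet(10);        % feedforward net with 10 hidden neurons
net.trainParam.epochs = 50;      % maximum epochs, shown as the upper limit in the GUI
[net, tr] = train(net, x, t);    % tr is the training record
tr.num_epochs                    % epochs actually performed = iterations shown in the GUI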
  3 comments
Majid Farzaneh
Majid Farzaneh on 24 May 2018
Yes, that's true. For every change to the weights, the network needs to calculate the MSE, and to compute the MSE it needs to evaluate all of the training data with the new weights.
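As a hedged illustration of that full-pass MSE computation (assuming the net, x and t from the sketch above):
y = net(x);          % forward pass over the entire training set with the current weights
e = t - y;           % errors on every training sample
mseAll = mean(e.^2)  % MSE over all samples; should match perform(net, t, y) for the default 'mse'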
Greg Heath
Greg Heath on 25 May 2018
Optimization algorithms TRY to optimize the goal. Many/most times they do not achieve the goal.
Nevertheless, they are often considered successful if they just get close enough.
For example, I often design neural networks to yield an output target t, given an input function x.
I take as a reference output
yref = mean(t')
the corresponding mean square error is
MSEref = mean(var(t',1))
My training goal is typically
MSEgoal = 0.01*MSEref
which preserves 99% of the target variance.
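A minimal sketch of using such a goal as the stopping criterion (assuming inputs x and targets t as above; net.trainParam.goal is the MSE threshold at which training stops):
MSEref  = mean(var(t',1));       % reference MSE: variance of the targets
MSEgoal = 0.01*MSEref;           % keep 99% of the target variance
net = feedforwardnet(10);
net.trainParam.goal = MSEgoal;   % stop training once MSE <= MSEgoal
[net, tr] = train(net, x, t);
tr.stop                          % reports why training stopped, e.g. the goal was met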


More Answers (0)
