
NN training process?

Raza Ali on 18 May 2020
Commented: Raza Ali on 22 May 2020
Why does the mini-batch accuracy graph go down during the training process?

Accepted Answer

Shishir Singhal on 22 May 2020
Mini-batch accuracy should generally increase with the number of epochs.
In your case, there can be multiple reasons behind this:
  • Mini-batch size
  • Learning rate
  • Cost function
  • Network architecture
  • Quality of the data, and more
Check whether your setup matches any of these cases. It would be better if you provided more information about the NN model you are using.
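For example, the effect of mini-batch size alone can be sketched with a quick back-of-the-envelope calculation (shown here in Python, since the statistics are language-agnostic): if each prediction is modeled as a Bernoulli trial with success probability p, the accuracy measured over a single mini-batch of n samples has standard deviation sqrt(p(1-p)/n). With a very small mini-batch, the plotted per-batch accuracy is extremely noisy and can dip sharply even while the model is improving. The value p = 0.8 below is an assumed illustrative accuracy, not from your training run.

```python
import math

def batch_accuracy_std(p, batch_size):
    # Std of the accuracy measured over one mini-batch,
    # modeling each prediction as an independent Bernoulli(p) trial.
    return math.sqrt(p * (1 - p) / batch_size)

p = 0.8  # assumed true model accuracy (illustrative)
for n in (2, 32, 128):
    print(n, round(batch_accuracy_std(p, n), 3))
# → prints: 2 0.283 / 32 0.071 / 128 0.035
```

So with a mini-batch size of 2, swings of ±28 percentage points in the accuracy curve are expected from sampling noise alone; a larger mini-batch or a smoothed/validation accuracy curve gives a much more reliable picture of training progress.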
  1 comment
Raza Ali on 22 May 2020
Thank you for your reply.
Please see the details below:
% Layer array (truncated in the original post; closing bracket added)
Network = [
    imageInputLayer([256 256 3],"Name","imageinput")
    convolution2dLayer([3 3],32,"Name","conv_1","BiasLearnRateFactor",2,"Padding","same")
    convolution2dLayer([3 3],64,"Name","conv_2","BiasLearnRateFactor",2,"Padding","same")
    transposedConv2dLayer([3 3],2,"Name","transposed-conv","Cropping","same")
];

% Training options (trailing continuation closed)
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...
    'InitialLearnRate',1e-3, ...
    'L2Regularization',0.005, ...
    'MaxEpochs',30, ...
    'MiniBatchSize',2, ...
    'Shuffle','every-epoch');


More Answers (0)


Find more on Deep Learning Toolbox in Help Center and File Exchange.
