Setting up a 3 layered back-propagation neural network

4 views (last 30 days)
stayfrosty
stayfrosty on 28 Jun 2016
Edited: stayfrosty on 6 Jul 2016
I'm trying to set up a neural network with the following requirements -
  • three-layered;
  • feed-forward;
  • classical tan-sigmoid and linear functions in the hidden and output layers, respectively;
  • 5 neurons in the hidden layer;
  • trained with the Levenberg-Marquardt back-propagation algorithm;
  • converges in 5 iterations.
Basically, the neural network is to be trained with an RGB map input (3 values) and target output skin parameters (3 values). I've tried the 'nntool' MATLAB wizard, but I'm unsure whether 'nftool' is the one I'm looking for, because it says it's 2-layered and there's no option to make it converge in 5 iterations.
I'm new to setting up neural networks, as that really isn't the main focus of my project. My question is: is the 'nftool' wizard the thing I'm after, and are there settings in it that meet my listed requirements? If not, is there some sort of coding template I can alter to create and train my neural network?
  2 comments
José-Luis
José-Luis on 28 Jun 2016
Edited: José-Luis on 28 Jun 2016
How could you make it converge in five iterations? Convergence is not something you can impose. On the other hand, you could try and make it quit after five iterations.
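In MATLAB terms, that cap on the number of training iterations is a one-line setting (a sketch, assuming `net` is a network object created with, e.g., `fitnet`):

```matlab
net.trainParam.epochs = 5;  % stop training after 5 iterations (epochs)
```

Note this only limits how long training runs; whether the error has actually converged by then depends on the data and the random initial weights.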
stayfrosty
stayfrosty on 28 Jun 2016
Well, I'm trying to reproduce the results of a study, and it did say the network "converges in 5 iterations". All I'm attempting to do is reproduce the results according to the details given. Apart from that detail, is 'nftool' the MATLAB tool I should be using?


Answers (1)

Greg Heath
Greg Heath on 2 Jul 2016
Only hidden and output nodes are considered to be in neuron layers, because they are associated with non-identity transfer functions. Input nodes are just fan-in units, not neurons. Therefore, although there are three layers of nodes, you have a two-layer network, because there are only two layers of neurons.
You cannot duplicate designs without knowing the random number seed from which random initial weights and random trn/val/tst data division are obtained.
If you want to use the GUI, the fitting tool nftool is appropriate.
However, I prefer the command line approach similar to the examples in the HELP and DOC documentation and the zillions of examples I have posted in both the NEWSGROUP & ANSWERS.
help fitnet
doc fitnet
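Along those lines, here is a minimal command-line sketch of the network described in the question (assuming the Neural Network Toolbox; `inputs` and `targets` stand for your 3-by-N RGB and skin-parameter matrices, one sample per column):

```matlab
% Fitting network: 5 hidden neurons, trained with Levenberg-Marquardt
% ('trainlm' is fitnet's default training function).
net = fitnet(5, 'trainlm');
net.layers{1}.transferFcn = 'tansig';   % tan-sigmoid hidden layer (fitnet default)
net.layers{2}.transferFcn = 'purelin';  % linear output layer (fitnet default)
net.trainParam.epochs = 5;              % stop after 5 training iterations

% inputs:  3 x N matrix of RGB values
% targets: 3 x N matrix of skin parameters
[net, tr] = train(net, inputs, targets);
outputs = net(inputs);                  % network predictions on the training data
```

To reproduce a published design exactly you would also need to fix the random seed (e.g. `rng(0)` before training) and the trn/val/tst division, as noted above.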
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 comment
stayfrosty
stayfrosty on 6 Jul 2016
Edited: stayfrosty on 6 Jul 2016
Thank you for your reply. Could you point me to examples which you think are relevant to my dilemma? Here is a video link to the study whose neural network I was initially trying to replicate.
When I first undertook this project, I didn't expect to have to mess around with neural networks. Unfortunately, it looks like a must if I'm to reproduce the findings of this study.

