Shallow neural network parameters for training
I want to build a shallow network with two hidden layers. I am quite confused between fitnet, net, network, and other commands.
I can reach a result; however, I cannot modify it.
For example, my settings for the learning rate or the activation functions do not take effect.
And a final question: is it possible to add mini-batch learning to gradient descent in MATLAB?
trainFcn = 'traingd';                      % gradient descent training function
hiddenLayerSize = [6 36];                  % two hidden layers
net = fitnet(hiddenLayerSize,trainFcn);    % create the network first, then edit its properties
% net.numLayers = 2;                       % not needed: fitnet([6 36]) already builds two hidden layers plus an output layer
net.trainParam.lr = 0.05;                  % learning rate
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'logsig';
% net.layers{1}.transferFcn = 'reluLayer';
Answers (1)
Sanjana
on 3 March 2023
Hi Mohammad,
In MATLAB, ‘fitnet’ and ‘network’ can both be used to create and train neural networks, but they differ in their flexibility and ease of use:
- ‘fitnet’: a high-level function that provides a convenient way to create and train a function-fitting neural network with a single function call.
- ‘network’: a low-level function for creating customized shallow neural networks. It gives more control over the network architecture and the training parameters, but requires more work to set up and train the network than ‘fitnet’.
Please refer to the following documentation to understand how to edit the shallow neural network properties:
https://www.mathworks.com/help/deeplearning/ug/create-and-train-custom-neural-network-architectures.html
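For the two-hidden-layer network in the question, a minimal sketch of this workflow (assuming your inputs x and targets t are already loaded; those variable names are only placeholders) is to create the network first and then edit its properties, for example the transfer (activation) functions, before calling train:
Example:
% Create a two-hidden-layer network trained with gradient descent
net = fitnet([6 36],'traingd');
% Edit properties of the network object before training
net.layers{1}.transferFcn = 'tansig';   % activation of the first hidden layer
net.layers{2}.transferFcn = 'logsig';   % activation of the second hidden layer
% Train on your data (x: inputs, t: targets)
net = train(net,x,t);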
As for the learning rate, shallow networks do not use a ‘trainingOptions’ object (that belongs to the Deep Learning Toolbox workflow); instead, set it through the network's ‘trainParam’ structure, as in the following example,
Example:
% Create a shallow neural network with one hidden layer, trained with gradient descent
net = feedforwardnet(10,'traingd');
% Set the training parameters
net.trainParam.epochs = 50;   % maximum number of epochs
net.trainParam.lr = 0.01;     % learning rate
% Train the neural network using the specified parameters
net = train(net, inputs, targets);
Regarding mini-batch learning: it is not possible with the ‘traingd’ training function. ‘traingd’ is a batch gradient descent algorithm that updates the network weights using gradients computed over the entire training dataset, and the other training functions used by ‘train’ for shallow networks (for example ‘trainscg’ and ‘traingdm’) are also batch algorithms. If you need mini-batch updates, you can call ‘train’ on subsets of the data in a loop (see the sketch below), use the ‘adapt’ function for incremental (sample-by-sample) learning, or switch to the Deep Learning Toolbox workflow (‘trainingOptions’ with the ‘sgdm’ solver and ‘MiniBatchSize’), which supports mini-batch gradient descent directly.
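As one possible sketch of the loop-based approach (x, t, batchSize, and numEpochs are assumed placeholders, and the parameter values are only illustrative), each call to train below performs a single gradient-descent update on one mini-batch, with the weights carried over between calls:
Example:
% Mini-batch-style training of a shallow network by looping over data subsets
net = fitnet([6 36],'traingd');
net.divideFcn = 'dividetrain';      % use all presented samples for training (no val/test split per batch)
net.trainParam.lr = 0.05;           % learning rate
net.trainParam.epochs = 1;          % one epoch per call, i.e. one update per mini-batch
net.trainParam.showWindow = false;  % suppress the training GUI inside the loop
batchSize = 32;                     % assumed mini-batch size
numEpochs = 100;                    % assumed number of passes over the data
N = size(x,2);                      % shallow networks expect samples as columns
for epoch = 1:numEpochs
    idx = randperm(N);              % shuffle the samples each pass
    for k = 1:batchSize:N
        cols = idx(k:min(k+batchSize-1,N));
        net = train(net,x(:,cols),t(:,cols));   % update weights on this mini-batch only
    end
end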
Hope this helps!