How do I access individual weight matrices in a custom training loop?

5 views (last 30 days)
Suppose I have a basic feedforward network of the form y = W3 σ(W2 σ(W1 x + b1) + b2) + b3, trained by minimising a loss function L(y, t) with respect to targets t.
I would like to add a penalty term that regularises an intermediate signal, such as the hidden activation a2 = σ(W2 σ(W1 x + b1) + b2).
Could I define a custom loss function to do this? How can I access individual weight matrices in a custom training loop?

Answers (1)

Abhijit Bhattacharjee on 19 May 2022
I assume by custom training loops, you are referring to custom training loops in the Deep Learning Toolbox. If so, here is a good resource to start with: Define Model Loss Function for Custom Training Loop.
There are two different methods to access the weight arrays, depending on how you defined the neural network in the custom training loop. If you used a dlnetwork, then you can access the weights of any layer that has learnable parameters.
Example for dlnetwork:
% Create feedforward neural network
layers = [
featureInputLayer(1,"Name","featureinput")
fullyConnectedLayer(10,"Name","fc_1")
sigmoidLayer("Name","sigmoid_1")
fullyConnectedLayer(10,"Name","fc_2")
sigmoidLayer("Name","sigmoid_2")
fullyConnectedLayer(10,"Name","fc_3")
sigmoidLayer("Name","sigmoid_3")];
dlnet = dlnetwork(layers);
% Access weights of the first FC layer
dlnet.Layers(2).Weights
ans = 10×1
    0.6758
   -0.3062
   -0.2549
   -0.5748
    0.3081
    0.4561
    0.4425
    0.6709
   -0.1209
   -0.6227
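If the goal from the question is to regularise an intermediate signal, one way to do it with a dlnetwork is to request that layer's activation inside the model loss function and add a penalty to the loss. The sketch below is illustrative rather than part of the original answer: the layer names match the network above, while the target T and the penalty coefficient lambda are assumed placeholders.
% Sketch of a model loss function that adds an L2 penalty on an
% intermediate activation (here the output of "sigmoid_2").
% T is the target array and lambda the assumed penalty coefficient.
function [loss,gradients] = modelLoss(dlnet,X,T,lambda)
    % Return both the final output and the intermediate activation
    [Y,A2] = forward(dlnet,X,"Outputs",["sigmoid_3","sigmoid_2"]);
    % Base loss against the targets plus the penalty term
    loss = mse(Y,T) + lambda*mean(A2.^2,"all");
    % Gradients of the loss with respect to the learnable parameters
    gradients = dlgradient(loss,dlnet.Learnables);
end
Inside the training loop this function would be evaluated with dlfeval, for example [loss,gradients] = dlfeval(@modelLoss,dlnet,X,T,lambda), so that dlgradient can trace the computation.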
If you used a model function in the custom training loop instead, then you need to define a parameters structure up front that contains the weights and biases of the model. You can then access these parameters at any time using the names you defined, as sketched below.
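As a minimal sketch of that approach (the structure and field names below, such as fc1 and fc2, are placeholders and not from the original answer):
% Sketch of a parameters structure for a model function.
% The field names are arbitrary; use the names you chose when
% defining your model function.
parameters.fc1.Weights = dlarray(0.01*randn(10,1));
parameters.fc1.Bias    = dlarray(zeros(10,1));
parameters.fc2.Weights = dlarray(0.01*randn(10,10));
parameters.fc2.Bias    = dlarray(zeros(10,1));

% The individual weight matrices are then available by name anywhere
% in the training loop, for example to add a penalty on the second
% layer's weights:
W2 = parameters.fc2.Weights;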

Categories

More about Sequence and Numeric Feature Data Workflows in Help Center and File Exchange.

Products


Version

R2022a
