How to Realize a 'Gradient Reversal Layer'?
Percy Hu on 16 Jun 2021
Commented: Percy Hu on 25 Jun 2021
How can I implement a 'Gradient Reversal Layer' in MATLAB, like the ones available in PyTorch or TensorFlow?
It is typically used in transfer learning networks when a GAN-like loss is adopted.
Could I realize it by defining a custom layer?
I would be very grateful if you could offer an example or some detailed advice. Thank you for your help.
Accepted Answer
Philip Brown on 21 Jun 2021
It looks like you should be able to do this by writing your own custom layer. See the "Intermediate Layer Template" for some code to get started.
There's a custom layer used in a visualization example that does something a bit similar (it modifies the behavior of a ReLU gradient), here.
I think the custom layer code you need looks something like this:
classdef GradientReversalLayer < nnet.layer.Layer
    methods
        function Z = predict(layer, X)
            % Forward pass: identity mapping
            Z = X;
        end
        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % Backward pass: reverse the sign of the incoming gradient
            dLdX = -dLdZ;
        end
    end
end
If you want to define the constant you multiply the gradient by, you could make it a property of the custom layer and use it in your backward function, as in the sketch below.
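For example, a minimal sketch along those lines (the class name, the Lambda property, and the constructor argument are illustrative choices, not from a shipped example) might look like:
classdef WeightedGradientReversalLayer < nnet.layer.Layer
    % Sketch: gradient reversal with a tunable scaling factor.
    % Lambda and the constructor argument are illustrative names.
    properties
        Lambda = 1 % Constant the reversed gradient is multiplied by
    end
    methods
        function layer = WeightedGradientReversalLayer(lambda)
            % Store the scaling factor when the layer is constructed
            layer.Lambda = lambda;
        end
        function Z = predict(layer, X)
            Z = X; % Identity in the forward pass
        end
        function dLdX = backward(layer, X, Z, dLdZ, memory)
            dLdX = -layer.Lambda * dLdZ; % Scale and reverse the gradient
        end
    end
end
Saved as WeightedGradientReversalLayer.m, it can then be placed in a layer array like any other layer, for example between your feature extractor and domain classifier: WeightedGradientReversalLayer(0.5).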
More Answers (0)