I am writing my custom regression layer but my forward and backward loss function computation function needs to access some parameters. How do I track indexing of minibatches?
I am writing my own custom output regression layer for my deep neural network. I have written the forward and backward loss functions, which need to access parameters from my workspace (not part of the training data) for their computation. This is similar to a problem posted previously in questions.
I understand the two options given in the MATLAB staff answer, i.e.:
- either use trainNetwork with my custom layer, or
- create a dlnetwork with a custom training loop, define my loss function, and let dlgradient compute the derivatives automatically.
I cannot use the first option because it would require me to keep track of the indices of the minibatches, and I don't know how to do that with trainNetwork. I can, however, track the indices by writing a custom training loop and taking option 2 above. But in taking that option, it seems I would need to design my own dlgradient function, because again I need to access those additional parameters, which are not part of the training data, when computing the backward loss function (the gradient).
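For what it's worth, one common way to track minibatch indices in a custom training loop is to feed the sample indices through the minibatchqueue as an extra output alongside the data. The sketch below assumes 4-D image data in `XTrain`, responses in `YTrain`, and a hypothetical workspace parameter array `extraParams` with one column per training sample; all of these names are illustrative, not from the original post.

```matlab
% Sketch (illustrative names): carry each sample's index through the
% minibatchqueue so external parameters can be sliced per minibatch.
numObs = size(XTrain, 4);                        % assuming 4-D image data
dataDs = arrayDatastore(XTrain, IterationDimension=4);
respDs = arrayDatastore(YTrain);
idxDs  = arrayDatastore((1:numObs)');            % datastore of sample indices
cds    = combine(dataDs, respDs, idxDs);

mbq = minibatchqueue(cds, ...
    MiniBatchSize=32, ...
    MiniBatchFormat=["SSCB" "CB" "CB"]);

while hasdata(mbq)
    [X, Y, idx] = next(mbq);
    % idx identifies which training samples are in this minibatch,
    % so the external workspace parameters can be sliced to match:
    P = extraParams(:, extractdata(idx));        % hypothetical parameter array
    % ... evaluate loss and gradients with dlfeval ...
end
```

Shuffling is then handled by the minibatchqueue itself, and the index channel stays aligned with the data because all three datastores are combined before batching.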
So my questions are:
- Can I define my own dlgradient function?
- If I take the option of designing my own custom regression layer with a custom forward and backward function, is there a way of keeping track of minibatch indices?
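On the first question, my understanding is that dlgradient does not need to be redefined: extra parameters can simply be passed as additional arguments to the model loss function, since dlgradient only differentiates with respect to the dlarray variables you explicitly request. A minimal sketch, where `myCustomLoss` and `extraParams` are hypothetical names:

```matlab
% Sketch: extraParams is a plain argument; dlgradient differentiates
% only with respect to the learnables requested, so no custom
% backward function is needed.
function [loss, gradients] = modelLoss(net, X, T, extraParams)
    Y = forward(net, X);
    loss = myCustomLoss(Y, T, extraParams);      % hypothetical loss helper
    gradients = dlgradient(loss, net.Learnables);
end

% Inside the training loop:
% [loss, gradients] = dlfeval(@modelLoss, net, X, T, P);
```

As long as the operations inside the loss function are dlarray-supported, automatic differentiation traces through them, and the workspace parameters just act as constants.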
I'll really appreciate your guidance.