How to use a custom transfer function in neural net training
I want to use a function similar to tansig. I don't seem to be able to find a good example, and the tansig.apply method only allows me one line! I'm wrapped around this axle, and I suspect I'm missing something simple. Any ideas? I'm using 2012b.
Accepted Answer
More Answers (5)
Bob
on 27 Mar 2013
4 comments
Nn Sagita
on 29 Aug 2013
Bob, I modified the purelin transfer function and called it 'mtf'. I saved it in my working directory, trained the neural network, and got outputs. But I also got some messages, like this:
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException at com.mathworks.toolbox.nnet.v6.diagram.nnTransfer.paint(nnTransfer.java:35) at com.mathworks.toolbox.nnet.v6.image.nnOffsetImage.paint(nnOffsetImage.java:49) at ....
Could you help me? What should I do?
kelvina
on 15 Feb 2014
Thanks Bob, this helped me.
Alternatively, you can do this directly by copying the file 'template transfer' from C:\Program Files (x86)\MATLAB\R2010a\toolbox\nnet\nnet\nncustom, replacing its function a = apply_transfer(n,fp) with your own function, and saving the file in your working directory. It will work.
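Building on that suggestion, here is a minimal sketch of what the replaced function body might look like. The file name, the saturating formula, and the comments are my assumptions for illustration, not the actual contents of the shipped template:

```matlab
% mytf.m -- custom transfer function, based on the nncustom template
% (sketch; only the apply_transfer body is replaced, as suggested above)
function a = apply_transfer(n,fp)
%APPLY_TRANSFER Apply the transfer function to the net input n.
% Example body: a rational saturating function, a = n./(1+|n|),
% which maps any real n into (-1,1) like tansig does.
a = n ./ (1 + abs(n));
end
```

The key point of the tip is that only the apply_transfer body changes; the rest of the template file (info, derivative, and parameter boilerplate) is kept as shipped so the toolbox can still call it.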
Mayank Gupta
on 4 May 2016
Can you please explain in detail how to save a custom training function to the nntool directory? I am using the Firefly algorithm for optimization.
Mehdi Jokar
on 16 Jul 2018
Bob, thank you for your instructions. But is apply the only function that needs to be modified, or do we also need to modify the backprop and forwardprop functions in the + folder?
Mehdi
Bob
on 10 Dec 2012
0 votes
Greg Heath
on 11 Dec 2012
Edited: DGM
on 23 Feb 2023
I cannot understand why you think y2 is better than y1:
x = -6:0.1:6;
y1 = x./(0.25 + abs(x));
y2 = x.*(1 - 0.52*abs(x/2.6));   % (for -2.5 < x < 2.5)
figure
hold on
plot(x,y1)
plot(x,y2,'r')
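For context, one way to settle which approximation is "better" is to measure each against tanh itself over the range where y2 is claimed to be valid. This is a sketch; the choice of tanh as the reference and of max absolute error as the metric are my assumptions, not part of Greg's post:

```matlab
% Compare both candidate transfer functions against tanh over
% [-2.5, 2.5], the interval for which y2 is stated to hold.
x  = -2.5:0.01:2.5;
y1 = x./(0.25 + abs(x));         % rational saturating function
y2 = x.*(1 - 0.52*abs(x/2.6));   % piecewise-valid approximation
t  = tanh(x);                    % reference curve
fprintf('max |y1 - tanh| = %.4f\n', max(abs(y1 - t)));
fprintf('max |y2 - tanh| = %.4f\n', max(abs(y2 - t)));
```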
mladen
on 26 Mar 2013
0 votes
Could anybody upload some examples of a modified tansig.m and its +tansig folder? This would be very helpful for my project and for other people too. Thank you.
1 comment
Nn Sagita
on 29 Aug 2013
If you have some examples of how to modify a transfer function, please share them with me. Thank you.
mladen
on 29 Mar 2013
Thank you, Bob. Nice trick with feedforwardnet.m (good for permanent use). I've managed to do this, but some new questions arise:
- How do I use param in apply(n,param)? (more info -> matlabcentral/answers/686)
- How can I use different transfer functions within the same layer?
- My apply function looks something like this:
function A = apply(n,param)
%....
A = a1.*a2;
end
Now I would like to use a1 and a2 to speed up the derivative computation in da_dn.m (this has already been done with tansig.m, but with the final value, A in my code). Is that possible?
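For reference, the speedup mladen mentions in the shipped tansig package comes from computing the derivative from the forward output a instead of re-evaluating tanh(n). A sketch of that pattern, with the signature as I recall it from the +tansig package folder (d = da_dn(n,a,param); treat the exact signature as an assumption for your toolbox version):

```matlab
% +mytf/da_dn.m -- derivative of the transfer output with respect to n
% (sketch of the tansig-style pattern: reuse the cached forward output)
function d = da_dn(n,a,param)
% For tansig, d(tanh)/dn = 1 - tanh(n)^2, so the already-computed
% output a is all that is needed -- no call to tanh(n) here.
d = 1 - a.*a;
end
```

For a product A = a1.*a2, the product rule d = da1_dn.*a2 + a1.*da2_dn needs the individual factors, which A alone cannot recover in general; they would have to be recomputed from n inside da_dn, or cached by other means, since the apply/da_dn functions only share n, the output, and param.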