Using a pattern neural network's weights in my own forward propagation code
Hello,
I have a problem using the weights of a pattern net. What I'm doing is training the net with my own inputs and targets, and after that testing the net with a different input. Up to that point everything is OK; in other words, I'm getting the correct answer from the net. However, what I want to do is use the net's input weights and layer weights as parameters for my own implementation of forward propagation (just by computing dot products). While coding I noticed that the pattern net doesn't use the common sigmoid function but the tansig() function instead, and also that the layers have some MATLAB-specific properties. In the end I coded the forward propagation like this:
function [o] = fwprop(input, IW, b1, LW, b2)
% Manual forward propagation through a two-layer network
a = dotprod(IW, input);   % weighted net input to the hidden layer
a = netsum({a, b1});      % add hidden-layer bias
a = tansig(a);            % hidden-layer transfer function
o = dotprod(LW, a);       % weighted net input to the output layer
o = netsum({o, b2});      % add output-layer bias
o = tansig(o);            % output-layer transfer function
end
where:
input: test input
IW: input weights
LW: layer weights
b1: bias vector 1
b2: bias vector 2
I'm using dotprod and netsum because I noticed that the pattern net uses exactly those functions. Yet even though I'm using them, I keep getting the same wrong results. I wonder if there is some modification in the way MATLAB computes the forward propagation.
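[Editor's note] A likely cause, not confirmed in this thread, is that MATLAB networks by default also apply input preprocessing and output postprocessing (typically removeconstantrows followed by mapminmax), so applying the raw weights directly to the raw input skips those steps. A minimal sketch of a manual simulation that includes them, assuming the default processing functions, a single hidden layer, and a tansig output layer (check net.inputs{1}.processFcns and the output transfer function on your own net):

% Sketch only: assumes processFcns = {removeconstantrows, mapminmax},
% so the mapminmax settings sit in processSettings{2}.
xn = mapminmax('apply', input, net.inputs{1}.processSettings{2});
a  = tansig(net.IW{1,1}*xn + net.b{1});            % hidden layer
yn = tansig(net.LW{2,1}*a + net.b{2});             % output layer
o  = mapminmax('reverse', yn, net.outputs{2}.processSettings{2});

If this matches net(input), the discrepancy was the missing pre/postprocessing rather than the weight arithmetic itself.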
Thanks in advance.
1 comment
Greg Heath
on 8 Jul 2013
Never use the same variable on both sides of an assignment. It is VERY CONFUSING: what a and o mean depends on which part of the program you are looking at ... a VERY dangerous practice for several obvious reasons, especially when understanding or modifying old and/or long code.
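[Editor's note] Greg's advice can be illustrated by rewriting the question's function with a distinct name for each intermediate quantity; the computation is unchanged, only the naming:

function o = fwprop(input, IW, b1, LW, b2)
% Same forward propagation, one name per intermediate value
h_net = netsum({dotprod(IW, input), b1});   % hidden-layer net input
h_out = tansig(h_net);                      % hidden-layer activation
o_net = netsum({dotprod(LW, h_out), b2});   % output-layer net input
o     = tansig(o_net);                      % network output
end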
Accepted Answer
More Answers (1)
Clayder Gonzalez Cadenillas
on 9 Jul 2013