
Neural network - Why are the outputs not within -1 and 1 when I apply tansig as the activation function in the output layer?

I got outputs greater than 1 (ranging from 0.something to 11.something) when I use tansig as the activation function in the output layer. My neural network has the architecture (4,6,5,1).
  1 comment
Vishnu on 16 Jun 2023
Hi JUN HANG,
Whatever the input to the "tansig" function, the output should be in the range [-1, 1], because the equation of "tansig" is:
tansig(x) = (2/(1+exp(-2*x)))-1;
I suggest normalizing the input values and the weights of the network and trying again. If it still gives output beyond the expected range, you can attach your neural network here and I will look into it.
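A minimal sketch of both checks (assuming the Deep Learning Toolbox is on the path; the sample data below is made up purely for illustration):
x = [-1e6 -10 -1 0 1 10 1e6];             % arbitrary test inputs
tansig(x)                                 % every element stays within [-1, 1]
rawInputs = rand(4, 100) * 50;            % hypothetical data set with 4 features
[normInputs, ps] = mapminmax(rawInputs);  % rescales each row to the range [-1, 1]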


Answers (1)

Krishna on 4 Jan 2024
Hello OOI JUN HANG,
From what I gather, you're having trouble getting outputs in the interval [-1, 1] with the tansig function. The 'tansig' activation function is designed to yield results that always fall between -1 and 1, irrespective of the architecture it's applied in. The formula for tansig is
tansig(x) = 2/(1+exp(-2*x)) - 1 = (1 - exp(-2*x))/(1+exp(-2*x)) ---- (1)
Multiplying the numerator and denominator of equation (1) by exp(2*x) gives
tansig(x) = (exp(2*x) - 1)/(exp(2*x) + 1) ---- (2)
When x tends to infinity, exp(-2*x) goes to zero and we are left with 1/1, so tansig(x) tends to 1 (see equation 1).
When x tends to negative infinity, exp(2*x) goes to zero and we are left with -1/1, so tansig(x) tends to -1 (see equation 2).
This is why the range of tansig is [-1, 1]. For more information, please go through the tansig documentation.
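As a quick numerical sanity check (a minimal sketch; the sample points below are arbitrary and the tansig call assumes the Deep Learning Toolbox is available):
x  = linspace(-20, 20, 9);                 % arbitrary sample points
f1 = (1 - exp(-2*x)) ./ (1 + exp(-2*x));   % equation (1)
f2 = (exp(2*x) - 1)  ./ (exp(2*x) + 1);    % equation (2)
max(abs(f1 - tansig(x)))                   % ~0, so this form matches tansig
max(abs(f2 - tansig(x)))                   % ~0, so does this one
tansig([-1e3 0 1e3])                       % returns [-1 0 1]: saturates at the tails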
Hope this helps.
