clamp cross-entropy loss
Matt Fetterman
on 3 Sep 2020
Commented: Matt Fetterman on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the output of the log function is bounded; for example, so it cannot be less than 100.
Can we do it?
Accepted Answer
David Goodmanson
on 3 Sep 2020
Edited: David Goodmanson on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z<100) = 100;
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100; the former appears to make more sense. There are many possibilities for a smooth, differentiable cutoff. Here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y+exp(Ylimit))))/N;
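As a sanity check, here is a NumPy translation of both variants (the hard clamp and the smooth log(Y + exp(-100)) cutoff), using the -100 limit discussed above. The data (W, T, Y) is a made-up example, not from the thread; one prediction is nearly zero so the cutoff actually fires:

```python
import numpy as np

# Hypothetical example data: per-class weights W, one-hot targets T,
# predicted probabilities Y. The target class in row 1 has probability
# 1e-60, so log(Y) there would be about -138 without a cutoff.
W = np.array([1.0, 1.0, 1.0])
T = np.array([[1, 0, 0],
              [0, 1, 0]])
Y = np.array([[0.9,   0.05,  0.05],
              [0.5,   1e-60, 0.5]])
N = T.shape[0]

# Hard clamp: floor log(Y) at -100 (non-differentiable at the kink)
z = np.log(Y)
z[z < -100] = -100
loss_hard = -np.sum(W * (T * z)) / N

# Smooth cutoff: log(Y + exp(-100)) stays above -100 for Y >= 0
# and remains differentiable everywhere
Ylimit = -100
loss_smooth = -np.sum(W * (T * np.log(Y + np.exp(Ylimit)))) / N

print(loss_hard, loss_smooth)  # both close to (100 - log(0.9)) / 2
```

For the near-zero prediction, exp(-100) (about 3.7e-44) dominates 1e-60 inside the log, so the two variants give almost identical losses here; they differ only in smoothness near the cutoff.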