clamp cross-entropy loss

10 views (last 30 days)
Matt Fetterman on 3 Sep 2020
Commented: Matt Fetterman on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the log function output is bounded, for example it cannot be less than 100.
Can we do it?

Accepted Answer

David Goodmanson on 3 Sep 2020
Edited: David Goodmanson on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z < 100) = 100;          % clamp the log output from below at 100, as asked
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100; the former appears to make more sense. There are many possibilities for a smooth, differentiable cutoff. Here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y+exp(Ylimit))))/N;
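To see why the smooth version works: log(Y + exp(Ylimit)) equals exactly Ylimit when Y = 0, and is essentially log(Y) whenever log(Y) is well above Ylimit, so the loss stays finite even for Y = 0 while ordinary predictions are unaffected. A minimal Python sketch of the idea (assuming, as an illustration, that W is a per-element weight applied elementwise, which may differ from what the MATLAB expression W*(...) does if W is a matrix):

```python
import math

def smooth_clamped_log(y, ylimit=-100.0):
    # log(y + exp(ylimit)): equals ylimit at y = 0, and ~log(y)
    # once log(y) is well above ylimit. Smooth and differentiable in y.
    return math.log(y + math.exp(ylimit))

def cross_entropy(w, t, y, n, ylimit=-100.0):
    # Elementwise version of loss = -sum(W.*T.*log(Y))/N with the
    # smooth clamp substituted for the bare log. The elementwise
    # treatment of w is an assumption for this sketch.
    return -sum(wi * ti * smooth_clamped_log(yi, ylimit)
                for wi, ti, yi in zip(w, t, y)) / n
```

With Ylimit = -100, a prediction of exactly 0 contributes a finite penalty of 100 (times its weight) instead of an infinite one, and a prediction like 0.5 contributes the usual -log(0.5) ≈ 0.693.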
  3 comments
David Goodmanson on 6 Sep 2020
Hi Matt,
See the amended answer.
Matt Fetterman on 6 Sep 2020
Probably a smart approach.


More Answers (0)


