How to use softmax and a loss function (negative log probability) for classification
Kong
on 2 Apr 2020
Answered: Shishir Singhal
on 7 Apr 2020
Hello.
I want to classify videos.
After computing the Euclidean distance, I want to use softmax and a loss function (negative log probability) for classification.
Can I get some ideas on how to write the code?
clear all
close all
% read the feature file: every column except the last holds features,
% the last column holds the class label
data   = csvread('outfile.csv');
values = data(:,1:end-1);
labels = data(:,end);
% mean feature vector of each class (labels+1 gives positive group indices for splitapply)
avg = splitapply(@(x) mean(x,1), values, labels+1);
mean_class1 = avg(1,:);
mean_class2 = avg(2,:);
mean_class3 = avg(3,:);
mean_class4 = avg(4,:);
mean_class5 = avg(5,:);
% one query sample per action
bend_query = values(1,:);
run_query  = values(2,:);
walk_query = values(3,:);
skip_query = values(4,:);
wave_query = values(5,:);
% Euclidean distance between each query and its class mean
euclidean_bend = pdist2(mean_class1, bend_query, 'euclidean');
euclidean_run  = pdist2(mean_class2, run_query,  'euclidean');
euclidean_walk = pdist2(mean_class3, walk_query, 'euclidean');
euclidean_skip = pdist2(mean_class4, skip_query, 'euclidean');
euclidean_wave = pdist2(mean_class5, wave_query, 'euclidean');
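A minimal sketch of the missing classification step (an assumption about the intended approach, not tested on this data): compare one query against all five class means, negate the distances so that a smaller distance means a higher score, turn the scores into softmax probabilities, and take the negative log probability of the true class as the loss. The variable names follow the code above; true_class is a hypothetical label for the chosen query.
class_means = [mean_class1; mean_class2; mean_class3; mean_class4; mean_class5];
query = bend_query;          % example query; assumed to belong to class 1
true_class = 1;
% distance from the query to every class mean (5-by-1 vector)
d = pdist2(class_means, query, 'euclidean');
% smaller distance should mean a higher score, so negate before softmax
scores = -d;
% numerically stable softmax: subtract the maximum before exponentiating
scores = scores - max(scores);
p = exp(scores) ./ sum(exp(scores));
% negative log probability (cross-entropy with a one-hot target) of the true class
loss = -log(p(true_class));
% predicted label = class with the highest probability
[~, predicted_class] = max(p);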
Accepted Answer
Shishir Singhal
on 7 Apr 2020
For classification, softmax converts the raw class scores into probability scores for each category.
Since your predictions and targets follow different probability distributions, you can use the cross-entropy loss, which is essentially a negative log probability function.
Refer to this documentation for the implementation: https://www.mathworks.com/help/deeplearning/ref/dlarray.crossentropy.html
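A minimal numeric sketch of those two steps in plain MATLAB (the score values below are made-up, not taken from the code in the question); the linked crossentropy function performs the same computation on dlarray inputs:
scores = [2.1; 0.3; -1.2; 0.8; -0.5];   % hypothetical raw scores, one per class
target = [1; 0; 0; 0; 0];               % one-hot target: the true class is 1
% softmax: exponentiate and normalize so the probabilities sum to 1
p = exp(scores - max(scores));
p = p ./ sum(p);
% cross-entropy with a one-hot target reduces to the negative log
% probability of the true class
loss = -sum(target .* log(p));
With a one-hot target, only the true-class term survives the sum, which is why cross-entropy here is exactly the negative log probability the question asks about.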