Oscillation of classification accuracy of test images - VGG - transfer learning
I am trying to fine-tune VGG19 in MATLAB (transfer learning) to perform sex recognition from face images of a non-human primate. I have 5,000 training images (1,500 females and 3,500 males) and I need to predict the sex of 4,000 test images, which I know are ALL females (in fact I am mainly interested in the classification scores). I also use these test images for validation every 20 iterations, to understand what is going on.
The network learns well, too well: it reaches 100% mini-batch accuracy. Setting the dropout ratio to 0.7 rather than 0.5 in the two dropout layers of VGG19 does not change anything (see the sketch below). Fine, why not. However, the test images end up being assigned essentially at random (about 50% accuracy after 50 epochs). Weirdly, at the beginning of training the accuracy on the test images oscillates between 20% and 80%. Assigning only 20% of the images to the female class (when they all show females) cannot happen by chance, so I initially thought this was because the two training classes can be separated along several dimensions (not only females versus males). But then why would the validation (= test) accuracy stabilize around 50% after several epochs? I do not understand what is going on. Here is what the training-progress plot looks like (PS: the net had already been pretrained for one epoch with the same parameters, which is why the mini-batch accuracy is already at 90% at the first iteration).
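For reference, this is roughly how I set the dropout ratio to 0.7 (just a sketch: the loop looks for the dropout layers in net.Layers instead of hard-coding their indices):
% Raise the dropout probability of the two dropout layers to 0.7
layers = net.Layers;
idx = find(arrayfun(@(l) isa(l,'nnet.cnn.layer.DropoutLayer'), layers));
for k = idx'
    layers(k) = dropoutLayer(0.7, 'Name', layers(k).Name);   % keep the original layer names
end
% 'layers' is then passed to trainNetwork instead of net.Layers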
And here is the full training code:
% Data augmentation
pixelRange = [-10 10];
rotRange = [-10 10];
scaleRange = [0.8 1.1];
% Network parameters
miniBatchSize = 42;   % the maximum my GPU can take
maxEpochs = 100;
initialLearnRate = 1e-3;
validationFrequency = 20;
% Load datasets (learningdata and testingdata hold the paths to the image folders)
imdsLearning = imageDatastore(learningdata, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
imdsTesting = imageDatastore(testingdata, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
% Load the network that was already fine-tuned for one epoch
load net.mat
inputSize = net.Layers(1).InputSize;
imageAugmenter = imageDataAugmenter( ...
    'RandXReflection',true, ...
    'RandXTranslation',pixelRange, ...
    'RandRotation',rotRange, ...
    'RandYTranslation',pixelRange, ...
    'RandScale',scaleRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsLearning, ...
    'DataAugmentation',imageAugmenter);
augimdsTesting = augmentedImageDatastore(inputSize(1:2),imdsTesting);
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',maxEpochs, ...
    'Shuffle','every-epoch', ...
    'InitialLearnRate',initialLearnRate, ...
    'ValidationData',augimdsTesting, ...
    'ValidationFrequency',validationFrequency, ...
    'ValidationPatience',Inf, ...
    'Verbose',true, ...
    'Plots','training-progress', ...
    'OutputFcn',@(info)stopIfAccuracyLevelReached(info,80));
net2 = trainNetwork(augimdsTrain,net.Layers,options);
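For completeness, this is roughly how I intend to read out the per-image classification scores after training (a sketch; 'female' stands for whatever the actual folder/label name is):
% Predicted class and per-class scores for every test image
[predictedSex, scores] = classify(net2, augimdsTesting);
testAccuracy = mean(predictedSex == imdsTesting.Labels)
% Score assigned to the female class (assuming the label is called 'female')
classNames  = net2.Layers(end).Classes;
femaleScore = scores(:, classNames == 'female');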
Answers (1)
Mahmoud Afifi
7 May 2020
I think it can be improved with augmentation. MATLAB provides several different augmentation options. I also refer you to our WB augmenter, published at the last ICCV conference; you can find easy-to-use MATLAB code with examples here.
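For example, on top of the geometric augmentation already in your script, something along these lines (just a sketch; the ranges are placeholders that would need tuning for your face images):
% A richer geometric augmentation (example ranges only)
imageAugmenter = imageDataAugmenter( ...
    'RandXReflection', true, ...
    'RandRotation',    [-15 15], ...
    'RandScale',       [0.8 1.2], ...
    'RandXShear',      [-5 5], ...      % shear angles in degrees
    'RandYShear',      [-5 5], ...
    'RandXTranslation',[-15 15], ...
    'RandYTranslation',[-15 15]);
augimdsTrain = augmentedImageDatastore(inputSize(1:2), imdsLearning, ...
    'DataAugmentation', imageAugmenter);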