Bounding Box Not Drawn/Some Variables Are Empty

Hi professionals,
It's me again. The advice I received earlier only got me so far; I've tried Google and this community as well, but I am grateful!
What I am trying to do is draw the yellow box around the object of interest, and it's proving challenging.
Can someone tell me why these variables are empty, even though the data loads and trains without error?
Please point me in the right direction so that I can get this sorted!
Thank you in advance for acknowledging me!
I will upload an image of the empty variables along with my code.
Please see the screenshot for details.
My code:
%% Training the R-CNN detector. Training can take a few minutes to complete.
% Loading .MAT file, the ground truths and the Network layers
load('gTruth.mat')
net = alexnet
%load('gimlab.mat', 'gTruth', 'net');
rcnn = trainRCNNObjectDetector(gTruth, netTransfer, opts, 'NegativeOverlapRange', [0 0.3])
%% Testing the R-CNN detector on a test image.
img = imread('Gun00011.jpg');
[bbox, score, label] = detect(rcnn, img, 'MiniBatchSize', 32);
%% Displaying strongest detection result.
[score, idx] = max(score);
bbox = bbox(idx, :);
annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
detectedImg = insertObjectAnnotation(img,'rectangle', bbox, annotation);
figure
imshow(detectedImg)

Answers (3)

Dinesh Yadav on 22 Jan 2020

0 votes

Hi Matpar,
Even though you have loaded and trained on the data without error, the reason your bounding box shows empty is that, during testing, the R-CNN detector is unable to find an object of a matching class in the image, and hence it does not draw any bounding box (therefore bbox is an empty matrix).
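One way to make this behavior explicit (a minimal sketch, not part of the original answer; it assumes `rcnn` and `img` already exist as in the code above) is to guard the display step against empty detections, since indexing with the result of `max` on an empty vector produces empty outputs:

```matlab
% Run the detector, then check whether anything was found before annotating.
[bbox, score, label] = detect(rcnn, img, 'MiniBatchSize', 32);

if isempty(bbox)
    % No region scored above the detector's threshold, so there is
    % nothing to draw; show the plain image instead of erroring out.
    warning('No detections returned for this image.');
    imshow(img)
else
    [score, idx] = max(score);
    annotation  = sprintf('%s: (Confidence = %f)', label(idx), score);
    detectedImg = insertObjectAnnotation(img, 'rectangle', bbox(idx,:), annotation);
    imshow(detectedImg)
end
```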

4 comments

Matpar on 22 Jan 2020
Edited: Matpar on 22 Jan 2020
OK, I understand what you are highlighting, @Dinesh Yadav, but the box appeared several times with this same code! Now that I am running it over and over again, it is not showing.
How do I fix this in code? Please guide me step by step so that I can learn and perhaps help someone who runs into the same challenge in the future.
I am uploading the data as well; I must have missed something that I am not seeing.
Here is my code:
clear
clc
%% Step 1 Deploying deepNetworkDesigner to edit Layers physically
gunfolder = '/Users/mmgp/Desktop/gunsGT';
save('gunlables.mat','gunfolder');
%% Step 2 Specifying Image Amount In Specified Folder
total_images = numel(gunfolder);
%% Step 3 Accessing Content of Folder TrainingSet Using Datastore
imds = imageDatastore(gunfolder,'IncludeSubFolders',true,'LabelSource','Foldernames');
%% Step 4 Setting Output Function(images may have size variation resizing for consistency with pretrain net)
imds.ReadFcn=@(loc)imresize(imread(loc),[227,227]);
%% Step 5 Counting Images In Each Category "If not equal this will create issues"
tbl=countEachLabel(imds);
%% Step 6 Making Category The Same Number Of Images
minSetCount=min(tbl{:,2});
%% Step 7 Splitting Inputs Into Training and Testing Sets
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');
size(imdsTrain);
%% Step 8 Loading Pretrained Network
net = alexnet; %Trained on 1million+ images/classify images into 1000 object categories
% analyzeNetwork(net) % Display Alexnet architecture & network layer details
%% Step 9 Altering InputSize Of 1st Layer/ Alexnet Image requirement is 227 width by 227 height by 3 colour channels
inputSize = net.Layers(1).InputSize;%Displays the input size of Alexnet
%% Step 9 Counting Total Number Of Images Including Subfolders **IF AVAILABLE**
imgTotal = length(imds.Files);
%% Step 10 Displaying Multiple Randomized Images Within The Dataset
% a = 4;
% b = 4;
% n = randperm(imgTotal, a*b);
%
% figure()
% Idx = 1;
% for j=1:a
% for k=1:b
% img=readimage(imds,n(Idx));
% subplot(a,b,Idx)
% imshow(img),title('Randomised Images Displayed');
% Idx=Idx+1;
% end
% end
%% Step 11 Replacing Final Layer/Last 3 Configure For 1000 classes
% Finetuning these 3 layers for new classification
% Extracting all Layers except the last 3
layersTransfer = net.Layers(1:end-3);
%% Step 12 Specifying Image Categories/Classes:
numClasses = numel(categories(imdsTrain.Labels));
Tlayers = [
layersTransfer
fullyConnectedLayer(numClasses,'WeightLearnRateFactor',25,'BiasLearnRateFactor',25);
softmaxLayer
classificationLayer];
%% Step 13 Training The Network
% Resizing images in datastore to meet Alexnet's size requirements
% Utilising Augmented Data Store for automatic resizing of training images
% Augmented Data Store Prevents Over Fitting By Randomly Flipping Along The Vertical Axis
% Stopping the network from memorizing exact details of the training data
% Also Randomly Translates them up to 30 pixels horizontally & Vertically
pixelRange = [-30 30];
imageAugmenter = imageDataAugmenter( ...
'RandXReflection',true, ...
'RandXTranslation',pixelRange, ...
'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
'DataAugmentation',imageAugmenter);
% Utilising Data Augmentation For Resizing Validation Data
% implemented without specifying overfit prevention procedures
% By not specifying these procedures the system will be precise via
% predictions
%% Step 14 Resizing Images, Assists With Preventing Overfitting
augmentedTrainingSet = augmentedImageDatastore(inputSize ,imdsTrain,'ColorPreprocessing', 'gray2rgb');
augimdsValidation = augmentedImageDatastore(inputSize,imdsValidation,'ColorPreprocessing', 'gray2rgb');
%% Step 15 Specifying Training Options
% Keep features from earlier layers of the pretrained network for transfer learning
% Specify the epoch training cycle, the mini-batch size and validation data
% Validate the network for each iteration during training.
% SGDM groups the full dataset into disjoint mini-batches; this reaches convergence faster,
% as it updates the network's weights more frequently and increases the
% computational speed
% Implementing **WITH** The RCNN Object Detector
opts = trainingOptions('sgdm',...
'Momentum',0.9,...
'InitialLearnRate', 1e-4,...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', 0.1, ...
'Shuffle','every-epoch', ...
'LearnRateDropPeriod', 8, ...
'L2Regularization', 1e-4, ...
'MaxEpochs', 10,...
'MiniBatchSize',20,...
'Verbose', true);
[height,width,numChannels, ~] = size(imdsTrain);
imageSize = [height width numChannels];
inputLayer = imageInputLayer(imageSize);
%% Step 16 Training network Consisting Of Transferred & New Layers.
netTransfer = trainNetwork(augmentedTrainingSet,Tlayers,opts)
%% Step 17 Classifying Validation Images Utilising Fine-tuned Network
[YPred,scores] = classify(netTransfer,augimdsValidation);
%% Step 18 Displaying 4 Validation Image Samples With Predicted Labels
% idx = randperm(numel(imdsValidation.Files),4);
% figure
% for i = 1:4
% subplot(2,2,i)
% I = readimage(imdsValidation,idx(i));
% imshow(I);
% label = YPred(idx(i));
% title(string(label));
% end
%% Step 19 Calculating Validation Data Classification Accuracy (Accuracy Labels Predicted Accurately By Network)
YValidation = imdsValidation.Labels;
accuracy = mean(YPred == YValidation);
%% Step 20 Training The R-CNN detector.
% Training can take a few minutes to complete.
% Loading .MAT file, the ground truths and the Network layers
load('gTruth.mat')
% Positive and Negative Overlap Range Controls Which Image Patch is Used
rcnn = trainRCNNObjectDetector(gTruth, netTransfer, opts, 'NegativeOverlapRange', [0 0.3]);
%% Step 21 Testing the R-CNN detector on a test image.
testimg = imread('Gun00011.jpg');
[bboxes,score,label] = detect(rcnn,testimg,'MiniBatchSize',20)
%% Step 22 Display strongest detection result.
[score, idx] = max(score);
bbox = bboxes(idx, :);
annotation = sprintf('%s: (Confidence = %f)', label(idx), score);
Imgdetected = insertObjectAnnotation(testimg, 'rectangle', bbox, annotation);
figure
imshow(Imgdetected);
***************************************************************************************
My Errors:
gunfolder =
'/Users/mmgp/Desktop/gunsGT'
These are the classes
Training on single CPU.
Initializing input data normalization.
|========================================================================================|
| Epoch | Iteration | Time Elapsed | Mini-batch | Mini-batch | Base Learning |
| | | (hh:mm:ss) | Accuracy | Loss | Rate |
|========================================================================================|
| 1 | 1 | 00:00:01 | 55.00% | 1.9629 | 1.0000e-04 |
| 10 | 20 | 00:00:37 | 50.00% | 1.3121 | 1.0000e-05 |
|========================================================================================|
netTransfer =
SeriesNetwork with properties:
Layers: [25×1 nnet.cnn.layer.Layer]
*******************************************************************
Training an R-CNN Object Detector for the following object classes:
* Gun_Weapon
* Gun_Magazine
--> Extracting region proposals from 31 training images...done.
--> Training a neural network to classify objects in training data...
Training on single CPU.
Initializing input data normalization.
|========================================================================================|
| Epoch | Iteration | Time Elapsed | Mini-batch | Mini-batch | Base Learning |
| | | (hh:mm:ss) | Accuracy | Loss | Rate |
|========================================================================================|
| 1 | 1 | 00:00:01 | 30.00% | 1.5414 | 1.0000e-04 |
| 5 | 50 | 00:01:21 | 95.00% | 0.0470 | 1.0000e-04 |
| 9 | 100 | 00:02:38 | 100.00% | 0.0172 | 1.0000e-05 |
| 10 | 120 | 00:03:10 | 95.00% | 0.0673 | 1.0000e-05 |
|========================================================================================|
Network training complete.
--> Training bounding box regression models for each object class...100.00%...done.
Detector training complete.
*******************************************************************
bboxes =
0×4 empty double matrix
score =
0×1 empty single column vector
label =
0×1 empty categorical array
[bboxes,scores] = detect(detector,I); always returns empty boxes and scores, even though the dataset is about 500 images.
You won't believe that I am currently working on this as we speak! Same result, and I have yet to understand the issue.
I have deleted every line and started over, line by line, and the issue has still been challenging me for weeks now.
I would really love it if a professional could assist with this so I can move onward!
Sorry pal, I would love to help; all I can do is post my code for you to see what I have done, in case it makes some sense to you.
Some things may seem out of place! I am a student, so please don't hold me to anything; I am trying to get this working. Forgive me!
My code thus far:
%% Extract region proposals with selective search
%% Conducting Feature Extraction With RCNN
%% Classifying Features With SVM
%% Improving The Bounding Box
clc
clearvars
clear
close all
%% Step 1 Creating Filenames /Loading Data
% anet = alexnet
load('Wgtruth.mat');
load('anet.mat');
save Wgtruth.mat Wgtruth;
save rcnnGuns.mat;
save anet.mat anet;
load('rcnnGuns.mat', 'Wgtruth', 'anet');
%% Step 2 Highlighting Image Input Size
inputSize = anet.Layers(1).InputSize;
anet.Layers;
total_images = size(Wgtruth,1);
%% Step 3 Adding Image Directory For Path To Image Data
imDir = '/Users/mmgp/Documents/MATLAB/2020/RCNN/Wgtruth';
addpath(imDir);
% imDir = fullfile(matlabroot, 'toolbox', 'vision', 'visiondata','Wgtruth');
% addpath(imDir);
%% Step 4 Accessing Contents Of Folder TrainingSet Using Datastore
imds =imageDatastore(imDir,'IncludeSubFolders',true,'LabelSource','Foldernames');
%% Step 5 Splitting Inputs Into Training and Testing Sets
[imdsTrain,imdsValidation] = splitEachLabel(imds,0.7,'randomized');
%% Step 6 Replacing Final Layer/Last 3 Configure For Network classes
% Complex Architecture Layers Has Inputs/Outputs From Multiple Layers
% Finetuning These 3 Layers For New Classification
% Extracting All Layers Except The Last 3
layersTransfer = anet.Layers(1:end-3)
%% Step 7 Specifying Image Categories/Classes From 1000 to Gun (One Class):
numClasses = numel(categories(imdsTrain.Labels));
Tlayers = [
layersTransfer
fullyConnectedLayer(1,'Name','fc8','WeightLearnRateFactor',10,'BiasLearnRateFactor',10);
softmaxLayer('name', 'Softmax')
classificationLayer('Name','ClassfLay')]
%% Step 8 Displaying and Visualising Layer Features Of FC8
% layer(16) = maxPooling2dLayer(5,'stride',2)
% disp(Tlayers)
% layer = 22;
% channels = 1:30;
% I = deepDreamImage(net,layer,channels,'PyramidLevels',1);
% figure
% I = imtile(I,'ThumbnailSize',[64 64]);
% imshow(I)
% name = net.Layers(layer).Name;
% title(['Layer ',name,' Features'])
%% Warp Image & Pixel Labels
% Creates A Randomized 2-D Affine Transformation From A Combination Of Rotation,
% Translation, Scaling (Resizing), Reflection, And Shearing
% Rotate Input Properties By An Angle Selected Randomly From Range [-50,50] Degrees.
%% Step 9 Setting Output Function(images may have size variation resizing for consistency with pretrain net)
pixelRange = [-70 70];
imageAugmenter = imageDataAugmenter('RandRotation',[-70 70],...
'RandXReflection',true,...
'RandYReflection',true,...
'RandXShear',pixelRange,...
'RandYShear',pixelRange,...
'RandXTranslation',pixelRange, ...
'RandYTranslation',pixelRange);
augimdsTrain = augmentedImageDatastore(inputSize(1:2),imdsTrain, ...
'DataAugmentation',imageAugmenter);
%% Step 10 Resizing Images, Assists With Preventing Overfitting
% Utilising Data Augmentation For Resizing Validation Data
% Implemented Without Specifying Overfit Prevention Procedures
% By Not Specifying These Procedures The System Will Be Precise Via
% Predictions; Data Augmentation Prevents The Network From
% Overfitting/Memorizing Exact Details Of Training Images
augmentedTrainingSet = augmentedImageDatastore(inputSize ,imdsTrain,'ColorPreprocessing', 'gray2rgb')
augimdsValidation = augmentedImageDatastore(inputSize,imdsValidation,'ColorPreprocessing', 'gray2rgb')
%% Step 11 Specifying Training Options
% Keep features from earlier layers of the pretrained network for transfer learning
% Specify the epoch training cycle, the mini-batch size and validation data
% Validate the network for each iteration during training.
% SGDM groups the full dataset into disjoint mini-batches; this reaches convergence faster,
% as it updates the network's weights more frequently and increases the
% computational speed
% Implementing **WITH** The RCNN Object Detector
options = trainingOptions('sgdm',...
'Momentum',0.9,...
'InitialLearnRate', 1e-4,...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', 0.1, ...
'Shuffle','every-epoch', ...
'LearnRateDropPeriod', 8, ...
'L2Regularization', 1e-4, ...
'MaxEpochs', 10,...
'MiniBatchSize',80,...
'Verbose', true)
%% Step 12 Training network Consisting Of Transferred & New Layers.
netTransfer = trainNetwork(augmentedTrainingSet,Tlayers,options)
rcnn = trainRCNNObjectDetector(Wgtruth, netTransfer, options, 'NegativeOverlapRange', [0 0.3]);
save('rcnn.mat', 'rcnn')
%% Step 13 Testing R-CNN Detector On Test Image.
img = imread('11.jpg');
[bbox, score, label] = detect(rcnn, img, 'MiniBatchSize', 80)
numObservations = 4;
images = repelem({img},numObservations,1);
bboxes = repelem({bbox},numObservations,1);
labels = repelem({label},numObservations,1);
%% Step 14 Displaying Strongest Detection Results.
[score, idx] = max(score)
bbox = bbox(idx, :)
annotation = sprintf('%s: (Confidence = %f)', label(idx), score)
detectedImg = insertObjectAnnotation(img, 'rectangle', bbox, annotation);
figure
imshow(detectedImg)
Matpar on 10 Feb 2020
Edited: Matpar on 10 Feb 2020
What I do is go to the workspace with the same link you provided and observe what is being produced while the code is running.
By the way, Dinesh Yadav provided some insight, but this same code produced the result perfectly; when I tried another image it did not detect anything, and when I swapped back the image that had the perfect result, the bounding box was empty, and this has been the case since then!
I have been searching for a method to get this accurate, but the challenge is still with me.
If you have another link, I would gladly take a look at it to see if I can move forward, but thus far this is where I have gotten to.


France on 19 Mar 2020

0 votes

Dear Matpar,
I'm in the same situation. Have you understood the problem so far?
Thank you!
Matpar on 19 Mar 2020
Edited: Matpar on 19 Mar 2020

0 votes

Hey France,
Yes, I managed to solve this by myself, and I am so proud; it was frustrating, but I got through it!
  1. Ensure your labeling is completed properly; this is what causes the issue. If the system has nothing to compare the positive region of interest with, guess what? It will not produce the bounding box and the results you are looking for. Unfortunately, loads of details are missing from the examples, and you have to figure this out by trial and error, or hope a kind soul decides to assist you!
  2. Specify the negative blob in the labeling of the same image, or point the system to a folder of images with similar gradient intensities and highlight the negative region of interest, i.e. everything that is not the object you are trying to find.
  3. It is easier to specify the negative blob in the same image.
  4. Ensure the blob is small so that you can select the negative area as well; I will attach an image, check it out.
  5. The positive blob must be the same size as the negative blob in the same image.
  6. If you are confused, that is normal; check the image and you will see what I am talking about.
  7. Export the labeling to the workspace as a table.
  8. Then the code above will work.
  9. The labeling must be done for every image in the dataset, and yes, it is tedious; I suggest playing some music while you do this!
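To illustrate step 7 (a hedged sketch; the file names and box coordinates below are invented for illustration), `trainRCNNObjectDetector` expects a ground-truth table whose first variable holds image file names and whose remaining variables hold one M-by-4 matrix of [x y width height] boxes per object class:

```matlab
% Hypothetical ground-truth table in the layout trainRCNNObjectDetector accepts:
% column 1 = image file names, one further column per labeled class,
% each cell an M-by-4 matrix of [x y width height] boxes.
imageFilename = {'guns/img001.jpg'; 'guns/img002.jpg'};   % made-up paths
Gun_Weapon    = {[100 80 60 40]; [55 60 80 50]};          % one box per image
gTruth = table(imageFilename, Gun_Weapon);

% The table then plugs into the training call from the code above:
% rcnn = trainRCNNObjectDetector(gTruth, netTransfer, options, ...
%     'NegativeOverlapRange', [0 0.3]);
```

Exporting from the Image Labeler as a table produces exactly this shape; if any image has an empty box matrix in its class column, the detector has no positive sample to learn from for that image.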

1 comment

France on 19 Mar 2020
Thanks a lot! I will try all your suggestions!


Asked: 19 Jan 2020
Last commented: 19 Mar 2020

