Deep Neural Network - Validation accuracy unchanged

Martin Kovac on 21 May 2024
Answered: Shivansh on 28 Jun 2024
I would like to ask for help or advice. I am trying to construct and train a deep learning neural network (DLNN) for the stock market in MATLAB. My goal is to forecast the next day's price movement based on five time-series sequences (open, close, high, low price and volume). To the best of my knowledge, the most suitable type of DLNN for such a problem is a recurrent neural network (RNN) using a long short-term memory (LSTM) layer or a bidirectional long short-term memory (BiLSTM) layer. I create the DLNN with the following layer structure and split the data into training, validation and test sets in a ratio of 80%:10%:10%.
The input dataset consists of daily price observations (from 2014 until now); each sample is characterized by five time-series sequences of 365 days (open, close, high, low price and volume) and an associated label (UP or DOWN) corresponding to the close price on the next day. With the 80%:10%:10% split it is possible to create 1721 training, 215 validation and 216 test data groups, each of dimension 365x5. I train with the trainnet function and the adam solver. However, to my surprise, I always get essentially the same validation accuracy of about 50, which I interpret as some kind of systematic error. I get (see picture):
1. oscillation around 50 (if InitialLearnRate is high)
2. a value that stays just above or below 50 (if InitialLearnRate is low)
I suppose I am doing something wrong, but I really don't know where to look for the error. For this reason I turn to you, the more experienced community, for help. The code is below, and I also attach the source code together with the saved workspace (containing dln, the associated DLNN generated from deepNetworkDesigner) if anyone is interested. Here is the Google Drive link (external, because the size is greater than the allowed 5 MB): DLN - market.zip
Thank you for any advice.
CODE:
%% Start Script
load('./Workspace/dln.mat');
filename = './Training data/HistoricalData_1715268339888.csv';
seriesLength = 365;
%% Data import, format, create datasets
dataImport = readmatrix(filename); % data import
dataFlip = flipud(dataImport); % data flipping, older first
samplesSize = size(dataFlip,1)-seriesLength+1; % size (amount) of input samples
% create dataset:
%*************************
data = cell(samplesSize,1);
for i = 1:samplesSize
    data{i,1} = dataFlip(i:(seriesLength-1+i),2:end); % take columns 2:end (the five series)
end
% create labels
%*************************
priceMovement = cell(samplesSize-1,1);
for j = 1:(samplesSize-1)
    if data{j}(end,1) < data{j+1}(end,1)
        priceMovement{j} = 'UP';
    else
        priceMovement{j} = 'DOWN';
    end
end
data = data(1:end-1); % the last day has no forecast, so it is removed
labels = categorical(priceMovement); % two categories of price movement forecast: 1.up 2.down
classNames = categories(labels);
paramsCount = size(data{1},2);
%% Show datasets (first four dataset group)
figure
tiledlayout(2,2)
for i = 1:4
    nexttile
    stackedplot(data{i},DisplayLabels="Parameter "+string(1:paramsCount))
    xlabel("Order [day]")
    title("Class: " + string(labels(i)))
end
%% Partition the data into training, validation and test sets
numObservations = numel(data);
[idxTrain,idxValidation,idxTest] = trainingPartitions(numObservations,[0.8 0.1 0.1]);
% separate Train Data
%*************************
XTrain = data(idxTrain);
TTrain = labels(idxTrain);
% separate Validation Data
%*************************
XValidation = data(idxValidation);
TValidation = labels(idxValidation);
% separate Test Data
%*************************
XTest = data(idxTest);
TTest = labels(idxTest);
%% Build Deep Learning Neural Network by 'deepNetworkDesigner' according to example in URL
% deepNetworkDesigner;
%% Specify the training options.
options = trainingOptions("adam", ...
    MaxEpochs=2000, ...
    InitialLearnRate=0.0005, ...
    GradientThreshold=5, ...            % also tried GradientThreshold=10
    Shuffle="every-epoch", ...
    ValidationData={XValidation,TValidation}, ...
    Plots="training-progress", ...
    OutputNetwork="best-validation", ...
    Acceleration="auto", ...
    Verbose=false, ...
    ValidationFrequency=5, ...
    Metrics="accuracy");
%% Train Neural Network
[netNN,netNNinfo] = trainnet(XTrain,TTrain,dln,"binary-crossentropy",options); % binary classification problem, therefore "binary-crossentropy" instead of "crossentropy"
%% Show Trained Neural Network
show(netNNinfo);
%% Test Neural Network
netScores = minibatchpredict(netNN,XTest); % feed the test samples into the trained network
YTest = scores2label(netScores,classNames); % convert scores to labels
netAcc = mean(YTest == TTest)*100; % classification accuracy in [%]
%% Visualize the predictions in a confusion chart
figure
confusionchart(TTest,YTest)

Answers (1)

Shivansh on 28 Jun 2024
Hi Martin,
It seems like you are facing an issue with your deep learning model where the validation accuracy does not improve and hovers around 50%. This is a common issue with DL models, and in your case it essentially means that the model is just guessing and did not train properly.
Your network architecture seems well constructed to me. I would still suggest experimenting with a plain LSTM first and then increasing the complexity of the model.
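For illustration only, a minimal plain-LSTM classification network for this kind of sequence-to-label task might look like the sketch below (the hidden size is an assumed starting point, not a tuned value). With a two-class softmax output, the usual loss for trainnet is "crossentropy":
numChannels = 5;      % open, close, high, low, volume
numHiddenUnits = 64;  % assumed value, worth experimenting with
numClasses = 2;       % UP / DOWN
layersLSTM = [
    sequenceInputLayer(numChannels,Normalization="zscore")
    lstmLayer(numHiddenUnits,OutputMode="last")
    fullyConnectedLayer(numClasses)
    softmaxLayer];
% netLSTM = trainnet(XTrain,TTrain,layersLSTM,"crossentropy",options);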
The data can also be a reason for improper training. Try normalizing the data, as stock market data can have very different scales across channels. You can also check the percentage of each label in the training data; if the classes are not balanced, consider using techniques like oversampling or undersampling to balance them.
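As a rough sketch (assuming the cell arrays XTrain and XValidation from your code, each cell a 365x5 matrix), per-channel z-score normalization with training-set statistics and a quick label-balance check could look like this:
% z-score each of the five channels using statistics from the training set only
allTrain = cat(1,XTrain{:});   % stack all training sequences into one tall N-by-5 matrix
mu = mean(allTrain,1);
sigma = std(allTrain,0,1);
XTrainNorm = cellfun(@(x) (x-mu)./sigma,XTrain,UniformOutput=false);
XValidationNorm = cellfun(@(x) (x-mu)./sigma,XValidation,UniformOutput=false);
% quick check of the UP/DOWN class balance in the training labels
summary(TTrain)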
There can also be a case of overfitting on the training data. You can introduce regularization into your model and experiment with dropout rates to tackle overfitting.
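For example (the values below are placeholders to experiment with, not recommendations), a dropout layer can be inserted before the fully connected layer and L2 regularization set in the training options:
layersReg = [
    sequenceInputLayer(5,Normalization="zscore")
    lstmLayer(64,OutputMode="last")
    dropoutLayer(0.3)                   % drop 30% of activations; tune the rate
    fullyConnectedLayer(2)
    softmaxLayer];
optionsReg = trainingOptions("adam", ...
    InitialLearnRate=0.0005, ...
    L2Regularization=0.001, ...         % default is 1e-4; increase if the model overfits
    ValidationData={XValidation,TValidation}, ...
    Metrics="accuracy");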
I wanted to try a few things with your model, but the link to the data is not working due to access restrictions.
I hope the above suggestions help you get better results with your model.
