
Multiple-input, single-output

6 views (last 30 days)
Jesus Mª Juarez Ferreras on 21 May 2024
Edited: Jesus Mª Juarez Ferreras on 27 May 2024
I have a MISO system (plantmiso.mdl) with two inputs and one output, and I cannot get it to train.
Each input vector has 100 random numbers from 1 to 100, and so does the output vector. Since I only have one output vector, I have tried several ways of building the second vector I apparently need to include so that
system = trainNetwork(in,out,Networklayers,options);
does not throw an error.
I have padded with 100 zeros at the beginning or at the end, used 100 ones, and computed the mean and used 100 copies of that value, among other things.
Does anyone know how I could train this system with MATLAB?
Thank you

Accepted Answer

Malay Agarwal on 22 May 2024
I understand that you want to train a multi-input, single-output network using the “trainNetwork” function.
You can achieve this by combining the two input arrays and the output array into a single datastore and passing the combined datastore to the “trainNetwork” function.
Arrays can be converted to datastores using the “arrayDatastore” function. Each of the “arrayDatastore” objects can then be combined into a single datastore using the “combine” function. For example, if your input arrays are X1 and X2 and the output array is Y, you can try this:
X1Datastore = arrayDatastore(X1);
X2Datastore = arrayDatastore(X2);
labelDatastore = arrayDatastore(Y);
trainDatastore = combine(X1Datastore, X2Datastore, labelDatastore);
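To verify the pairing, a quick check along these lines should show each read returning a 1-by-3 cell of the form {X1 data, X2 data, Y data}:
% Illustrative check: read one chunk from the combined datastore and
% then rewind it so training still starts from the beginning.
firstChunk = read(trainDatastore)
reset(trainDatastore);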
Once you have the combined datastore, you can pass it to the “trainNetwork” function as follows:
net = trainNetwork(trainDatastore, Networklayers,options);
You can also refer to the following link for a complete example on how to train multi-input networks: https://www.mathworks.com/help/deeplearning/ug/train-network-on-image-and-feature-data.html.
Please note that the example uses the “trainnet” function instead of the “trainNetwork” function since, starting with MATLAB R2024a, the “trainNetwork” function is not recommended. The “trainnet” function has certain advantages over “trainNetwork” which are mentioned here: https://www.mathworks.com/help/deeplearning/ref/trainnetwork.html#mw_868305a5-132e-4203-9214-860584bdcfdd.
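For reference, a minimal sketch of the equivalent “trainnet” call, assuming mean squared error loss for regression (with “trainnet” the loss is passed as a separate argument, and output layers such as “regressionLayer” are left out of the layer array):
% Sketch only: "mse" is the loss function argument; Networklayers here is
% assumed to contain no output layer such as regressionLayer.
net = trainnet(trainDatastore, Networklayers, "mse", options);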
Hope this helps!
3 comments
Jesus Mª Juarez Ferreras on 26 May 2024
Edited: Jesus Mª Juarez Ferreras on 27 May 2024
Hello, the following code throws an error:
clc
clear
close all
A1 = 0.04;
A4 = 0.04;
B1 = 1.5e-4;
B4 = 1.5e-4;
Kp1 = 3.7E-6;
Kp4 = 3.7E-6;
g = 9.8;
v=[12.6101 1.0688 31.0125 37.5489 12.9138 71.4793 95.7939 32.9065 79.1094 67.4771 11.8518 72.6723 88.3776 1.0488 11.0924 37.6680 91.5542 99.8999 17.8789 18.0891 11.2040 61.0086 41.7771 26.3225 20.9668 19.5812 88.4935 63.5100 62.6202 66.1686 18.5314 62.9590 87.7691 18.6421 64.9118 88.5281 25.8011 59.0359 49.2004 84.0695 6.1159 74.2078 55.8971 43.5178 78.1925 40.2701 36.3657 66.2878 40.0230 18.6693 6.3752 25.4650 90.2795 98.2619 48.9268 15.3302 51.6550 99.3592 76.4943 96.6364 93.0582 24.4126 10.6547 41.5677 13.9861 75.7021 35.4622 45.5286 20.7342 73.9153 21.1334 77.0479 24.8867 61.7596 35.2314 72.5405 12.6051 18.8981 20.1926 36.6943 41.0350 23.3522 67.2367 95.7641 43.7348 80.5776 80.0207 99.5289 21.9686 33.5729 76.4718 75.9192 96.1664 57.4193 61.2421 37.1441 35.0112 60.3988 53.0373 49.0117];
p=[65.9094 4.7869 19.8024 92.9458 58.7150 2.5256 30.0739 5.4808 79.6784 71.4252 54.0998 59.0927 21.8838 41.9776 52.0662 83.9952 66.3312 48.6364 85.4237 48.6713 27.9573 73.3884 96.9775 38.1866 24.6214 25.0278 2.8750 98.2827 80.4593 56.7850 39.4941 65.6003 99.3771 71.6628 11.1343 45.2816 69.5867 33.9626 92.2531 63.6166 93.2864 43.1661 30.7485 89.0138 2.7200 15.6575 95.9571 71.7325 31.3402 82.9815 80.9813 91.0163 64.6385 63.2399 12.7286 91.9783 62.7741 26.5009 95.1791 5.5744 3.0718 9.7170 52.2664 87.9877 40.8157 56.5710 60.2174 8.4300 39.7575 36.1435 25.2341 11.6948 12.1166 27.4533 27.3728 93.6720 19.4315 51.2361 15.6118 92.1493 93.0167 14.5384 87.2854 2.2232 72.4816 87.9799 39.3572 25.3966 12.0606 84.8313 66.3062 78.1866 36.4977 19.2294 2.0078 9.5056 33.6173 30.7822 51.1505 39.2739];
sim_in = [v, p];
sim_out = [7.1311 6.3588 6.0545 7.8257 9.1521 8.0623 7.0667 6.2499 6.0917 6.2498 7.5273 7.2557 6.3734 7.3573 8.4913 9.8310 8.9261 7.8193 10.0257 10.6941 10.6282 10.7414 12.1362 12.1442 11.7505 11.4088 10.1702 10.6450 10.8248 10.4919 10.7776 10.6827 10.0315 11.5522 10.4712 9.5101 10.7533 10.1794 11.1625 10.3909 13.2449 12.4087 11.7282 12.7822 11.4824 10.6570 12.2692 12.0873 11.6660 13.5195 15.6611 17.3295 16.0747 14.6372 13.5176 15.7546 15.6847 14.2223 13.8570 12.4838 11.1820 10.2803 11.2498 12.3825 12.6999 12.0127 12.5235 11.4221 11.6250 10.7953 10.4960 9.4175 8.6992 8.0898 7.8539 8.0031 7.7384 8.6275 8.1134 9.7540 11.1313 10.4081 10.5331 9.3326 10.0847 9.6961 8.9038 7.8016 7.2000 8.8101 8.4298 8.2227 7.2229 6.6039 5.6859 5.0770 5.2510 4.9641 5.2480 5.3237];
v_Datastore = arrayDatastore(v);
p_Datastore = arrayDatastore(p);
out_Datastore=arrayDatastore(sim_out);
trainDatastore = combine(v_Datastore, p_Datastore, out_Datastore);
% --------------- CREATE AND TRAIN NETWORK --------------------------
% Network architecture
numResponses = 1;
featureDimension = 1;
numHiddenUnits = 400;
maxEpochs = 400;
miniBatchSize = 300;
Networklayers = [sequenceInputLayer(featureDimension) ...
lstmLayer(numHiddenUnits) ...
dropoutLayer(0.02),...
fullyConnectedLayer(numResponses) ...
regressionLayer
];
options = trainingOptions('adam', ...
'MaxEpochs',maxEpochs, ...
'MiniBatchSize',miniBatchSize, ...
'GradientThreshold',20, ...
'Shuffle','once', ...
'Plots','training-progress',...
'LearnRateSchedule','piecewise',...
'LearnRateDropPeriod',200,...
'L2Regularization',1e-3,...
'LearnRateDropFactor',0.5,...
'Verbose',0,...
'ValidationData',[{sim_in} {sim_out}]);
% TRAINING
net = trainnet(trainDatastore, Networklayers,options);
An argument is missing in the function call, but I don't know which one to add. Thank you.
Jesus Mª Juarez Ferreras on 26 May 2024
Edited: Jesus Mª Juarez Ferreras on 27 May 2024
If I use trainNetwork instead, the error is:
Error using trainNetwork (line 191)
Invalid validation data. Sequence responses must have the same sequence length as the
corresponding predictors.
Error in miso_directo (line 52)
net = trainNetwork(trainDatastore,Networklayers,options);
I have two inputs of 100 elements and one output of 100 elements.
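One likely source of that message, as a sketch rather than a confirmed fix: sim_in = [v, p] concatenates the two inputs into a single 1-by-200 sequence, while sim_out is 1-by-100, so the validation predictors and responses have different sequence lengths. Assuming the two inputs are meant to be two features of one 100-step sequence, the lengths could be matched like this:
% Sketch only: stack the two signals as rows so the validation predictor is
% 2-by-100 (features x time steps), the same sequence length as sim_out.
sim_in_val = [v; p];
% Then use 'ValidationData',{sim_in_val,sim_out} together with
% sequenceInputLayer(2) so the input feature dimension matches.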


More Answers (0)
