
Deep learning numerical regression, no images, custom loss function

10 views (last 30 days)
帅逼 周 on 20 Jun 2024
Edited: Matt J on 27 Jun 2024 at 20:52
I want to define a neural network (deep learning model). I have a [500 x 4] data matrix: 500 samples, each with 4 features (x1, x2, x3, x4).
The output variables are y1 and y2 ([500 x 2]), but I don't have any output data; I only know their ranges (e.g., y1 lies in (0, 1)).
I do have a measured variable z, where z = 5*e^(y1) + 7*sin(y2).
The loss function would be defined as: z(measured) - z(y1, y2).
The purpose of this neural network is to estimate y1 and y2 from x1, x2, x3, x4.
For instance:
I have information about 500 cats: x1 (height), x2 (weight), x3 (food intake), x4 (excretion).
I also know the age of these 500 cats: z.
Now I want to estimate y1 (cancer probability) and y2 (hair loss). The range of y1 is 0 to 1, and the range of y2 is -10 to 10.
Do you know how to set up such a deep learning network? Is there a simple example?
  2 comments
Matt J on 20 Jun 2024
I have mild doubts about whether decoding two hidden variables (y1,y2) from only a single observed variable (z) is a well-posed regression problem.
帅逼 周 on 24 Jun 2024 at 2:15
Of course, z is not unique; each sample has its own value z_i(y1, y2, h_i). I still have a problem: there is no limit on the range of y, so y takes unreasonable values even though the loss is very small.
Thank you very much for your answer; it has been very helpful to me.


Answers (1)

Matt J on 20 Jun 2024
Edited: Matt J on 20 Jun 2024
I don't know what kind of hidden layer architecture you would want for such an application, but the network below (layer graph attached) is a possible starting point. Note that the final concatenationLayer, which combines y1 and y2 into a single array, is optional. You could just have a network with two separate outputs.
load layer_graph                 % loads the attached layerGraph as lgraph
Xdata = rand(4,500);             % 500 samples, 4 features each
Ydata = rand(2,500);             % simulated hidden variables [y1; y2]
Zdata = 5*exp(Ydata(1,:)) + 7*sin(Ydata(2,:));   % measured z
% Custom loss: mean absolute error between predicted and measured z
loss = @(Y,T) mean( abs( 5*exp(Y(1,:)) + 7*sin(Y(2,:)) - T ) );
net = trainnet(Xdata', Zdata', dlnetwork(lgraph), ...
    loss, trainingOptions('adam','MaxEpochs',500))
    Iteration    Epoch    TimeElapsed    LearnRate    TrainingLoss
    _________    _____    ___________    _________    ____________
            1        1       00:00:00        0.001          6.0419
           50       17       00:00:01        0.001          4.0253
          100       34       00:00:02        0.001          4.5974
          150       50       00:00:02        0.001          4.1565
          200       67       00:00:03        0.001          3.6452
          250       84       00:00:03        0.001          4.2532
          300      100       00:00:04        0.001           3.835
          350      117       00:00:04        0.001           3.313
          400      134       00:00:05        0.001          3.8312
          450      150       00:00:05        0.001          3.2277
          500      167       00:00:06        0.001          2.8941
          550      184       00:00:07        0.001          3.2641
          600      200       00:00:07        0.001          2.6461
          650      217       00:00:08        0.001          2.4459
          700      234       00:00:08        0.001          2.9079
          750      250       00:00:09        0.001          2.4718
          800      267       00:00:09        0.001          2.3679
          850      284       00:00:10        0.001          2.7627
          900      300       00:00:10        0.001          2.4201
          950      317       00:00:11        0.001          2.3235
         1000      334       00:00:11        0.001          2.7362
         1050      350       00:00:12        0.001          2.4036
         1100      367       00:00:12        0.001          2.3354
         1150      384       00:00:13        0.001          2.6991
         1200      400       00:00:13        0.001           2.398
         1250      417       00:00:14        0.001          2.3482
         1300      434       00:00:14        0.001          2.6824
         1350      450       00:00:15        0.001          2.3949
         1400      467       00:00:15        0.001          2.3518
         1450      484       00:00:16        0.001          2.6808
         1500      500       00:00:17        0.001          2.3937
Training stopped: Max epochs completed
net =
  dlnetwork with properties:

         Layers: [7x1 nnet.cnn.layer.Layer]
    Connections: [7x2 table]
     Learnables: [4x3 table]
          State: [0x3 table]
     InputNames: {'featureinput'}
    OutputNames: {'Y'}
    Initialized: 1

  View summary with summary.
  2 comments
帅逼 周 on 24 Jun 2024 at 2:14
Of course, z is not unique; each sample has its own value z_i(y1, y2, h_i). I still have a problem: there is no limit on the range of y, so y takes unreasonable values even though the loss is very small.
Thank you very much for your answer; it has been very helpful to me.
Matt J on 26 Jun 2024 at 2:26
Edited: Matt J on 27 Jun 2024 at 20:52
I still have a problem, which is that there is no limit to the range of y
You should be seeing y in the appropriate range with the general architecture I gave you. The tanh and clippedRelu layers are implicitly bounded.
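To make the bounds explicit rather than relying on the attached layer graph, here is a minimal sketch of one way to enforce the ranges. The layer sizes are illustrative assumptions, not the attached layer_graph: the network's final sigmoidLayer squashes both outputs into (0,1), y1 is used directly, and y2 is rescaled to (-10,10) inside the loss.

```matlab
% Hypothetical sketch: both raw outputs land in (0,1) via sigmoidLayer.
layers = [ featureInputLayer(4)
           fullyConnectedLayer(32)      % hidden width is an arbitrary choice
           tanhLayer
           fullyConnectedLayer(2)
           sigmoidLayer ];              % row 1 -> y1 in (0,1), row 2 -> s in (0,1)

% Map s in (0,1) to y2 = 20*s - 10 in (-10,10) inside the loss,
% then compare the predicted z against the measured z as before.
loss = @(Y,T) mean( abs( 5*exp(Y(1,:)) + 7*sin(20*Y(2,:)-10) - T ) );

net = trainnet(Xdata', Zdata', dlnetwork(layers), loss, ...
               trainingOptions('adam','MaxEpochs',500));
```

With this construction the recovered y1 and y2 cannot leave their stated ranges, regardless of how small the training loss gets; to read them off afterwards, apply the same 20*s - 10 rescaling to the second row of predict(net, Xdata').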
Thank you very much for your answer, it has been very helpful to me.
You're welcome, but please Accept-click the answer if it solves your problem.

