I am getting this error when I try to train a TD3 RL agent.
Thank you,
Apoorv Pandey

1 comment

Emmanouil Tzorakoleftherakis on 24 Mar 2023
If you share a reproduction model, it will be easier to debug.


Accepted Answer

Cris LaPierre on 24 Mar 2023


When defining your rlQValueFunction, include the ActionInputNames and ObservationInputNames name-value pairs.
% Observation path layers
obsPath = [featureInputLayer( ...
        prod(obsInfo.Dimension), ...
        Name="netObsInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="obsout")];

% Action path layers
actPath = [featureInputLayer( ...
        prod(actInfo.Dimension), ...
        Name="netActInput")
    fullyConnectedLayer(16)
    reluLayer
    fullyConnectedLayer(5,Name="actout")];

%<snip>

critic = rlQValueFunction(net, ...
    obsInfo,actInfo, ...
    ObservationInputNames="netObsInput", ...
    ActionInputNames="netActInput")
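The %<snip> above elides how the two paths are joined into net. For illustration only, here is a minimal sketch of one common way to assemble such a critic network; the common-path layers, layer names ("concat", "QValue"), and the use of a concatenation layer are assumptions, not the asker's actual code, and the exact API (dlnetwork vs. layerGraph) varies by MATLAB release:

```matlab
% Common path: merge the two 5-element path outputs and map to a scalar Q-value.
% (Hypothetical layers/names -- adapt to your own network.)
commonPath = [concatenationLayer(1,2,Name="concat")
    reluLayer
    fullyConnectedLayer(1,Name="QValue")];

% Assemble the network and wire the path outputs into the concatenation layer
net = dlnetwork;
net = addLayers(net,obsPath);
net = addLayers(net,actPath);
net = addLayers(net,commonPath);
net = connectLayers(net,"obsout","concat/in1");
net = connectLayers(net,"actout","concat/in2");
net = initialize(net);
```

With net assembled this way, the rlQValueFunction call above can match its observation and action channels to the "netObsInput" and "netActInput" input layers by name.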

2 comments

Apoorv Pandey on 27 Mar 2023
I have used the exact same code as in the link and I am still getting the error. Please help.
Cris LaPierre on 27 Mar 2023
Please share your data and your code. You can attach files using the paperclip icon. If it's easier, save your workspace variables to a mat file and attach that.


More Answers (0)

Asked: 24 Mar 2023
Commented: 27 Mar 2023
