How can I set the initial value of action space while using Simulink DDPG Agent?
lei wang
on 20 Nov 2024 at 14:48
Answered: Shlok
on 29 Nov 2024 at 6:52
I have a robot model in Simulink, and I want to train the robot using a DDPG agent.
My question is: how can I set the initial value of the action space? I want the action to start from a specific value, such as zero.
Answers (1)
Shlok
on 29 Nov 2024 at 6:52
Hi Lei,
DDPG agents are designed to operate in continuous action spaces. To define a custom continuous action space for a DDPG agent, you can use the “rlNumericSpec” function, which creates a specification object for a numeric action or observation channel. Setting its “LowerLimit” property to zero bounds the actions so they cannot go below zero.
Here is some sample code:
actionInfo = rlNumericSpec([1 1], 'LowerLimit', 0, 'UpperLimit', 1);
observationInfo = rlNumericSpec([4 1]);       % placeholder size; match your robot's observation channel
agtInitOpts = rlAgentInitializationOptions;   % default agent initialization options
agent = rlDDPGAgent(observationInfo, actionInfo, agtInitOpts);
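As a quick sanity check, here is a minimal sketch (assuming the 4-by-1 observation channel from the sample above) that samples an action from the untrained agent and confirms it lies within the specified limits. By default, an agent created from a bounded action specification scales its actor output to stay within those limits:
obs = {zeros(4,1)};           % observation matching the assumed observationInfo size
act = getAction(agent, obs);  % getAction returns the action(s) in a cell array
disp(act{1})                  % the value should fall within [0, 1]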
To know more about the “rlNumericSpec” function, you can refer to the MATLAB documentation.