Why do I get an error saying properties cannot be changed when I add an agent in Simulink?
5 views (last 30 days)
When I open an official example, such as the one comparing PID and DDPG controllers for the water-tank level, the original file runs fine. But when I remove the agent from the official example and manually add an "agent" myself, clicking "Run" in the .mlx file produces the following error:


Error using rl.train.SeriesTrainer/run
Error in 'RL_dianyesifu_DDPG_test/RL Agent': Unable to evaluate mask initialization commands.
Error in rl.train.TrainingManager/train (line 479)
run(trainer);
Error in rl.train.TrainingManager/run (line 233)
train(this);
Error in rl.agent.AbstractAgent/train (line 136)
trainingResult = run(trainMgr,checkpoint);
Caused by:
Error using rl.env.internal.reportSimulinkSimError
The 'Tunable' property of 'rlwatertank/RL Agent/RL Agent' cannot be changed while the simulation is running.
I am a beginner, so please give me some advice. Thank you!
2 comments
Emmanouil Tzorakoleftherakis
on 9 Jan 2024
Did you update the agent variable everywhere? Make sure to update it on the RL Agent block in the Simulink model as well
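A quick way to check this is from the MATLAB command line. The sketch below assumes the model is named "rlwatertank" and that the RL Agent block's mask parameter is called 'Agent' (both are assumptions; substitute your own model path):

```matlab
% Open the model and inspect which base-workspace variable the
% RL Agent block currently references (assumed mask parameter 'Agent')
open_system("rlwatertank");
get_param("rlwatertank/RL Agent", "Agent")
```

If the name returned does not match the agent variable you created in your script, the block is still pointing at the old (now missing or mismatched) agent.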
Mxolisi
on 8 Oct 2024
Hi Emmanouil, I am having the same problem. How do we update these variables?
Answers (1)
Harsh
on 21 Mar 2025
Assuming you are referring to the "mlx" file available with the official example from the following documentation page -
To use your own agent in that example, you can load it programmatically from a MAT-file and then modify the "sim" call to use that agent. Below are the changes needed to use a different agent from the one given in the example.
In the “Validate Trained Agent” section –
rng(1)
loadAgentData = load("myAgent.mat");
myAgent = loadAgentData.myAgent;
Then use this “myAgent” with the “sim” function –
experiences = sim(env,myAgent,simOpts);
Also modify the "RL Agent" block in the model by double-clicking it and entering your agent in the "Agent object" field. Note that you first need to load the agent into the base workspace before the "RL Agent" block can use it. Below is a snapshot of the "Block Parameters" window for the "RL Agent" block -
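If you prefer to update the block from a script instead of the dialog, something like the following should work. This is a sketch: "myAgent.mat" and the variable name "myAgent" are the hypothetical names used above, and it assumes the RL Agent block's mask parameter is named 'Agent':

```matlab
% Load the trained agent into the base workspace
loadAgentData = load("myAgent.mat");   % hypothetical file name
myAgent = loadAgentData.myAgent;       % hypothetical variable name

% Point the RL Agent block at that workspace variable
% (assumed mask parameter name 'Agent')
open_system("rlwatertank");
set_param("rlwatertank/RL Agent", "Agent", "myAgent");
```

Either way, the string entered in the block must exactly match the name of the agent variable in the base workspace; a mismatch is a common cause of mask-initialization errors like the one above.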

If you want to understand why you got an error, please share the changes that you made to the "mlx" file before running it.
0 comments