Load data into experience buffer: DDPG agent

Daksh Shukla on 23 Feb 2020
Commented: Arman Ali on 1 Aug 2022
I am using RL Toolbox version 1.1 with MATLAB R2019b and using the DDPG agent to design a controller. Is there a way to load data (state, action, reward, next state) collected from real experiments into the experience buffer before starting training?

Answers (2)

JiaZheng Yan on 31 Mar 2020
I found a way to show the Memory of the experience buffer.
You can open the file "ExperienceBuffer.m", which is in "...\Matlab\toolbox\rl\rl\+rl\+util".
In this file, you can see the property value of the variable Memory.
Then set:
agentOpts.SaveExperienceBufferWithAgent = true;
agentOpts.ResetExperienceBufferBeforeTraining = false;
After training, you can read the data in agent.ExperienceBuffer.Memory.
This also means that you can modify and reuse the training data.
I hope this method works for you :)
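Putting the two options above together, the workflow might look like the following sketch. The `actor`, `critic`, and `env` objects are assumptions standing in for your own networks and environment; the option names are the ones quoted above from rlDDPGAgentOptions.

```matlab
% Sketch (assumes existing actor/critic representations and an environment).
agentOpts = rlDDPGAgentOptions;
agentOpts.SaveExperienceBufferWithAgent = true;        % keep the buffer when the agent is saved
agentOpts.ResetExperienceBufferBeforeTraining = false; % reuse the buffer when training resumes
agent = rlDDPGAgent(actor, critic, agentOpts);

trainingStats = train(agent, env, trainOpts);

% After training, inspect the stored transitions:
mem = agent.ExperienceBuffer.Memory;  % experiences (s, a, r, s', isDone)
```

Because ResetExperienceBufferBeforeTraining is false, a second call to train continues from the saved buffer rather than starting empty.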
Arman Ali on 1 Aug 2022
Have you found the answer? If yes, please guide.


Priyanshu Mishra on 26 Feb 2020
Hi Daksh,
You may find the following link useful.
  1 Comment
Daksh Shukla on 26 Feb 2020
Hello Priyanshu,
Thanks for your response.
However, the link does not exactly resolve the problem I am having. The link talks about running a lot of initial simulations and saving the agent with the experience buffer. But what I would like to do is use data from "real experiments", NOT simulations. I would like to add this data to the experience buffer (replay memory) to kick-start the DDPG learning.
Based on all my reading and attempts to access the experience buffer in MATLAB, it seems the experience buffer object is a hidden property and I cannot upload data to it directly from an external source.
I would really appreciate it if you could let me know a direct way to upload data to the experience buffer, if there is one.
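For readers on later releases: in R2019b the buffer is indeed a hidden property with no documented way to inject external data, but newer versions of Reinforcement Learning Toolbox (R2022a and later) expose a standalone rlReplayMemory object with an append function, which allows pre-loading transitions collected outside simulation. A hedged sketch, assuming observation/action dimensions of 4 and 1 (replace with your own specs):

```matlab
% Sketch using rlReplayMemory (available from R2022a onward).
% obsInfo/actInfo here are placeholder specs; use your environment's.
obsInfo = rlNumericSpec([4 1]);
actInfo = rlNumericSpec([1 1]);
buffer  = rlReplayMemory(obsInfo, actInfo, 10000);  % capacity 10000

% One transition from a real experiment, shaped to match the specs:
exp.Observation     = {rand(4,1)};   % state s
exp.Action          = {rand(1,1)};   % action a
exp.Reward          = 1;             % reward r
exp.NextObservation = {rand(4,1)};   % next state s'
exp.IsDone          = 0;             % episode-termination flag

append(buffer, exp);  % add the experience to the replay memory
```

Whether a pre-filled buffer can be attached to a DDPG agent depends on the release; check the rlReplayMemory documentation for your version, as the details above are an assumption based on the newer API rather than something available in R2019b.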




