Load a pretrained neural network object in rlNeuralNetworkEnvironment
Vasu Sharma
on 21 Nov 2023
Answered: Emmanouil Tzorakoleftherakis
on 21 Dec 2023
Hi,
I want to train an RL MBPO agent that samples from a model. The model is a deep learning network object that I have already trained in MATLAB. How can I load its weights into the environment object? The examples for rlNeuralNetworkEnvironment show how to define a network structure, but I would like to use my pretrained weights instead.
Best Regards,
Vasu
0 comments
Answers (1)
Emmanouil Tzorakoleftherakis
on 21 Dec 2023
Hi Vasu,
You can use a pretrained environment model with an MBPO agent as follows (a short sketch follows the steps):
1) Create an rlContinuousDeterministicTransitionFunction with the trained dlnetwork if it is deterministic, or an rlContinuousGaussianTransitionFunction if it is stochastic (mean and standard-deviation heads).
2) Create an rlNeuralNetworkEnvironment using the transition function defined in step 1.
3) Create the MBPO agent.
4) Set LearnRate = 0 in the TransitionOptimizerOptions of rlMBPOAgentOptions so that the transition model is not updated during training.
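A minimal sketch of these steps, assuming a deterministic pretrained dlnetwork called transNet with placeholder layer names ("obsIn", "actIn", "nextObs"), existing obsInfo/actInfo specifications, custom rewardFcn/isDoneFcn functions on the path, and a default SAC base agent:
% Minimal sketch; transNet, the layer names, rewardFcn, and isDoneFcn are placeholders.
% 1) Wrap the pretrained deterministic transition network.
tsnFcn = rlContinuousDeterministicTransitionFunction(transNet, obsInfo, actInfo, ...
    ObservationInputNames="obsIn", ...
    ActionInputNames="actIn", ...
    NextObservationOutputNames="nextObs");
% 2) Build the model environment with known reward and is-done functions.
genEnv = rlNeuralNetworkEnvironment(obsInfo, actInfo, tsnFcn, @rewardFcn, @isDoneFcn);
% 3) Create an off-policy base agent (a default SAC agent here) for the MBPO agent.
baseAgent = rlSACAgent(obsInfo, actInfo);
% 4) Freeze the transition model so it is not updated during training.
mbpoOpts = rlMBPOAgentOptions;
mbpoOpts.TransitionOptimizerOptions = rlOptimizerOptions(LearnRate=0);
agent = rlMBPOAgent(baseAgent, genEnv, mbpoOpts);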
Hope this helps
0 comments