How can I use RL Agent with PX4 Host Target?

5 views (last 30 days)
Unmanned Aerial and Space Systems
Unmanned Aerial and Space Systems on 17 May 2022
Answered: Ankur Bose on 24 Jan 2023
Hi, I have a question related to Pixhawk and Reinforcement Learning Toolbox. I want to use PX4 Host Target to run the RL training algorithm before deploying the RL algorithm to the real Pixhawk board. But when I start training, MATLAB shows me the following warnings. How can I use RL Agent and PX4 Host Target together?
-----------------------------------------------------------------------------------------------------------------------
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.PWM System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Publisher System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
-----------------------------------------------------------------------------------------------------------------------
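For context, these warnings refer to the standard `matlab.System` serialization hooks. The `px4.internal.*` classes are MathWorks-internal and cannot be edited by users, but the following sketch (the class name `MySubscriber` and its `Channel` property are hypothetical) shows the `saveObjectImpl`/`loadObjectImpl` pattern the warning is asking for, assuming a System object with private state:

```matlab
% Hypothetical sketch: a System object with a private property that
% implements both serialization hooks so save/load/clone work while
% the object is locked.
classdef MySubscriber < matlab.System
    properties (Access = private)
        Channel   % example private state
    end
    methods (Access = protected)
        function s = saveObjectImpl(obj)
            % Save the public properties, then append private state.
            s = saveObjectImpl@matlab.System(obj);
            s.Channel = obj.Channel;
        end
        function loadObjectImpl(obj, s, wasLocked)
            % Restore private state, then the public properties.
            obj.Channel = s.Channel;
            loadObjectImpl@matlab.System(obj, s, wasLocked);
        end
    end
end
```

Note that, as mentioned in the comments below, these warnings are likely unrelated to the RL training failure itself.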
  5 comments
Ankur Bose
Ankur Bose on 20 May 2022
I don't think these warnings are responsible for the RL issue you are facing. I suggest reaching out to MathWorks Tech Support: https://www.mathworks.com/support/contact_us.html
Unmanned Aerial and Space Systems
Unmanned Aerial and Space Systems on 20 May 2022
Edited: Unmanned Aerial and Space Systems on 20 May 2022
I have been trying to figure out this problem for three weeks, but I haven't received any feedback through the contact form.


Answers (1)

Ankur Bose
Ankur Bose on 24 Jan 2023
Manually closing this question, as the user has been advised to reach out to MathWorks Tech Support.
