Would I be able to load a Gazebo world as a training environment into the rlSimulinkEnv function?
    Jen-Yu Lee
on 18 Feb 2021
  
    
    
    
    
Commented: Vishal Jadhav on 11 Jul 2022
Hi, I'm trying to use Simulink to run a deep reinforcement learning simulation with TurtleBot3. The tutorial I'm following is the Robotics Arena DRL for Walking Robots series.
The problem I face is that I want to load a Gazebo model, such as turtlebot3_world.world, instead of the walking robot model, but I couldn't find any tutorial about it.
Here is the code from the file createWalkingAgent2D.m:
% Environment
mdl = 'walkingRobotRL2D';
load_system(mdl);
blk = [mdl,'/RL Agent'];
env = rlSimulinkEnv(mdl,blk,observationInfo,actionInfo);
If anyone knows how to do this, please give me some advice. Thank you!!
1 comment
  Vishal Jadhav
on 11 Jul 2022
Hello Jen,
I am trying to use a MATLAB script to apply TB3 to a deep reinforcement learning simulation. Can you share how you defined a custom Gazebo world as the environment for this project? I have my custom world and a Python script for resetting the world in ROS-Gazebo, but I need to do that with MATLAB.
Accepted Answer
  Cam Salzberger
      
on 18 Feb 2021
        Hello Jen-Yu,
Loading a Simulink model that simulates a robot is very different from loading a Gazebo world. There is currently no capability for loading a Gazebo model inside Simulink and accessing its data directly.
You can, however, run Gazebo externally and load the model into that. You can then communicate with Gazebo using Gazebo Co-Simulation, ROS, or ROS 2, and feed the data received from that into your reinforcement learning algorithm. Gazebo Co-Simulation will provide a higher level of control over the Gazebo simulation, and may have more consistent performance. It may be easier to get started with ROS or ROS 2, though, since TurtleBot3 is set up to use it already.
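As a rough illustration of the ROS route, here is a minimal MATLAB sketch of exchanging data with an externally running Gazebo simulation. It assumes Gazebo is already running with TurtleBot3 loaded, a reachable ROS master (the IP address is a placeholder), and the default TurtleBot3 topic names; the data received this way would then feed your RL algorithm.

```matlab
% Sketch: communicating with a ROS-backed Gazebo simulation from MATLAB.
% Assumes ROS Toolbox, a running ROS master, and TurtleBot3 default topics.
rosinit('192.168.1.10');                     % placeholder master IP

scanSub = rossubscriber('/scan');            % laser scan from the simulated robot
velPub  = rospublisher('/cmd_vel', 'geometry_msgs/Twist');

scan = receive(scanSub, 5);                  % wait up to 5 s for one observation

velMsg = rosmessage(velPub);
velMsg.Linear.X = 0.1;                       % apply an action (drive forward)
send(velPub, velMsg);

% Reset the world between training episodes via Gazebo's standard service
resetClient = rossvcclient('/gazebo/reset_world');
call(resetClient);
```

With Gazebo Co-Simulation the equivalent would be done through the Gazebo Pacer and co-simulation blocks in Simulink rather than ROS topics.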
-Cam
3 comments
  Emmanouil Tzorakoleftherakis
    
on 19 Feb 2021
If you don't use Simulink, you don't need rlSimulinkEnv. But you would need to create a custom environment in MATLAB by following the directions here.
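One way such a custom MATLAB environment could be sketched is with rlFunctionEnv from Reinforcement Learning Toolbox. The observation/action specs below and the helper functions myGazeboStep and myGazeboReset are hypothetical placeholders; in practice they would wrap the ROS communication with the running Gazebo simulation described in the accepted answer.

```matlab
% Sketch: wrapping an external Gazebo simulation as a custom MATLAB RL
% environment. Dimensions and helper functions are assumptions.
obsInfo = rlNumericSpec([24 1]);             % e.g. a downsampled laser scan
actInfo = rlFiniteSetSpec({[0.1 0], [0.1 0.5], [0.1 -0.5]});  % [v w] pairs

% Step function must return [Observation, Reward, IsDone, LoggedSignals];
% reset function must return [InitialObservation, LoggedSignals].
stepFcn  = @(action, loggedSignals) myGazeboStep(action, loggedSignals);
resetFcn = @() myGazeboReset();

env = rlFunctionEnv(obsInfo, actInfo, stepFcn, resetFcn);
```

The environment object can then be passed to train just like one created by rlSimulinkEnv.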
More Answers (0)