Splitting the input layer of a deep neural network (used for the actor of a DDPG agent)
Hello everyone
I am using a DDPG agent to control my robot. I want to design a neural network for my actor with an architecture similar to the figure below. Ideally, I want to use an imageInputLayer of size [17 1 1] as the input and then split it into two branches, each connected to only nine of the input elements (one element is shared) and ending in a different output neuron. Finally, these two output neurons should be concatenated. I would appreciate it if someone could show me how to do this.

0 comments
Answers (1)
Anh Tran
on 18 Sep 2020
You can define two observation specifications on the environment, so the agent receives already-split inputs to begin with. Moreover, since your observations are vector-based, you can use featureInputLayer (R2020b) instead of imageInputLayer.
% Define two separate observation channels for the environment
obsInfo1 = rlNumericSpec([9,1]);  % first branch: 9 elements
obsInfo2 = rlNumericSpec([9,1]);  % second branch: 9 elements (one shared)
obsInfo = [obsInfo1 obsInfo2];
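To make this concrete, here is a minimal sketch of a two-branch actor network built with layerGraph, assuming R2020b or later. The layer names ('state1', 'state2', etc.) and layer sizes are illustrative; the input layer names must match whatever you later pass to the 'Observation' option of the representation.

```matlab
% Branch 1: first 9-element observation channel -> one output neuron
branch1 = [
    featureInputLayer(9,'Normalization','none','Name','state1')
    fullyConnectedLayer(16,'Name','fc1')
    reluLayer('Name','relu1')
    fullyConnectedLayer(1,'Name','out1')];

% Branch 2: second 9-element observation channel -> one output neuron
branch2 = [
    featureInputLayer(9,'Normalization','none','Name','state2')
    fullyConnectedLayer(16,'Name','fc2')
    reluLayer('Name','relu2')
    fullyConnectedLayer(1,'Name','out2')];

actorNetwork = layerGraph(branch1);
actorNetwork = addLayers(actorNetwork,branch2);

% Concatenate the two scalar outputs along dimension 1 into a [2 1] vector
actorNetwork = addLayers(actorNetwork,concatenationLayer(1,2,'Name','concat'));
actorNetwork = connectLayers(actorNetwork,'out1','concat/in1');
actorNetwork = connectLayers(actorNetwork,'out2','concat/in2');
```

Because the environment provides two observation channels (obsInfo above), each featureInputLayer receives its own 9-element vector, and the shared element simply appears in both channels.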
1 comment
Heesu Kim
on 21 Jan 2021
Hi.
Is there anything else that must be modified after separating the observations?
I am trying an actor-critic model with separate observation inputs (exactly the same as in the question), and modified the actor and critic objects as follows.
(before)
actor = rlStochasticActorRepresentation(actorNetwork,obsInfo,actInfo,...
'Observation',{'state'},actorOpts);
(after)
actor = rlStochasticActorRepresentation(actorNetwork,obsInfo,actInfo,...
'Observation',{'state1, state2'},actorOpts);
However, I'm getting an error like:
Caused by:
Error using
rl.representation.rlAbstractRepresentation/validateInputData
(line 507)
Input data must be a cell array of
compatible dimensions with observation
and action info specifications.
I was not able to figure out what I should change. Is there anything else that needs to be modified after separating the observations?
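For what it's worth, the error in the snippet above most likely comes from passing both channel names inside a single string: {'state1, state2'} is a cell array with one element. The 'Observation' option expects one name per input layer, each as its own cell element. A sketch of the corrected call, assuming the network's input layers are named 'state1' and 'state2':

```matlab
% Each observation name is a separate cell element and must match
% the name of the corresponding input layer in actorNetwork
actor = rlStochasticActorRepresentation(actorNetwork,obsInfo,actInfo,...
    'Observation',{'state1','state2'},actorOpts);
```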