How to transform a vector into a binary matrix with a row constraint?

Hi!
I have the vector v=[1,4,3,2] and I want to transform it into a binary matrix under the following constraint: each row can contain only one '1'.
This is the required result:
V=[1 0 0 0
0 1 0 0
0 1 0 0
0 1 0 0
0 1 0 0
0 0 1 0
0 0 1 0
0 0 1 0
0 0 0 1
0 0 0 1]
Can anyone help me?
2 comments
dpb on 6 Sep 2022
How did you arrive at the required result from the input?
Safia on 6 Sep 2022
I haven't worked that out yet; I just want the result in this form.


Accepted Answer

Dyuman Joshi on 6 Sep 2022
From the apparent relation between v and the required matrix (each element v(i) gives the number of consecutive rows whose '1' falls in column i):
v = [1,4,3,2];
y = zeros(sum(v), numel(v));   % one row per unit in v, one column per element of v
z = 1;                         % starting row of the current block
for i = 1:numel(v)
    y(z:v(i)+z-1, i) = 1;      % set v(i) consecutive rows of column i to 1
    z = z + v(i);              % advance to the next block of rows
end
y
y = 10×4
     1     0     0     0
     0     1     0     0
     0     1     0     0
     0     1     0     0
     0     1     0     0
     0     0     1     0
     0     0     1     0
     0     0     1     0
     0     0     0     1
     0     0     0     1
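
For comparison, a vectorized sketch of the same v-to-matrix mapping; it assumes the same interpretation (v(i) consecutive rows get their '1' in column i) and uses repelem and sub2ind instead of the loop above, so it is an alternative rather than part of the accepted answer:

v = [1,4,3,2];
cols = repelem(1:numel(v), v);             % column index for each of the sum(v) rows, e.g. [1 2 2 2 2 3 3 3 4 4]
Y = zeros(sum(v), numel(v));
Y(sub2ind(size(Y), 1:sum(v), cols)) = 1    % place one '1' per row, in the column given by cols

Both approaches produce the same 10×4 matrix for this input.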

More Answers (0)
