2D matrix to 3D matrix given index in the 2D matrix col1
Dave
on 17 Dec 2014
Commented: Dave
on 18 Dec 2014
Hello, I need to construct a 3D matrix from a 2D matrix.
As an example, take matrix A, which is 7x4.
Each value in the first column (a date) is repeated at most 2 times.
I am looking to take the last 2 elements of each row and form layers of 2x2 matrices, one layer for each date:
A= ...
[735950 1 3 5;
735951 1 3 5;
735953 1 4 6;
735950 2 1 7;
735951 2 2 8;
735952 2 5 9;
735953 2 3 9]
So, in the first case (735950) I need to form a 3D matrix B where the FIRST layer is
B(:,:,1)= ...
[3 5;
1 7]
Note that A is not in ascending order. For this FIRST layer, the 1st row of B comes from row 1 of A, but the 2nd row of B comes from row 4 of A.
The second case (735951) gives the SECOND layer:
B(:,:,2)= ...
[3 5 ;
2 8]
The third case (735952), since it appears only once, should be ignored.
The fourth case (735953) becomes the THIRD layer:
B(:,:,3)= ...
[4 6 ;
3 9]
Thanks for any suggestions.
Accepted Answer
Shoaibur Rahman
on 18 Dec 2014
C = sortrows(A,1);                      % sort rows by the date in column 1
indx = 0;
for k = 1:size(C,1)-1
    if C(k,1) == C(k+1,1)               % date repeated -> build one layer
        indx = indx + 1;
        B(:,:,indx) = C(k:k+1,end-1:end);   % last 2 columns of the pair of rows
    end
end
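As a quick check, running the snippet above on the example A from the question should reproduce the layers listed there. This is only a usage sketch; the variable names simply follow the answer:
A = [735950 1 3 5;
     735951 1 3 5;
     735953 1 4 6;
     735950 2 1 7;
     735951 2 2 8;
     735952 2 5 9;
     735953 2 3 9];
C = sortrows(A,1);                      % group identical dates together
indx = 0;
for k = 1:size(C,1)-1
    if C(k,1) == C(k+1,1)
        indx = indx + 1;
        B(:,:,indx) = C(k:k+1,end-1:end);
    end
end
B(:,:,1)   % [3 5; 1 7]  date 735950
B(:,:,2)   % [3 5; 2 8]  date 735951
B(:,:,3)   % [4 6; 3 9]  date 735953 (735952 occurs only once and is skipped)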