Mean of selected range of a matrix based on a range of values from another matrix
Adi Purwandana
on 10 Oct 2024
Hello everyone,
I have a mat file (attached) containing 4 variables: month, sa, ta, and sig. I want to get the mean and standard deviation of sa and ta for each month, restricted to a specific range of sig (say, sig between 27.4 and 27.5).
So, the intended output should be like this:
Thank you!
2 comments
Shivam Gothi
on 10 Oct 2024
What I understand is that you want to find the mean and the standard deviation of only those values of "ta" and "sa" for which "sig" is within the range 27.4 - 27.5, and that the "sig_range" differs from month to month.
Is my understanding of the question correct?
Accepted Answer
Voss
on 10 Oct 2024
load('data_my.mat')
T = table(month,sa,ta,sig);
% keep only rows where sig is in the 27.4 - 27.5 range
idx = sig >= 27.4 & sig < 27.5;
G = groupsummary(T(idx,:),'month',{'mean','std'},{'sa','ta'})
% for reference, the same summary over all sig values
G_all = groupsummary(T,'month',{'mean','std'},{'sa','ta'})
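For reference, an equivalent approach without building a table is to use findgroups and splitapply on the filtered vectors. This is a minimal sketch, assuming month, sa, ta, and sig loaded from data_my.mat are numeric column vectors of the same length:
% group the rows inside the sig range by month
idx = sig >= 27.4 & sig < 27.5;           % rows within the sig range
[g, months] = findgroups(month(idx));     % group number per row, plus the list of months
mean_sa = splitapply(@mean, sa(idx), g);  % per-month mean of sa
std_sa  = splitapply(@std,  sa(idx), g);  % per-month std of sa
mean_ta = splitapply(@mean, ta(idx), g);  % per-month mean of ta
std_ta  = splitapply(@std,  ta(idx), g);  % per-month std of ta
[months, mean_sa, std_sa, mean_ta, std_ta] % one row per month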
4 comments
More Answers (0)