Why is the derivative of a function or signal one sample shorter than the original?

15 views (last 30 days)
Hi All
Taking the derivative of y = x.^2 over x = 0:100 using diff(y) gives a signal with one fewer sample: instead of a length of 101, it has 100 samples. So with every derivative the signal gets one sample shorter, and if I have to integrate again, maybe it will need those samples back? What should I do?

Accepted Answer

Star Strider on 11 Apr 2020
Use the gradient function to calculate the numerical derivatives. The output is the same length as the input. It is also generally more accurate, since gradient uses central differences for the interior points, while diff uses one-sided differences. The function assumes regularly-sampled data; however, if the sampling intervals are not constant, a workaround is:
dydx = gradient(y) ./ gradient(x);
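For example, a minimal sketch (variable names are illustrative) comparing the two on the signal from the question:
x = 0:100;                        % 101 samples
y = x.^2;                         % signal from the question
d1 = diff(y);                     % forward differences: 100 samples, one shorter than y
d2 = gradient(y) ./ gradient(x);  % central differences: 101 samples, same length as y
% d2 approximates dy/dx = 2*x at every sample of x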

More Answers (1)

Cris LaPierre on 11 Apr 2020
diff takes the difference between adjacent data points. Because each difference uses two points, the result is one data point shorter than the input: n points yield n-1 differences.
For a simple array x = [3 4 5], diff(x) = [4-3 5-4] = [1 1].
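If you need to integrate back, a minimal sketch of undoing diff with cumsum, assuming the first sample x(1) was kept (diff discards it):
x  = [3 4 5];
dx = diff(x);                    % [1 1], one sample shorter
xr = [x(1), x(1) + cumsum(dx)];  % [3 4 5], original recovered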
  2 comments
farzad on 11 Apr 2020
I see. So how would you handle it? Should the derivative of a signal always be shorter?
farzad on 11 Apr 2020
If the signal is over time, would it be meaningful or meaningless for its velocity or acceleration to be a shorter signal?

