spectralEntropy

Spectral entropy for signals and spectrograms

Syntax

``entropy = spectralEntropy(x)``

``entropy = spectralEntropy(x,f)``
``entropy = spectralEntropy(x,f,Name=Value)``
``spectralEntropy(___)``

Description

`entropy = spectralEntropy(x,f)` returns the spectral entropy of the signal, `x`, over time. How the function interprets `x` depends on the shape of `f`.

`entropy = spectralEntropy(x,f,Name=Value)` specifies options using one or more name-value arguments.

`spectralEntropy(___)` with no output arguments plots the spectral entropy. You can specify an input combination from any of the previous syntaxes. If the input is in the time domain, the spectral entropy is plotted against time. If the input is in the frequency domain, the spectral entropy is plotted against frame number.

Examples

Create a chirp signal with white Gaussian noise and calculate the entropy using default parameters.

```
fs = 1000;
t = (0:1/fs:10)';
f1 = 300;
f2 = 400;
x = chirp(t,f1,10,f2) + randn(length(t),1);
entropy = spectralEntropy(x,fs);
```

Plot the spectral entropy against time.

`spectralEntropy(x,fs)`

Create a chirp signal with white Gaussian noise and then calculate the spectrogram using the `stft` function.

```
fs = 1000;
t = (0:1/fs:10)';
f1 = 300;
f2 = 400;
x = chirp(t,f1,10,f2) + randn(length(t),1);
[s,f] = stft(x,fs,FrequencyRange="onesided");
s = abs(s).^2;
```

Calculate the entropy of the spectrogram over time.

`entropy = spectralEntropy(s,f);`

Plot the spectral entropy against the frame number.

`spectralEntropy(s,f)`

Create a chirp signal with white Gaussian noise.

```
fs = 1000;
t = (0:1/fs:10)';
f1 = 300;
f2 = 400;
x = chirp(t,f1,10,f2) + randn(length(t),1);
```

Calculate the entropy of the power spectrum over time. Calculate the entropy for 50 ms Hamming windows of data with 25 ms overlap. Use the range from 62.5 Hz to `fs`/2 for the entropy calculation.

```
entropy = spectralEntropy(x,fs, ...
    Window=hamming(round(0.05*fs)), ...
    OverlapLength=round(0.025*fs), ...
    Range=[62.5,fs/2]);
```

Plot the entropy against time.

```
spectralEntropy(x,fs, ...
    Window=hamming(round(0.05*fs)), ...
    OverlapLength=round(0.025*fs), ...
    Range=[62.5,fs/2])
```

Input Arguments

Input signal, specified as a vector, matrix, 3-D array, or timetable. How the function interprets `x` depends on the shape of `f`.

If `x` is a timetable, it can have one or more variables, and each variable can have one or more columns. The output `entropy` is a timetable. In that case, do not specify `f`.

Data Types: `single` | `double`

Sample rate or frequency vector in Hz, specified as a scalar or vector, respectively. How the function interprets `x` depends on the shape of `f`:

• If `f` is not specified and `x` is a numeric vector or matrix, `spectralEntropy` assumes `x` is sampled at a rate equal to 1 Hz. If `f` is not specified and `x` is a timetable, `spectralEntropy` infers the sample rate from `x`.

• If `f` is a scalar, `x` is interpreted as a time-domain signal, and `f` is interpreted as the sample rate. In this case, `x` must be a real vector or matrix. If `x` is specified as a matrix, the columns are interpreted as individual channels.

• If `f` is a vector, `x` is interpreted as a frequency-domain signal, and `f` is interpreted as the frequencies, in Hz, corresponding to the rows of `x`. In this case, `x` must be a real L-by-M-by-N array, where L is the number of spectral values at given frequencies of `f`, M is the number of individual spectra, and N is the number of channels.

The number of rows of `x`, L, must be equal to the number of elements of `f`.

Data Types: `single` | `double`

Name-Value Arguments

Specify optional pairs of arguments as `Name1=Value1,...,NameN=ValueN`, where `Name` is the argument name and `Value` is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose `Name` in quotes.

Example: `Window=hamming(256)`

Note

The following name-value arguments apply if `x` is a time-domain signal. If `x` is a frequency-domain signal, only the `Instantaneous` and `Scaled` arguments apply.

Window applied in the time domain, specified as a real vector. The number of elements in the vector must be in the range [1, `size(x,1)`]. The number of elements in the vector must also be greater than `OverlapLength`. If you do not specify `Window`, `spectralEntropy` uses a window length that splits `x` into eight overlapping segments.

Data Types: `single` | `double`

Number of samples overlapped between adjacent windows, specified as an integer in the range [0, `size(Window,1)`). If you do not specify `OverlapLength`, `spectralEntropy` uses a value that results in 50% overlap between segments.

Data Types: `single` | `double`

Number of bins used to calculate the DFT of windowed input samples, specified as a positive scalar integer. If unspecified, `FFTLength` defaults to the number of elements in the `Window`.

Data Types: `single` | `double`

Frequency range in Hz, specified as a two-element row vector of increasing real values in the range [0, `f`/2].

Data Types: `single` | `double`

Spectrum type (`SpectrumType`), specified as `"power"` or `"magnitude"`:

• `"power"` –– The spectral entropy is calculated for the one-sided power spectrum.

• `"magnitude"` –– The spectral entropy is calculated for the one-sided magnitude spectrum.

Data Types: `char` | `string`

Since R2024b

Instantaneous time series option, specified as a logical.

• If `Instantaneous` is `true`, then `spectralEntropy` returns the instantaneous spectral entropy as a time-series vector.

• If `Instantaneous` is `false`, then `spectralEntropy` returns the spectral entropy value of the whole signal or spectrum as a scalar.

This argument applies whether `x` is a time-domain or a frequency-domain signal.

Data Types: `logical`

Since R2024b

Scale by white noise option, specified as a logical. Scaling by white noise, that is, dividing by log2 n, where n is the number of frequency points, is equivalent to normalizing the spectral entropy. Scaling allows you to directly compare signals of different lengths.

• If `Scaled` is `true`, then `spectralEntropy` returns the spectral entropy scaled by the spectral entropy of the corresponding white noise.

• If `Scaled` is `false`, then `spectralEntropy` does not scale the spectral entropy.

This argument applies whether `x` is a time-domain or a frequency-domain signal.

Data Types: `logical`
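As an illustrative sketch of what the scaling does (not from the original examples), the scaled spectral entropy of white noise should come out close to 1, because white noise has a nearly flat spectrum:

```
% Sketch: white noise has a nearly flat spectrum, so its scaled
% (normalized) spectral entropy is close to 1.
fs = 1000;
x = randn(10*fs,1);   % 10 s of white Gaussian noise
entropy = spectralEntropy(x,fs,Instantaneous=false,Scaled=true)
```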

Output Arguments

Spectral entropy, returned as a scalar, vector, matrix, or timetable. Each row of `entropy` corresponds to the spectral entropy of a window of `x`. Each column of `entropy` corresponds to an independent channel.

Spectral Entropy

The spectral entropy (SE) of a signal is a measure of its spectral power distribution. The concept is based on the Shannon entropy, or information entropy, in information theory. The SE treats the signal's normalized power distribution in the frequency domain as a probability distribution and calculates its Shannon entropy. The Shannon entropy in this context is the spectral entropy of the signal. This property can be useful for feature extraction in fault detection and diagnosis [2], [1]. SE is also widely used as a feature in speech recognition [3] and biomedical signal processing [4].

The equations for spectral entropy arise from the equations for the power spectrum and probability distribution for a signal. For a signal x(n), the power spectrum is S(m) = |X(m)|², where X(m) is the discrete Fourier transform of x(n). The probability distribution P(m) is then:

`$P\left(m\right)=\frac{S\left(m\right)}{{\sum }_{i}S\left(i\right)}\text{.}$`

The spectral entropy H follows as:

`$H=-\sum _{m=1}^{N}P\left(m\right){\mathrm{log}}_{2}P\left(m\right)\text{.}$`

Normalizing:

`${H}_{n}=-\frac{\sum _{m=1}^{N}P\left(m\right){\mathrm{log}}_{2}P\left(m\right)}{{\mathrm{log}}_{2}N}\text{,}$`

where N is the total number of frequency points. The denominator, log2 N, represents the maximal spectral entropy, which is attained by white noise because its power is uniformly distributed over frequency.
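As a minimal sketch (not the toolbox implementation), the whole-signal entropy H and its normalized form can be computed directly from these formulas using a one-sided power spectrum:

```
% Sketch only: H and normalized Hn from the formulas above.
fs = 1000;
t = (0:1/fs:10)';
x = chirp(t,300,10,400) + randn(length(t),1);

X = fft(x);
S = abs(X(1:floor(end/2)+1)).^2;   % one-sided power spectrum S(m)
P = S/sum(S);                      % probability distribution P(m)
P(P==0) = eps;                     % guard against log2(0)
H  = -sum(P.*log2(P));             % spectral entropy H
Hn = H/log2(numel(P));             % normalized by log2(N), the
                                   % white-noise (maximal) entropy
```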

If a time-frequency power spectrogram S(t,f) is known, then the probability distribution becomes:

`$P\left(m\right)=\frac{{\sum }_{t}S\left(t,m\right)}{{\sum }_{f}{\sum }_{t}S\left(t,f\right)}.$`

Spectral entropy is still:

`$H=-\sum _{m=1}^{N}P\left(m\right){\mathrm{log}}_{2}P\left(m\right)\text{.}$`

To compute the instantaneous spectral entropy given a time-frequency power spectrogram S(t,f), the probability distribution at time t is:

`$P\left(t,m\right)=\frac{S\left(t,m\right)}{{\sum }_{f}S\left(t,f\right)}.$`

Then the spectral entropy at time t is:

`$H\left(t\right)=-\sum _{m=1}^{N}P\left(t,m\right){\mathrm{log}}_{2}P\left(t,m\right).$`
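The per-frame computation above can be sketched column by column from a power spectrogram; here `S` is assumed to be an L-by-M power spectrogram such as `abs(stft(...)).^2` from the earlier example:

```
% Sketch: instantaneous spectral entropy, one value per time frame.
P  = S./sum(S,1);        % P(t,m): normalize each column to sum to 1
P(P==0) = eps;           % avoid log2(0)
Ht = -sum(P.*log2(P),1); % 1-by-M vector of entropies H(t)
Hn = Ht/log2(size(S,1)); % optional scaling by white-noise entropy
```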

Algorithms

The spectral entropy is calculated as described in [5]:

`$\text{entropy}=\frac{-\sum _{k={b}_{1}}^{{b}_{2}}{s}_{k}\mathrm{log}\left({s}_{k}\right)}{\mathrm{log}\left({b}_{2}-{b}_{1}\right)}$`

where

• s_k is the spectral value at bin k.

• b_1 and b_2 are the band edges, in bins, over which to calculate the spectral entropy.
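A direct sketch of this band-limited formula, assuming a spectrum vector `s` and band edges `b1` and `b2` in bins are already available (note the natural logarithm, matching the equation above):

```
% Sketch: band-limited, normalized spectral entropy per the formula.
sk = s(b1:b2);
sk = sk/sum(sk);                        % normalize the band
sk(sk==0) = eps;                        % avoid log(0)
entropy = -sum(sk.*log(sk))/log(b2-b1); % normalized band entropy
```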

References

[1] Pan, Y. N., J. Chen, and X. L. Li. "Spectral Entropy: A Complementary Index for Rolling Element Bearing Performance Degradation Assessment." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science. Vol. 223, Issue 5, 2009, pp. 1223–1231.

[2] Sharma, V., and A. Parey. "A Review of Gear Fault Diagnosis Using Various Condition Indicators." Procedia Engineering. Vol. 144, 2016, pp. 253–263.

[3] Shen, J., J. Hung, and L. Lee. "Robust Entropy-Based Endpoint Detection for Speech Recognition in Noisy Environments." ICSLP. Vol. 98, November 1998.

[4] Vakkuri, A., A. Yli‐Hankala, P. Talja, S. Mustola, H. Tolvanen‐Laakso, T. Sampson, and H. Viertiö‐Oja. "Time‐Frequency Balanced Spectral Entropy as a Measure of Anesthetic Drug Effect in Central Nervous System during Sevoflurane, Propofol, and Thiopental Anesthesia." Acta Anaesthesiologica Scandinavica. Vol. 48, Number 2, 2004, pp. 145–153.

[5] Misra, H., S. Ikbal, H. Bourlard, and H. Hermansky. "Spectral Entropy Based Feature for Robust ASR." 2004 IEEE International Conference on Acoustics, Speech, and Signal Processing.

Version History

Introduced in R2019a
