Shannon entropy (information theory)

Saravanan Mani on 3 Jul 2019
Commented: Akira Agata on 4 Jul 2019
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with an error probability of 2%. Could someone explain how I can calculate the Shannon entropy?
1 comment
Akira Agata on 4 Jul 2019
Do you mean 'Channel capacity' based on the Shannon-Hartley theorem assuming 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy; it is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
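Since the entropy depends only on the symbol probabilities, a minimal MATLAB sketch could look like the one below (the variable names and the short example sequence are illustrative, not from the question; the last three lines also compute the capacity of a binary symmetric channel with the 2% error probability mentioned above):

% Minimal sketch: Shannon entropy of the transmitted bits, plus the
% binary symmetric channel capacity for a 2% bit-error rate.
x = [1 0 0 0 1 1 0 0 1 0];    % transmitted binary sequence (illustrative)

p1 = mean(x == 1);            % empirical probability of a '1'
p0 = 1 - p1;                  % empirical probability of a '0'

% Shannon (binary) entropy of the source, in bits per symbol
H = -p0*log2(p0) - p1*log2(p1);

% Capacity of a binary symmetric channel with 2% crossover probability
pe = 0.02;
Hb = -pe*log2(pe) - (1-pe)*log2(1-pe);
C  = 1 - Hb;                  % bits per channel use

For this 10-bit example, p1 = 0.4, so H is roughly 0.97 bits per symbol, and C is roughly 0.86 bits per channel use. If either symbol never appears in the sequence, the corresponding 0*log2(0) term should be treated as 0 to avoid a NaN.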

Answers (0)
