Shannon entropy (information theory)
I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with a bit-error probability of 2%. Could someone explain to me how I can calculate the Shannon entropy?
1 comment
Akira Agata
on 4 Jul 2019
Do you mean 'Channel capacity' based on the Shannon-Hartley theorem assuming 2% BER?
You don't need to use the received binary sequence Y to calculate the Shannon entropy, which can be determined from the probabilities of '0' and '1' in the transmitted binary sequence.
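A minimal MATLAB sketch of what the comment describes, assuming the 2% figure is interpreted as the crossover probability of a binary symmetric channel (the example bit vector is taken from the question; variable names are illustrative):
% Entropy of the transmitted sequence and BSC capacity (sketch, not from the thread)
x = [1 0 0 0 1 1 0 0 1 0];        % example transmitted bits from the question

p1 = mean(x);                     % empirical probability of '1'
p0 = 1 - p1;                      % empirical probability of '0'

% Shannon entropy of the source: H(X) = -p0*log2(p0) - p1*log2(p1)
p = [p0 p1];
p = p(p > 0);                     % avoid log2(0)
H = -sum(p .* log2(p));           % bits per transmitted symbol

% Capacity of a binary symmetric channel with 2% bit-error probability:
% C = 1 - Hb(e), where Hb is the binary entropy function
e  = 0.02;
Hb = -e*log2(e) - (1-e)*log2(1-e);
C  = 1 - Hb;                      % bits per channel use

fprintf('H(X) = %.4f bits, C(BSC, 2%% BER) = %.4f bits\n', H, C);
For an equiprobable source (p0 = p1 = 0.5) this gives H(X) = 1 bit per symbol, and the capacity term shows how the 2% error rate reduces the usable information per channel use.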
Answers (0)