Entropy in information theory
The average amount of information carried per individual message is known as entropy. It is a very important topic in information theory and coding.
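As a quick illustration (a sketch, not part of the original post), Shannon's entropy formula H(X) = -Σ pᵢ log₂(pᵢ) can be computed for any discrete probability distribution like this:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per message.

    Terms with p == 0 are skipped, since lim p->0 of p*log2(p) is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss.
print(entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so it carries less information.
print(entropy([0.9, 0.1]))   # less than 1 bit
```

Entropy is maximized when all messages are equally likely, and drops to zero for a source that always emits the same message.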
Check the following video for a detailed explanation:
Note:
If you want the actual derivation, check this: http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf (page 7, equation 1.20)