ABSTRACT
Definition: Entropy (H) is defined as the average amount of information per symbol:

H = \sum_{m=1}^{n} P(m) \log \frac{1}{P(m)}    (1.4.3)

where n is the number of symbols in the alphabet and H is measured in bits/symbol.
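The definition in (1.4.3) can be sketched directly in code. The function below is a minimal illustration (not from the source text): it sums P(m) · log2(1/P(m)) over a probability distribution, using base-2 logarithms so that H comes out in bits per symbol; zero-probability symbols are skipped, since p · log(1/p) tends to 0 as p → 0.

```python
import math

def entropy(probs):
    """Shannon entropy H = sum over m of P(m) * log2(1/P(m)), in bits/symbol.

    Symbols with P(m) == 0 contribute nothing, since p*log(1/p) -> 0 as p -> 0.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin (two equiprobable symbols) carries 1 bit per symbol.
print(entropy([0.5, 0.5]))

# A uniform 4-symbol alphabet carries log2(4) = 2 bits per symbol.
print(entropy([0.25, 0.25, 0.25, 0.25]))
```

As a sanity check, a uniform distribution over n symbols always gives H = log2(n), the maximum possible entropy for an alphabet of that size.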