ABSTRACT

The memory feat of reciting the order of a shuffled deck of 52 cards is worth about 226 bits.
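This figure follows from a one-line calculation, sketched here under the assumption that all 52! orderings of the deck are equally likely, so the entropy is the base-2 logarithm of the number of orderings:

\[
\log_2(52!) \;\approx\; 225.58 \ \text{bits},
\]

that is, roughly 226 bits are needed to specify any particular ordering.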

The subject of information theory, of which entropy is the central concept, was born in 1948 when Claude E. Shannon (1916-2001) published his landmark paper [Sha48] on information sources and channels. The field has since borne many beautiful results, both pure and applied. Information theory also provides the practical motivation for error-correcting codes.