This chapter expands on this discussion more formally through the important notion of entropy. If an event A is quite typical, one may pay little attention to it; by contrast, the occurrence of an unusual event merits our attention and conveys more information precisely because of its rarity, i.e., information is inversely related to probability. The word entropy is used in statistical physics to denote the tendency of ordered systems to decay and break down into complete chaos; one may think of any homogenizing process, such as digestion, the action of a centrifuge, or the mixing of diverse gases. The chapter shows that large entropy is associated with highly uncertain experiments, in which prediction is most difficult; such situations are intuitively linked to the notion of chaos, the opposite of highly ordered and predictable states of nature.
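The two ideas above, that information is inversely related to probability and that entropy measures the uncertainty of an experiment, can be illustrated with a minimal numerical sketch. The definitions used here are the standard ones from information theory (self-information measured in bits, and entropy as its expected value); the function names are chosen for illustration only.

```python
import math

def self_information(p):
    # Self-information of an event with probability p, in bits:
    # rare events (small p) carry more information than typical ones.
    return math.log2(1 / p)

def entropy(probs):
    # Shannon entropy of a distribution: the expected self-information,
    # largest when the outcomes are hardest to predict.
    return sum(p * self_information(p) for p in probs if p > 0)

# A certain event conveys no information; a rare one conveys a lot.
print(self_information(1.0))   # 0.0 bits
print(self_information(1/8))   # 3.0 bits

# A fair coin (maximal uncertainty) has higher entropy than a biased one,
# matching the intuition that large entropy means difficult prediction.
print(entropy([0.5, 0.5]))     # 1.0 bit
print(entropy([0.9, 0.1]))     # about 0.469 bits
```

Note how the fair coin attains the largest entropy among two-outcome experiments: every outcome is equally plausible, so prediction is as hard as it can be.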