ABSTRACT

A cipher's encryption operation can be written as

C = E(P, K), (12.1)

where P, E(), K, and C are the plaintext, the cipher's encryption algorithm, the key, and the ciphertext, respectively.
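To make the notation concrete, here is a minimal sketch in Python; the single-byte XOR cipher standing in for E is a hypothetical illustration, not a cipher discussed in this chapter:

```python
def encrypt(plaintext: bytes, key: int) -> bytes:
    """A toy E(P, K): XOR every plaintext byte with a one-byte key."""
    return bytes(b ^ key for b in plaintext)

P = b"HELLO"
K = 0x5A
C = encrypt(P, K)          # C = E(P, K), as in (12.1)
print(C.hex())             # 121f161615
assert encrypt(C, K) == P  # XOR is an involution, so the same routine decrypts
```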

In Shannon’s theory [1], the cipher’s plaintext P and ciphertext C are regarded as the communication channel’s input and output, respectively. Then, the entropy of the input is defined as

H(P) = -\sum_{i=0}^{n-1} p_i \log p_i,

where p_i (i = 0, 1, …, n - 1) is P's probability distribution. Entropy is a notion of self-information, that is, the information a random process provides about itself.
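As a minimal sketch of this definition (Python; the base-2 logarithm is an assumption, giving entropy in bits):

```python
import math

def entropy(probs):
    """H(P) = -sum_i p_i * log2(p_i); log base 2 measures entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 4-symbol source: every symbol is equally surprising, H = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A biased source is more predictable, so it carries less self-information.
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357
```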

Conditional entropy is defined as

H(P \mid C) = -\sum_{i=0}^{n-1} p_{P|C,i} \log p_{P|C,i},

where p_{P|C,i} denotes the conditional probability of P given C. This quantity can be used to measure the noise that changes P into C.
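A minimal sketch, assuming the conditional entropy is averaged over the values of C in the usual way; the joint tables are made-up illustrations:

```python
import math

def conditional_entropy(joint):
    """H(P|C) from a joint table joint[p][c] = Pr(P=p, C=c)."""
    n_p, n_c = len(joint), len(joint[0])
    h = 0.0
    for c in range(n_c):
        pr_c = sum(joint[p][c] for p in range(n_p))  # marginal Pr(C=c)
        if pr_c == 0:
            continue
        for p in range(n_p):
            pr_p_given_c = joint[p][c] / pr_c        # Pr(P=p | C=c)
            if pr_p_given_c > 0:
                h -= pr_c * pr_p_given_c * math.log2(pr_p_given_c)
    return h

# P independent of C (the ideal for a cipher): C reveals nothing, H(P|C) = H(P).
print(conditional_entropy([[0.25, 0.25], [0.25, 0.25]]))  # 1.0
# C determines P completely: no residual uncertainty, H(P|C) = 0.
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))      # 0.0
```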

Average mutual information is defined as

I(P, C) = H(P) - H(P \mid C) = H(C) - H(C \mid P). (12.2)

Mutual information measures the information that one process contains about another process, and it is closely related to the correlation between the two variables. Equivalently, it can be defined as

I(P, C) = H(P) + H(C) - H(P, C), (12.3)

where H(P, C) is the joint entropy of P and C.
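To see that (12.2) and (12.3) agree, a small numeric check (Python; the joint distribution is a made-up example):

```python
import math

def H(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical joint distribution: joint[p][c] = Pr(P=p, C=c).
joint = [[0.3, 0.1],
         [0.1, 0.5]]
p_P = [sum(row) for row in joint]                              # marginal of P
p_C = [sum(joint[p][c] for p in range(2)) for c in range(2)]   # marginal of C
h_PC = H([joint[p][c] for p in range(2) for c in range(2)])    # joint entropy H(P,C)

h_P_given_C = h_PC - H(p_C)     # chain rule: H(P|C) = H(P,C) - H(C)
print(H(p_P) - h_P_given_C)     # I(P,C) via (12.2)
print(H(p_P) + H(p_C) - h_PC)   # I(P,C) via (12.3); same value, ~0.2564
```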