Proof that the Uniform Distribution has Maximum Entropy for Discrete Variables Subject Only to the Normalization Constraint
Given a discrete random variable X having N possible outcomes and governed by a probability mass function fX(xi), we wish to prove that the Uniform distribution

fX(xi) = 1/N  ∀ i ∈ {1, 2, · · · , N}   (7.1)

is the distribution that maximizes the discrete Shannon entropy

HX = − ∑_{i=1}^{N} fX(xi) log2(fX(xi))   (7.2)

subject only to the normalization constraint

∑_{i=1}^{N} fX(xi) = 1.   (7.3)
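Before turning to the proof, Eq. (7.2) is straightforward to evaluate numerically. The short Python sketch below (an illustration added here, not part of the original text; the helper name shannon_entropy is my own) computes HX for a pmf given as a list of probabilities, and shows that a uniform pmf over N = 4 outcomes yields log2(4) = 2 bits while a skewed pmf yields less:

```python
import math

def shannon_entropy(pmf):
    # Discrete Shannon entropy in bits, Eq. (7.2); 0*log2(0) is treated as 0.
    assert abs(sum(pmf) - 1.0) < 1e-9, "pmf must satisfy the normalization constraint (7.3)"
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Uniform over N = 4 outcomes: H = log2(4) = 2 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed pmf over the same 4 outcomes has strictly lower entropy
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```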
We use the method of Lagrange multipliers. First, form the functional

G(fi, λ) = HX − λ ( ∑_{i=1}^{N} fi − 1 ),   (7.4)

where we have used the simplified notation fi ≡ fX(xi), and then set all N partial derivatives equal to zero:

∂G/∂fi = −log2 fi − log2 e − λ = 0,   (7.5)

where the constant log2 e = fi · (1/(fi ln 2)) arises from differentiating fi log2 fi. Hence log2 fi = −(λ + log2 e) for all i, or

fi = 2^−(λ + log2 e).   (7.6)

Because the right-hand side does not depend on i, every fi equals the same constant. Substituting (7.6) into the normalization constraint (7.3) gives

∑_{i=1}^{N} fi = N · 2^−(λ + log2 e) = 1,   (7.7)

from which we find λ = log2 N − log2 e. Hence,

fi = 2^−(λ + log2 e) = 2^−log2 N = 1/N,   (7.8)

which is just the Uniform distribution.
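As a numerical sanity check of this result (a sketch added here, not part of the original derivation), one can compare the entropy of the uniform pmf against many randomly drawn pmfs that also satisfy the normalization constraint (7.3); none should exceed the uniform value log2 N:

```python
import math
import random

def shannon_entropy(pmf):
    # Discrete Shannon entropy in bits, Eq. (7.2)
    return -sum(p * math.log2(p) for p in pmf if p > 0)

N = 8
h_uniform = shannon_entropy([1.0 / N] * N)  # log2(8) = 3 bits

random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in range(N)]
    total = sum(w)
    pmf = [x / total for x in w]  # normalized, so (7.3) holds
    assert shannon_entropy(pmf) <= h_uniform + 1e-12

print("uniform entropy:", h_uniform)  # 3.0
```

This is only a spot check over random pmfs, not a proof; the proof above establishes the maximum for all admissible distributions.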