## ABSTRACT

In this chapter, statistical mechanics is developed on the basis of classical thermodynamics. This non-traditional derivation starts from the recognition that the random nature of heat does not allow defining the dynamic state of a thermodynamic system (of $N$ molecules) in terms of coordinates, $x^N(t)$, and momenta, $p^N(t)$, at time $t$. Furthermore, the phase space defined by $[x^N(t), p^N(t)]$ provides more information than can be obtained experimentally. Thus, measuring a macroscopic thermodynamic parameter, such as the temperature, is an averaging process, where the thermometer feels the (time) average of the kinetic energy of a tremendous number of microscopic states $(x^N, p^N)$. Therefore, it is plausible to look for an (initially unknown) probability density, $P(x^N, p^N)$, with which thermodynamic quantities become statistical averages over phase space. In particular, the Helmholtz free energy, $A$, becomes a statistical average of the interaction energy, $E(x^N, p^N)$, and an as-yet-undefined entropy function, $\hat{S}(x^N, p^N)$. Since we know from thermodynamics that the entropy is positive, extensive, and depends on $\ln P$ (see ideal gas), we assume $\hat{S} \sim -C \ln[P(x^N, p^N)\, a^N b^N]$, where $a$, $b$, and $C$ are unknown parameters. The Boltzmann probability density, $P^{\mathrm{B}}$, is obtained by minimizing $A(P)$ with respect to $P$. The uncertainty principle requires $a = h^3$ ($h$ is Planck's constant); to keep $S$ extensive for indistinguishable particles, one should introduce $N! \sim (N/e)^N$, i.e., $b = N/e$; finally, compatibility with experiment gives $C = k_B$, Boltzmann's constant.
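The minimization mentioned above can be sketched as follows. This is a schematic outline using the symbols defined in the abstract; the constrained variation via a Lagrange multiplier $\lambda$ (enforcing normalization of $P$) is an assumed standard step, not a reproduction of the chapter's full derivation:

```latex
\begin{aligned}
A(P) &= \int P\,\bigl[\,E(x^N\!,p^N) + C\,T \ln\!\bigl(P\,a^N b^N\bigr)\bigr]\, dx^N\, dp^N ,\\
0 &= \frac{\delta}{\delta P}\Bigl[A(P) - \lambda\!\int\! P\Bigr]
   = E + C\,T\bigl[\ln\!\bigl(P\,a^N b^N\bigr) + 1\bigr] - \lambda ,\\
P^{\mathrm B}(x^N\!,p^N) &\propto \frac{1}{a^N b^N}\, e^{-E(x^N,p^N)/(C\,T)} .
\end{aligned}
```

With the identifications $a = h^3$, $b = N/e$ (so that $a^N b^N \approx h^{3N} N!$), and $C = k_B$, normalizing $P^{\mathrm B}$ recovers the canonical Boltzmann form $P^{\mathrm B} \propto e^{-E/k_B T}$.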
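The Stirling-type approximation $N! \sim (N/e)^N$ invoked above can be checked numerically. The following minimal sketch (not from the chapter) compares $\ln N!$ with the leading term $N \ln N - N$ and shows the relative error shrinking as $N$ grows, since the neglected corrections are only $O(\ln N)$:

```python
import math

# Compare ln N! with the leading Stirling term N ln N - N,
# i.e., ln[(N/e)^N], used to keep the entropy extensive for
# indistinguishable particles (b = N/e in the abstract).
for N in (10, 100, 1000, 10000):
    exact = math.lgamma(N + 1)       # ln N! (via the log-gamma function)
    stirling = N * math.log(N) - N   # ln[(N/e)^N]
    rel_err = (exact - stirling) / exact
    print(f"N={N:6d}  ln N! = {exact:11.2f}  "
          f"N ln N - N = {stirling:11.2f}  rel. err = {rel_err:.5f}")
```

For $N$ of thermodynamic magnitude the relative error is entirely negligible, which is why the replacement $b = N/e$ suffices.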