ABSTRACT

This chapter provides a brief introduction to one application of lattice basis reduction to number theory. We begin by reviewing the classical algorithm for computing the continued fraction expansion of a real number; the convergents of this expansion give good rational approximations to an irrational number. We then consider the more general problem of computing simultaneous rational approximations (all with the same denominator) to a finite set of real numbers; this is where the LLL algorithm can be applied.
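As a concrete illustration of the classical algorithm mentioned above, the following sketch (in Python, with hypothetical function names; it assumes a positive real input and works in floating point, so only the first several partial quotients are reliable) computes a continued fraction expansion and the resulting convergents via the standard recurrence:

```python
from fractions import Fraction

def continued_fraction(x, n_terms=10):
    """Classical algorithm: repeatedly split off the integer part,
    then invert the fractional remainder, collecting the partial
    quotients a_0, a_1, ... (assumes x > 0)."""
    quotients = []
    for _ in range(n_terms):
        a = int(x)            # integer part (floor, since x > 0)
        quotients.append(a)
        frac = x - a
        if frac == 0:         # x was rational; expansion terminates
            break
        x = 1 / frac
    return quotients

def convergents(quotients):
    """Build the convergents p_k/q_k from the partial quotients using
    the recurrence p_k = a_k p_{k-1} + p_{k-2}, and likewise for q_k."""
    p_prev, p = 1, quotients[0]
    q_prev, q = 0, 1
    result = [Fraction(p, q)]
    for a in quotients[1:]:
        p, p_prev = a * p + p_prev, p
        q, q_prev = a * q + q_prev, q
        result.append(Fraction(p, q))
    return result

# Example: sqrt(2) has expansion [1; 2, 2, 2, ...], with convergents
# 1, 3/2, 7/5, 17/12, ... approaching sqrt(2).
print(continued_fraction(2 ** 0.5, 5))
print(convergents([1, 2, 2, 2]))
```

Each convergent p/q satisfies |x - p/q| < 1/q², which is the sense in which these approximations are "good"; the simultaneous version of this problem for several reals is what LLL addresses later in the chapter.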