ABSTRACT

Sequences arise naturally when we want to approximate quantities. For instance, when we wish to use the decimal expansion of the rational number 1/3, we get the sequence 0.3, 0.33, 0.333, .... We also understand that each term is approximately equal to 1/3 up to a certain level of accuracy. What do we mean by this? If we want the difference between 1/3 and the approximation to be less than, say, 10^{-3}, we may take any one of the decimal numbers 0.333, 0.3333, 0.33333, ..., since each of these differs from 1/3 by at most 1/3000 < 10^{-3}.
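To make the accuracy claim precise, here is a short worked bound (a sketch; the symbol a_n for the truncation with n threes is notation introduced here, not taken from the original text):

% a_n denotes the decimal truncation of 1/3 with n threes,
% so a_n = (10^n - 1)/(3*10^n); its distance from 1/3 is
\[
  \frac{1}{3} - a_n
  \;=\; \frac{1}{3} - \frac{10^{n}-1}{3\cdot 10^{n}}
  \;=\; \frac{1}{3\cdot 10^{n}},
\]
% which is below the tolerance 10^{-3} exactly when 3*10^n > 10^3,
% i.e. for every n >= 3.

The same computation works for any tolerance 10^{-k}: every term from the k-th onwards is within 10^{-k} of 1/3.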