ABSTRACT

When computers were first introduced, many of their users employed them only to calculate numerical values from formulae, in much the same way as they had earlier used slide rules and other analogue devices. This use of the digital machines as 'super slide rules' is nowadays called 'first-generation modelling'. To take the simplest possible example, a differential equation such as df/dt = -af with initial condition f(t = 0) = f_0 would be represented by its 'solution' in the reals, f = f_0 exp(-at), and this in turn expressed as a convergent polynomial expansion, since that alone was amenable to arithmetic computation, and even then only within some region of convergence. The process of solution was, so to speak, kept as long as possible in the continuum domain, only being translated into the first-order language of arithmetic 'at the very last moment'.
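As a concrete illustration of this 'super slide rule' style of computation (a minimal sketch, not taken from the paper; the function and parameter names are illustrative), the following Python fragment evaluates the closed-form solution f = f_0 exp(-at) by summing a truncated power series, i.e. using only the elementary arithmetic operations an early digital machine could perform:

```python
import math

def exp_series(x: float, terms: int = 30) -> float:
    """Approximate exp(x) by its convergent power series,
    sum over n of x**n / n!, accumulating each term from the
    previous one so that only multiplication, division and
    addition are ever used."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # next term: x**(n+1) / (n+1)!
    return total

# df/dt = -a*f with f(0) = f_0, evaluated at time t
# (a, f_0 and t are illustrative values)
a, f_0, t = 0.5, 2.0, 3.0
approx = f_0 * exp_series(-a * t)
exact = f_0 * math.exp(-a * t)
print(approx, exact)  # the truncated series closely matches the closed form
```

The continuum solution f = f_0 exp(-at) is carried symbolically until the final line, where it is translated into finitely many arithmetic operations; truncation and round-off then confine the usable range of the expansion, in the spirit of the 'region of convergence' caveat above.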