ABSTRACT

Information and communication technology (ICT) has shaped the second half of the twentieth century irreversibly and more profoundly than atomic energy or space exploration. We may well do without nuclear power stations or the space shuttle, but nobody can reasonably conceive of a future society in which there are no more computers, no matter what dangers may be inherent in such an ineluctable evolution of our habitat. Evidence of a digital destiny is everywhere. In 1971, Intel launched the world’s first commercial microprocessor, the 4004 (Intel is to hardware as Microsoft is to software: its chips account for 80 per cent of worldwide sales of general-purpose microprocessors), and some twenty-five years later, microprocessors are hidden in virtually every technological device supporting our social activities, financial operations, administrative tasks or scientific research. At the beginning of 1998, experts estimated that there were more than 15 billion chips operating all over the world. In advanced societies, computer chips have become as widely diffused as the engines they control, and ICT is today a highly pervasive infra-technology that will eventually cease to be noticed because it has become thoroughly trivial.1