ABSTRACT

No doubt there were nineteenth-century manifestations of the idea that a person – or, more frequently, a population – hung precariously between health and illness (such as the attempts under the Contagious Diseases Acts to control the health of prostitutes near military establishments), but it was the child in the twentieth century that became the first target of the full deployment of the concept. The significance of the child was that it underwent growth and development: there was therefore a constant threat that the proper stages might not be negotiated, and this in turn justified close medical observation. The establishment and wide provision of antenatal care, birth notification, baby clinics, milk depots, infant welfare clinics, day nurseries, health visiting and nursery schools ensured that the early years of child development could be closely monitored (Armstrong 1983). The School Medical (later Health) Service, for example, provided not only a traditional ‘treatment’ clinic but also an ‘inspection’ clinic that screened all school children at varying times for both incipient and manifest disease, and enabled visits to children’s homes by the school nurse to report on conditions and monitor progress (HMSO 1975).