ABSTRACT
The theory that early life exposures might influence health later in life has been around
since the beginning of the last century (1). The concept is perhaps most widely known with
respect to early life nutrition and subsequent cardiovascular disease as adults, a hypothesis
put forth by Professor David Barker and colleagues (2-4) and generally referred to as
“the Barker hypothesis.” While the Barker hypothesis generally refers to nutritional
deficiencies in early life, the wider concept that environmental exposures in early life
affect the risk of adult disease, often referred to as “Critical Developmental Windows for
Exposure,” has also been pursued in relation to other health endpoints. In particular, there
is a long history within the field of mental health of examining the association between
in utero exposures and adult schizophrenia (5,6). Could early life exposures also affect
other aspects of neurologic functioning in adulthood? As the populations of many countries
are aging rapidly, these questions take on added importance.