ABSTRACT

By 1940 the concept of essential nutrients was well established; they were defined as chemical substances found in food that could not be synthesized by the body yet were required to perform functions necessary for life. In the 1960s and 1970s, the standard for essentiality was liberalized for mineral elements that could not be fed at dietary concentrations low enough to cause death or to interrupt the life cycle (that is, to interfere with growth, development, or maturation such that procreation is prevented). Thus, during this period an essential element was defined as one whose dietary deficiency consistently and adversely changed a biological function from optimal, with the change being preventable or reversible by physiological amounts of the element. This definition of essentiality became less acceptable when a large number of elements were suggested to be essential on the basis of small changes in physiological or biochemical variables. For many of these changes, it was questioned whether they were necessarily the result of a suboptimal function; some were suggested instead to be the consequence of a pharmacologic or toxic action in the body, including an effect on intestinal microorganisms. As a result, if the lack of an element cannot be shown to cause death or to interrupt the life cycle, many scientists, perhaps a majority, now do not consider the element essential unless it has a defined biochemical function. Other scientists, however, still base essentiality on the older criteria. Thus, no universally accepted list of essential trace elements exists. Nonetheless, it is hoped that most of the mineral elements that are essential, possibly essential, or beneficial have been included in this section.