ABSTRACT

The "cholesterol hypothesis" holds that increasing concentrations of serum cholesterol raise the risk for coronary heart disease (CHD), whereas decreasing serum cholesterol levels reduces the risk (Fig. I). This hypothesis is rooted in studies in experimental animals that were carried out early in the twentieth century. In these studies, feeding large amounts of cholesterol to animals led to marked hypercholesterolemia and to cholesterol accumulation within the arterial wall. This accumulation resembled the first stages of human atherosclerosis. Later, premature atherosclerotic disease was observed in people who had very high concentrations of serum cholesterol. These congruent findings created increasing interest in the cholesterol hypothesis and laid the foundation for a great expansion of research in the second half of the twentieth century. Between 1950 and 1975, newly acquired data pertained mainly to the ascending arm of the cholesterol hypothesis, i.e., increasing cholesterol levels impose greater risk (Fig. IA); the last quarter of the century has witnessed a growing body of data supporting a reversal of risk by the lowering of serum cholesterol (Fig. IB).