ABSTRACT

The development of a means to accurately and safely visualize the entire colon endoscopically has revolutionized the diagnosis and management of colonic diseases and, indeed, the clinical practice of gastroenterologists and colorectal surgeons alike. Although instruments to examine the anus and rectum had been available since antiquity, it was not until the advent of flexible fiberoptic technology in the late 1950s that total colonoscopy became more than a dream (1). Adapting fiberoptics to the colon proved more difficult than for esophagogastroduodenoscopy because of the tortuous colonic lumen. Nevertheless, the first commercially available fiberoptic colonoscope, developed by Overholt, appeared in the late 1960s (2). Early acceptance of colonoscopy into clinical practice was slow because of limited tip deflection and field of view, unfamiliarity with the technique, and reluctance to use sedation and analgesia to reduce patient discomfort. Subsequent advances in instrument development and colonoscopy technique in the 1970s, together with the demonstration of safe colonoscopic biopsy and snare polypectomy (3), led to more widespread application of colonoscopy by clinicians. Further advances in colonoscope design, techniques for instrument manipulation, and the incorporation of other diagnostic and therapeutic applications, along with the development of more effective and tolerable means of bowel preparation and patient sedation, have made colonoscopy the primary method of imaging the colon. Indeed, with recent guidelines and recommendations endorsing colonoscopy as a screening option for colorectal cancer (4-6), the demand for colonoscopy is likely to increase substantially. Although the feasibility of this strategy remains uncertain given concerns about high demand, an inadequate number of colonoscopists, and limited infrastructure (7), it underscores the importance of fully trained, competent colonoscopists.