Truths, Stories, Fictions, and Lies
Truth telling, in medicine, is a peculiar cultural practice. In the United States, doctors seem to endorse it more than their counterparts in most other places, and we endorse it more strongly today than ever before. It appears to play a central role in our conception of the proper relationship between patients and doctors. Any deviation from a standard of total honesty is seen as a doctor’s way of preserving paternalistic power over patients. Truth telling is thus seen as necessary for patient empowerment, and patient empowerment is seen as an essentially good thing. We strive to demystify illness, and we imagine that truth is the first step toward clarity without supernaturalism.