We have been discussing the role of the humanities in medical education, and the need to account for what one of us calls ‘medical paranoia’. By this we mean the tendency of medical students (and practising doctors) to believe that they have developed serious illnesses, frequently making self-diagnoses based on vague suggestion rather than hard evidence. We feel that it is time to reflect on the significance, meaning and potential utility of this phenomenon.
Hypochondria among medical students is common, and the butt of jokes among those who talk and write about the experience of being a medical student. A middle-aged woman who had survived bowel cancer once said during a research interview that she no longer developed flu: it was always metastatic cancer. In the same way, medical students tend to develop acute leukaemia rather than viral sore throats, and cancer or HIV (depending on the term they happen to be doing) instead of natural fatigue. One of us (ML) convinced himself, during his medical course in the 1950s, that he suffered from tuberculosis, bone cancer, stomach ulcer, bronchiectasis, several kinds of leukaemia and lymphoma, hepatitis, thyrotoxicosis, hypothyroidism and Addison’s disease. There were many other conditions. All systems were affected at one time or another during the six years of study.
ML’s hypochondria was not unique. Almost every medical student we have both known has been convinced that he or she had developed something potentially fatal at some stage. What makes it worse is that the occasional person is right. Real tuberculosis, real depression, bowel cancer or inflammatory bowel disease among contemporaries tend to reinforce the need for fearful vigilance.
Does hypochondria serve any useful function? The experience can be very unpleasant. Vague symptoms preoccupy the waking hours, interfering with enjoyment of life, with relationships, with one’s optimism about the future, sometimes with one’s ability to sleep. Fortunately, most hypochondriacal illnesses among medical students seem to be self-limiting, and to run their courses in a few troublesome weeks.
Actual major illness changes one’s perceptions of the world, and often changes the sense of identity. It forces confrontations with mortality and the fragility of being human. It may make the sufferer lonely, because it’s impossible to communicate the nature of the experience. It also makes the doctor who has been ill more understanding of the anxieties and sufferings of others.
Clearly, medical education can’t enforce organic illness on its trainees. The current fashion for role playing may help to sensitize students to issues of communication and intersubjectivity. But role playing is no substitute for the deeper personal involvement of the hypochondriac. In the grip of the imagined disease, he or she reads the relevant literature with the greatest attention, seeking the clinical nuances that might confirm or rule out the current threat to life, welfare and identity. Such subtleties stay in the memory, and help the recovered valetudinarian to understand, question and advise patients and their families with real and imagined illness.
Hypochondria may not be a bad attribute to look for in medical students – not too much of it, mind you, because it can be inhibiting for the hypochondriac and deadly boring for his or her colleagues. It might be a little difficult to quantify at interview for entry to medical school, but we should remain open to its potential virtues. We should certainly not exclude candidates who examine their own potential to become ill, and imagine themselves, at least temporarily, into a world of virtual illness.
Miles Little MD, MS, FRACS and Claire Hooker, PhD
Centre for Values, Ethics and the Law in Medicine
University of Sydney
Corresponding author: Emeritus Professor Miles Little
Centre for Values, Ethics and the Law in Medicine
Building K25
University of Sydney
Sydney, NSW 2006
Australia
e-mail: milesl@ozemail.com.au
Telephone: +61 2 9036 3405