“My aunt Léonie,” writes Marcel Proust in In Search of Lost Time, “wished to see invented a machine that would enable the doctor to undergo all the sufferings of his patient in order to understand better.” How, I wonder, would medicine be altered by aunt Léonie’s machine?
It’s easy to understand why aunt Léonie wanted such a machine. After her husband died, aunt Léonie took to her bed and became a full-time hypochondriac, listing her ailments to anybody who would listen. Proust reports overhearing her whispering to herself: “I must not forget that I never slept a wink.” Her long-suffering but amply compensated doctor was perhaps insufficiently sympathetic: her machine would make him understand just how much she suffered, although (and this no doubt never occurred to her) it could have achieved the opposite.
But I think that the main effect of her machine would be to temper the enthusiasm of doctors for uncomfortable diagnostic tests and overambitious treatments.
A friend of mine recently saw her GP with problems swallowing. As this is a “red flag” symptom she was instantly referred—not for a gastroenterological consultation, but for a gastroscopy. The gastroscopist asked if she wanted Valium or simply an anaesthetic spray. She didn’t know, but as it was “a minor, routine” gastroscopy the gastroscopist advised just the spray. It was one of the worst experiences of her life, and another friend who had had a gastroscopy done both ways said, “You should always get the Valium. It makes it less awful.”
If connected to aunt Léonie’s machine the GP might not have referred her directly for a gastroscopy, but rather for a consultation with the results of blood tests and a barium swallow, which my friend had later and found fine. And the machine might have made the gastroscopist advise the Valium or even suggest that the gastroscopy be delayed until more results were available and more questions had been asked.
Oncologists plugged into the machine might also be less adventurous with their chemotherapy for patients with little or no chance of benefiting from the treatment. Following my favourite maxim that “Good surgeons know how to operate, better surgeons when to operate, and the best when not to operate,” aunt Léonie’s machine might hasten the passage of good surgeons to the best. For, as the Russian proverb says, “The wolf cannot speak of the fear of the sheep.”
But the machine could also help with diagnosis. When patients are struggling to describe their symptoms or the nature of their pain, doctors might switch on the machine and for a short while experience the symptoms themselves.
The machine could also help doctors determine the threshold for pain and other kinds of suffering of patients. Once they knew that a patient had a particularly low threshold they could be especially careful with their procedures and investigations.
Presumably aunt Léonie would insist that her machine could relay every kind of suffering, including deep depression and psychosis. Might there be a danger that doctors would kill themselves when suddenly afflicted with deep depression or be unable to “bring themselves back” from an acute psychotic episode?
This leads us to the great problem with the machine: why would anybody want to be a doctor if every day they had to experience directly the suffering of every one of their patients? Life would be unbearable, and I hope that aunt Léonie would allow her machine to be used only occasionally. But who would decide when it was to be used? For aunt Léonie it surely had to be the patient, but perhaps this is a case for the fashionable “shared decision-making.”
Aunt Léonie’s machine has probably until now been considered a philosophical machine, a device for a thought experiment. But a combination of artificial intelligence and virtual reality may now mean that such a machine is possible. Perhaps soon it will be a fundamental component of medical education and practice.
Richard Smith was the editor of The BMJ until 2004.
Competing interests: None declared.
Patient consent obtained.