By Charlotte Blease.
Every morning you feel like you’re in a dinghy in the middle of the sea. What is causing these horrendous dizzy spells? You turn to the ever-obliging Dr Google, which offers a variety of possible causes. After refining your search, you suspect it could be an ear infection rather than something more ominous, like an impending stroke.
You’re going through one of life’s slumps and are seriously depressed, even suicidal. You want some discreet advice about medications. Again, you turn online to seek health information.
Or you spot a lump in your breast, and don’t know whether it’s cancer. While awaiting a medical check-up, you feverishly search for clarification online – after all, it’s quicker, and might help you know what to ask in the appointment.
Hunting for health information is second only to porn searches on the internet.
Yet Dr Google is not exactly an avuncular medic with our best interests at heart. Nor are internet search engines priestly confessionals, duty-bound to keep our secrets sacred. Rather, the business model behind these web giants is the buying and selling of our data, gleaned from what we ask of the internet and how we use it. In most countries, the law does not protect us from the most insidious forms of this monetization model, known as “surveillance capitalism”. This makes tech giants the world’s biggest beneficiaries of our sensitive clinical confessions. Internet companies actively sell our sensitive data to advertisers who know how to stalk us online. They sell tip-offs about the seriousness of our ailments to third-party companies, which can affect health insurance coverage, employment discrimination, and other high-stakes decisions.
If searching online for information about our symptoms is risky, then in the era of online record access, where we obtain rapid internet access to our medical documentation including test results, the perils are greater still.
In this paper, I discuss why patients are now at greater privacy risk from ‘cutting and pasting’ highly detailed information into search engines. Nor do the risks stop with Google. A more sophisticated generation of internet tools, conversational chatbots such as ChatGPT, may be even slicker at extracting our medical intel. Privacy risks are not a strong reason to deny online record access. However, health visits do not happen in an internet vacuum. They arise in an internet-embedded ecosystem where people will seek medical clarifications and further advice beyond what their doctor communicates in the visit. In the era of “open notes” we need to be more transparent about this risk. Patients, the medical community, and civic bodies need to talk about what happens to our sensitive clinical data when we inevitably disclose it online.
Paper title: Open AI meets open notes: surveillance capitalism, patient privacy and online record access
Author: Charlotte Blease
Affiliations: Participatory eHealth and Health Data Research Group, Department of Women’s and Children’s Health, Uppsala University, Uppsala, Sweden and Digital Psychiatry, Department of Psychiatry, Beth Israel Deaconess Medical Center, Boston, USA.
Competing interests: None declared
Social media: @crblease