“Acupuncture boosts libido,” blared the headline of a leading Indian daily. “Acupuncture effective treatment for breast cancer,” proclaimed another. Both were, in effect, reporting a study that examined whether acupuncture is any better than drugs at treating hot flashes in breast cancer patients on anti-estrogen therapy, and found no difference. The study enrolled 50 American women, nearly half of whom had dropped out by 12 months.
Reading the morning paper, however, one may be misled into believing the unexplored promises of acupuncture. To be fair, the full news report may well have explained the study in more detail. Even so, such headlines are not an acceptable standard for reporting science. What gives rise to them? One reason may be a single-minded objective of grabbing readers’ attention through sensationalist reporting. Another contributing factor is that the wrong message is perpetuated all the way from the abstract of the original paper to the press release and onwards. Termed “spin” in reporting parlance, this tendency to emphasise the positives of an experimental treatment is unfortunately common, as a study in PLOS Medicine found.
Budding health reporters from India gathered at the 3rd National Health Editors and Journalists Convention and discussed the example above. They found that the abstract reported “the additional benefit of increased sex drive in some women.” Remarkably, this is not stated among the intended outcome measures, or even in the results, of the full paper. It is mentioned in passing in the discussion section, and even then without actual numbers. The paper reports only that “approximately a quarter of women reported an increased sex drive,” which amounts to five or six women in all. The effect on sex drive nevertheless made it into the title of the press release, and onwards into daily headlines, presented to the public as fact even though the finding is under-studied, likely invalid, and largely inapplicable to the average reader.
To avoid creating false perceptions and avert potential harm, it is essential that the reporting of science adequately identifies limitations pertaining to study design, methods, and findings. A cheat sheet, if you will, developed by the Dartmouth Institute for Health Policy and Clinical Practice lists ways to do so, so that readers can suitably weigh the message delivered to them.
This also lays bare a crucial need to train reviewers, editors, and journalists, and communicators of science at large, in the accurate dissemination of research findings. Indeed, at a session at the 7th Peer Review Congress, only a handful of the 500-odd editors present reported any formal training prior to becoming editors. The scene was repeated at the Indian Association of Medical Journal Editors (IAMJE) meeting, where the majority of editors admitted to being appointed on the basis of seniority, or in an honorary capacity by professional societies, with little prior training.
On a panel at the IAMJE meeting, we discussed how students traditionally do not enrol in medical schools to become researchers and editors, and how little, if anything, is done along the way to foster this choice. An early intervention may be to encourage students to critically appraise research through dedicated forums such as journal club meetings. Another may be editorial fellowships, wherein students spend time working hands-on with a journal under the mentorship of a senior editor. For instance, the BMJ runs a Clegg scholarship programme in which students learn about medical journalism through an eight-week work placement. An extended form of this is the editorial registrar programme for doctors, who take a break from clinical practice to dive deep into journal processes. Quite a few international journals, such as JAMA, NEJM, AFP, and CMAJ, offer editorial fellowships. I am sure many Indian medical journals also involve students in various capacities to handle submissions and manage editorial processes. However, a formalised programme is lacking, and it may be up to journals, professional societies, and research funders to support such initiatives.
In doing so, the messaging will matter, as the medical fraternity tends to regard editorial and research jobs as non-clinical. On the contrary, doctors’ feedback on editorial fellowships has often been that such immersion helped them see how clinical evidence is generated and why it matters to medical practice, and indeed made them better clinicians, with an enhanced ability to evaluate and apply research findings in their practice decisions.
And while we build capacity for the effective communication of science, it may be pertinent to create informed readers as well, so that they can astutely evaluate the evidence when the media advises them to eat chocolate one day and abandon it the next!
Anita Jain is the India editor, BMJ.