Experts and evidence

Every now and then you read something and it chimes with you, illuminating a niggle you’d not known was there – and the flash of understanding is a delight. Really good qualitative research can do that, as can fiction, drama, an off-the-cuff comment, and – rarest of all – the output of a Working Group.

The GRADE gang have been, for over a decade, improving and refining our understanding of how guidelines should be created. Clinical practice guidelines are the most concrete expressions of evidence based medicine. They are often rather ugly, unsubtle, and colourless, but superb to run to in clinical storms. As with any other facet of EBM, their relevance has always relied on integrating the raw science with clinical expertise and the individual patient. Defining how far the ‘expert’ can and should influence a decision is tricky.

What the group have recently published on is differentiating the added value experts bring. They can bring their experience and expert knowledge, and also their expert opinions. We should be clear about the difference. Having seen over a dozen cycles of melphalan given (an unpleasant chemo used before stem cell transplant procedures), and so having more evidence than the world literature, I can bring experience of seeing vomiting and nausea in more than half of the kids despite loading with antiemetics. I also have an opinion that melphalan is a nasty, emetogenic drug, which we need to be on our guard for with rescue medicines. That opinion derives neatly from the evidence. I also have an opinion that Morris Dancing should not be undertaken by any human. I have never* Morris Danced. This is an evidence-free opinion. (It’s still correct though.)

When we’re doing guideline creation or EBM with patients, we should be transparent about what we are bringing, and about how the evidence and opinion are linked and aligned. Expert evidence is different from expert opinion. We need to be clear.

  • Archi
