Paul Glasziou: From mummified evidence to living EBM—a few tools

On a tour of WHO headquarters in Geneva, I wandered past a vast cellar of shrink-wrapped, unused, and unread guidelines. It occurred to me that, given that around 7% of clinical “facts” become outdated each year, these guidelines were rapidly passing, or already past, their “use by” date [1]. While glossy journals, 500-page systematic reviews, and grand guidelines are all worthy, clinical impact occurs only when someone reads, digests, and acts on the information.

So what does evidence based medicine (EBM) look like in the hurly-burly of a clinical setting? Most of the time we would see nothing different: history taking, examination, test ordering, empathetic care, communication with patients, etc. EBM is an episodic and cumulative method for updating and improving our knowledge base. From interviews with EBM practitioners in several disciplines, and from my own GP practice, there are three key activities I would look for to see whether an individual or team was using an evidence based approach:

1. A clinical questions “log book.” While good clinicians have vast knowledge in their clinical niche, knowing everything is impossible. Several times a day, or even per consultation, questions arise. Some might be answered immediately; some will be deferred. Before the question is forgotten, it needs to be recorded in a “log book”—whether paper, electronic, or even a shared whiteboard. On ward rounds, it may be an “educational prescription” given to one of the team, with a copy kept for later follow-up. Some questions may be answered that day, some over tea, some by email to a clinical librarian or literature searching service, depending on time, skills, and resources. But not just any answer will do—the searcher should have the skills to find and appraise the best available evidence (there are whole books and courses on this, so I won’t expand here).

2. An evidence based research alert service. For some important new evidence, we might not even have imagined the question, but will need to be alerted to it. Of course, we are immersed in bad alert services: news columns, colleagues, the tables of contents of a few favourite journals, etc. Making sense of these muddied floodwaters is energy draining. Better to have a trustworthy evidence based team filter the torrents and provide summaries of the few relevant, valid research articles. A good model for an alert service is ACP Journal Club, which scans more than 120 journals, checks new articles for validity (95% fail here), asks clinicians to vote on their relevance to practice [2], and summarises the best.

3. Team discussions of evidence. Important issues may emerge from the log book or alerts that require a team discussion of the evidence—and of what to do about it. Usually labelled a “journal club,” these EBM discussions differ considerably from traditional journal clubs: the topics are based on question log books or evidence alerts, not casual scanning of a handy journal; the discussion uses and appraises the best evidence, not the most readily available; and a clinical bottom line is reached. That bottom line alone may not be enough to implement change: “next actions” may require training, equipment, audits, or further information, which may need longer term follow-up [3].

There will be many other EBM activities. But without these core ones, the edifice of evidence is just a silent tomb full of mummified information that neither touches living clinical practice nor improves the care of patients.

Paul Glasziou is professor of evidence based medicine at Bond University and a part time general practitioner.

1. Shojania KG, Sampson M, Ansari MT, Ji J, Doucette S, Moher D. How quickly do systematic reviews go out of date? A survival analysis. Ann Intern Med. 2007 Aug 21;147(4):224-33.
2. Eady A, Glasziou P, Haynes B. Less is more: where do the abstracts in the EBM journal come from? Evid Based Med. 2008 Feb;13(1).
3. Glasziou P. Applying evidence: what’s the next action? Evid Based Med. 2008 Dec;13(6):164-5.