Towards More Thoughtful Evidence Communication

Reflection by Aleksi Raudasoja

At the end of my medical school training, I had an identity crisis as a doctor. In medical school, I had been taught to follow practice guidelines and was often told that they represent the best available evidence. Nice, I thought, sounds like I'm not going to make mistakes as long as I'm following them. But then I took a deeper look and was quite confused to find that many of these topics were actually controversial according to the leading medical journals. Who to trust now? It was a wake-up call for me. I realized that whenever I really want to know something, I need to try to find the evidence myself. Although I'm thankful not only to my teachers but also to the guideline panelists who have made my work as a doctor much easier, they remain authority and eminence to me, not the evidence itself.

To understand these controversies, it is essential to learn about the cognitive biases that shape our thinking. Looking at recent events in British or American politics, one might quickly conclude that most people do not base their decisions on facts. I think it is the same in medicine: doctors, too, cannot make their decisions based solely on facts. It is an illusion to think that your own group and other like-minded individuals are the ones who make the most rational choices.[1] For example, people tend to favor evidence that supports their preconceived opinions, and in conflict situations they favor opinions presented by people who share their group identity.[2]

Let's consider what these biases could mean in real life. Suppose you have been treating seriously ill patients with a certain condition using a standard approach, confident that your patients would have died without the treatment you gave them. Then someone tells you there is robust evidence showing that your treatment has no effect. Would you find out what that evidence is and, after agreeing that it shows no effect, stop using that medication for the next patient with a similar condition? What if the patient dies after the original treatment regimen has been stopped? Would you go back and initiate that treatment in future patients? Did I remember to mention that the other doctors in your department use this treatment, and that it is also recommended by guidelines? Now imagine yourself as an experienced doctor who has worked for decades. Someone shows you robust evidence that the treatment you have been using harms patients. Would that be easy to accept?

Biased decision making may delay the adoption of new evidence into clinical practice and sometimes provoke fierce opposition that later seems unreasonable.[3],[4] To avoid these problems, considering how new evidence will be received should always be part of communicating it. People don't like to be told that they are wrong. We also fear being wrong. We want to hear that we are doing a good job. We want recognition and admiration from our coworkers and patients; to hear that our care meets high ethical standards and that we are nice to work with. So why not say these things before starting to discuss the evidence? In my experience, nearly all doctors try to do their best for their patients, even though they sometimes harm them. Would it be easier to accept your mistakes after being recognized as a competent decision maker rather than being cast as a person who has made a mistake? This was examined in an American study of 82 undergraduate students: the authors divided the students into two groups and showed that, after receiving positive feedback, students' attitudes shifted more in response to counter-attitudinal evidence than did the attitudes of those who received no feedback at all.[5]

Another way to make your message easier to accept is to stay humble. Have you ever been in an intense situation where someone suddenly admits that they made a mistake and apologizes for it? How did that affect you? Did you think 'what a stupid person' or 'oh, this is a wise person who can learn from their mistakes'? As humans we want to do the right thing, but often we simply cannot. So why not start teaching medical students and the lay public how fragile we often are as decision makers? I believe that, were we to do so, we would also be seen as 'wise people'.

Acknowledgments: Thanks to Dr. Kari Tikkinen and Dr. Ray Moynihan for their insightful comments on this article.


[1] Pronin, E. (2007). Perception and Misperception of Bias in Human Judgment. Trends in Cognitive Sciences, 11(1), 37-43.

[2] Lord, C.G., Taylor, C.A. (2009). Biased Assimilation: Effects of Assumptions and Expectations on the Interpretation of New Evidence. Social and Personality Psychology Compass, 3, 827-41.

[3] Antman, E.M., Lau, J., Kupelnick, B., Mosteller, F., Chalmers, T.C. (1992). A Comparison of Results of Meta-Analyses of Randomized Control Trials and Recommendations of Clinical Experts: Treatments for Myocardial Infarction. JAMA, 268, 240-8.

[4] Diamond, G.A., Kaul, S. (2007). COURAGE Under Fire: On the Management of Stable Coronary Disease. Journal of the American College of Cardiology, 50(16), 1604-9.

[5] Cohen, G.L., Aronson, J., Steele, C.M. (2000). When Beliefs Yield to Evidence: Reducing Biased Evaluation by Affirming the Self. Personality and Social Psychology Bulletin, 26, 1151-64.
