Kieran Walsh: “Tests ain’t fair. Those that study have an unfair advantage.”

Assessment is a core component of medical education. Medical students must undergo continual examinations. Postgraduate trainees must pass their annual assessments. And fully qualified doctors must overcome the twin hurdles of appraisal and revalidation. Exams are like death and taxes—you can’t avoid them. Is there anything good to say about exams? Probably only that they force us to study. Although he was speaking in jest, Allan Dare Pearce was right when he said that “those that study have an unfair advantage.” So how can we use that unfair advantage to the wider benefit of learners and patients?

There is a growing interest in test-enhanced learning in medical education. There is a great deal of evidence that, when learners are tested, they remember more content than when they are not tested. This “testing effect” is stronger than that of additional study: taking a test results in better retention than spending the same time restudying. There is also evidence that the way learners are tested can have a big influence on their learning. [1]

Interval testing is a good way of reinforcing learning. End-of-year exams encourage cramming of material at the end of the year. And this material is then quickly forgotten. By contrast, interval testing at regular times throughout the year encourages continuous learning. Interval testing results in a range of testing effects.

There are pre-assessment effects whereby learners study more in advance of the test. Students will learn whatever it is that will help them pass. This puts examiners in a strong position. If you want learners to learn about the practical application of knowledge that will help their patients, then tell them that you are going to test them on it. And then test them on it.

There are true assessment effects—where students learn during the test. When I did the practical part of my postgraduate exams, I learned quite a lot about the patients and the diseases that I was tested on. Twenty years later I can remember them vividly. One man was a retired docker from Belfast with lung disease caused by asbestos. The intense concentration that the testing conditions encouraged meant that I will never forget what I learned.

Lastly, there are post-assessment effects—these may be the strongest. If you get detailed feedback following a test, then this will help you to concentrate more on areas of weakness. All tests can be a form of learning needs assessment. It doesn’t matter if a test is formative or summative—it can still give the learner a picture of their strengths and weaknesses and what they should concentrate on in the future.

At BMJ Learning, we create short assessments to go with the learning resources. We also create assessments to go with educational articles from The BMJ. These assessments are made up of multiple choice questions (MCQs). Some people like MCQs and some people don’t. I think that the reason some people don’t like MCQs is that they are hard to write and so often poorly written. At BMJ Learning, we try to create valid and reliable questions that are about the practical application of knowledge that will benefit patients. We avoid ambiguity—which is a big pitfall when writing questions. My memory of exams at medical school is that the questions all looked like this one:

“The way to a man’s heart is through his

A aorta

B pulmonary arteries

C pulmonary veins

D stomach”

I spent the exam wondering what on earth the examiner was getting at. I am still not sure. I guess it depends. I really should have gone to more lectures.


  1. Larsen DP, Butler AC, Roediger HL III. Test-enhanced learning in medical education. Med Educ 2008;42(10):959-66.

Kieran Walsh is clinical director of BMJ Learning and BMJ Best Practice. He is responsible for the editorial quality of both products. He has worked in the past as a hospital doctor—specialising in care of the elderly medicine and neurology.

Competing interests: Kieran Walsh works for BMJ, which produces the online clinical decision support tool BMJ Best Practice.