Prober et al recently voiced concerns about an over-reliance on USMLE Step 1 scores in determining residency posts in the United States. They highlighted the test's predominantly scientific and clinical content, the fact that the more competitive specialties require higher scores, the added stress and anxiety learners face in order to score well, and the costs involved in preparation (1). They call for a more holistic approach to residency selection, one that incorporates not simply knowledge-based scores but also assessment of on-the-job attributes such as team working and professionalism. The authors also highlight the need to give credit for additional achievements such as research experience, leadership, and performance during rotations. I fully agree that selection into postgraduate training should not rely solely on scientific or clinical knowledge and that a range of additional attributes should be credited. However, the extent to which the latter are credited may prove problematic.
In the UK, entry into foundation training currently involves a combination of an educational performance measure (EPM), comprising medical school performance, additional degrees, and publications, and a situational judgement test (SJT) focused on the professional attributes expected of a foundation doctor: commitment to professionalism, coping with pressure, effective communication, patient focus, and working effectively as part of a team (2).
Following formal implementation of the SJT in 2013, evidence confirmed its value as a reliable tool able to differentiate between applicants, with the majority of operational items classified as good or moderate in terms of their psychometric properties (3). In 2014, however, it was noted that coaching for the test increased the likelihood of scoring highly on items, making the continual development of new item content essential (4). Interestingly, there were significant differences in test performance by ethnicity, place of training, age, and gender, with female applicants outperforming male applicants and white applicants outperforming those from black and minority ethnic groups (4).
In 2013, a total of 7770 applicants were asked for their reactions to the SJT. Only 52.5% thought that the content seemed relevant to what they considered the role of a foundation doctor should be (vs 57.1% in 2014 and 56.5% in 2015) (3,4,5). Just 38.6% agreed or strongly agreed that the content of the SJT appeared to be a fair way to select for the foundation programme (vs 40.4% in 2014 and 40.6% in 2015), and 25.4% agreed or strongly agreed that the results of the SJT could help selectors differentiate between weaker and stronger applicants (vs 26.1% in 2014 and 26.3% in 2015) (3,4,5). Students commented that it was less fair than the EPM, with some applicants finding it difficult to rank several correct options in order. Perceived fairness of the SJT also appeared to be an issue in 2014 and 2015 (3,4,5). Currently the SJT and EPM are weighted equally, with a maximum of 50 points available for each (2).
Longitudinal research is still required to evaluate the extent to which SJTs predict performance throughout the medical education pathway, from medical school admissions through to independent clinical practice and beyond, and existing evidence shows that SJTs have different predictive validity at different stages of medical education, training, and practice (6). Furthermore, I would argue that it is unfair to assess candidates so prematurely, before they have gained sufficient on-the-job experience. We are all aware that trainees achieve competency at different rates, and a desire to assess too soon could actually result in further detriment. Trainees need time to gain sufficient working exposure and to develop on-the-job skills. If this were a simple task, postgraduate training would not involve many years of intense specialisation. I look forward to a fairer and more balanced approach.
1. Prober et al. A Plea to Reassess the Role of United States Medical Licensing Examination Step 1 Scores in Residency Selection. Academic Medicine. August 2015.
2. UKFPO. Situational Judgement Test FAQs. 2015.
3. Patterson F. Analysis of the Situational Judgement Test for Selection to the Foundation Programme. 2013.
4. Patterson F. Analysis of the Situational Judgement Test for Selection to the Foundation Programme. 2014.
5. Patterson F. Analysis of the Situational Judgement Test for Selection to the Foundation Programme. 2015.
6. Patterson F et al. Situational judgement tests in medical education and training: research, theory and practice: AMEE Guide No. 100. Medical Teacher. 2015:1-15.
Neel Sharma graduated from the University of Manchester and did his internal medicine training at The Royal London Hospital and Guy’s and St Thomas’ NHS Foundation Trust. Currently he is a gastroenterology trainee based in Singapore.
Competing interests: None declared.