Competency-based curricula have largely replaced purely knowledge-based curricula in medical education. As assessment of competency has become a seemingly endless task, participants in medical education have often complained that learning and development have been reduced to a series of hoops to jump through or, even worse, a series of boxes to tick.
The development of clinical governance frameworks in the late 1990s formalised the involvement of trainee physicians in the process of clinical audit. Audit became mandated and, as such, became a box to tick. If one could not demonstrate an audit of some description (any, really), then one could not progress.
Consequently, clinical audit is one of the more reviled duties undertaken by trainees (in their own time), as very often the information ‘uncovered’ is simply an explicit statement of an open secret. The time taken to prove an acknowledged reality is usually resented by the auditor, and the recipients of the news that their practice falls below expected standards aren’t usually overjoyed. The result of such projects is commonly a list of recommendations, presented in the last week of an attachment by a junior member of the team, that will be agreed by all but actioned by no-one. (Only around 5% of audits ever make any difference to practice.)
Quality improvement projects have been lauded by many (me included) as an answer to the problems with clinical audit: the burden of data required to make changes is smaller; the measurements and standards can be set by the instigators, and can be flexible enough to actually be achieved; and the change process is embedded as a primary aim within the most common methodologies employed.
Having been adopted into many curricula, quality improvement is now suffering many of the same problems as clinical audit. The projects are usually carried out in trainees’ own time, yet are a mandated part of training – leading to resentment. The subjects tackled tend to be huge (‘We need a new IT system – the current one is not fit for purpose’) or focused on another team’s practice (‘The radiology department needs to be quicker at doing the tests we ask for…’). The doctors participating in a QI project often arrive with a solution in mind (‘We will just get a bit of data – do what they did at my last hospital – and then we’ll show an improvement’) without really understanding the problem in its current context.
Sadly, the result is that some of the most powerful tools for driving change within organisations have been reduced to a ‘tick’ on an assessment sheet, done as last-minute efforts to scrape through the next annual progression check.
This does not mean that audits are inherently useless, or that QI projects should be abandoned as a tool for engaging junior doctors in understanding how to improve clinical practice. What it means is that, if a job is worth doing, it is worth doing it properly…
To do a job properly, one must know what is required, and what the best tools for the job are. Not everything can be part of a QI project, and not everything needs auditing. A paper republished in this month’s PMJ is an excellent exploration of the different ways in which changes can be evaluated, and this can be reverse-engineered, allowing potential change agents to know if they are setting off down the wrong road. It also reminds us that there are more options for change efforts available than the simple ‘before and after’ audit, or the use of multiple PDSA cycles.
Audit and QI are not the only areas where the adage of ‘doing a job properly’ applies – as I discussed recently, all of the assessments we use to monitor competency are well intended and, when used enthusiastically and correctly, can uncover unexpected learning from even the most mundane of clinical encounters. It is probably true that if something has been ‘reduced to a tick-box’, then someone once thought that box was worth ticking. By taking the time to understand the theory and background of where the box came from, we might find ourselves using the tools available to us properly, and learning something in the process.