
If a job’s worth doing…

13 Jul, 15 | by Toby Hillman

Cryptic clothing label

Image via WM Jas on Flickr

Competency-based curricula have largely replaced purely knowledge-based curricula in medical education.  As the assessment of competency has become a seemingly endless task, the participants in medical education have often complained that learning and development has been reduced to a series of hoops to jump through or, even worse, a series of boxes to tick.

The development of clinical governance frameworks in the late 1990s formalised the involvement of trainee physicians in the process of clinical audit.  Audit became mandated, and as such, became a box to tick.  If one could not demonstrate an audit of some description (any really) then one could not progress.

As such, clinical audit is one of the more reviled duties undertaken by trainees (in their own time), as very often the information ‘uncovered’ is simply an explicit statement of an open secret.  The time taken to prove an acknowledged reality is usually resented by the auditor, and the recipients of the news that their practice falls below expected standards aren’t usually overjoyed.  The result of such projects is commonly a list of recommendations, presented in the last week of an attachment by a junior member of the team, that will be agreed by all, but actioned by no-one. (Only around 5% of audits ever make any difference to practice.)

Quality Improvement projects have been lauded by many (me included) as an answer to the problems with clinical audit:  the burden of data required to make changes is less, the measurements and standards can be set by the instigators, and can be flexible enough to actually be achieved, and the change process is embedded as a primary aim within the most common methodologies employed.

Having been adopted into many curricula, quality improvement is now suffering many of the same problems as clinical audit. The projects are usually carried out in trainees’ own time, but are a mandated part of training – leading to resentment. The subjects tackled tend to be huge (‘We need a new IT system – the current one is not fit for purpose’) or focused on another team’s practice (‘The radiology department need to be quicker at doing the tests we ask for…’).  The doctors participating in a QI project often come with a solution in mind (‘We will just get a bit of data – do what they did at my last hospital – and then we’ll show an improvement’) without really understanding the problem in its current context.

Sadly the result is that some of the most powerful tools for driving change within organisations have been reduced to a ‘tick’ on an assessment sheet, and are done as last-minute efforts, to scrape through the next annual progression check.

This does not mean that audits are inherently useless, or that QI projects should be abandoned as a tool for engaging junior doctors in understanding how to improve clinical practice.  What it means is that, if a job is worth doing, it is worth doing properly…

To do a job properly, one must know what is required, and what the best tools for the job are.  Not everything can be part of a QI project, and not everything needs auditing.  A paper republished in this month’s PMJ is an excellent exploration of the different ways in which changes can be evaluated, and this can be reverse-engineered, allowing potential change agents to know if they are setting off down the wrong road.  It also reminds us that there are more options for change efforts available than the simple ‘before and after’ audit, or the use of multiple PDSA cycles.

Audit and QI are not the only area where the adage of ‘doing a job properly’ applies – as I discussed recently, all of the assessments we use to monitor competency are well intended, and when used enthusiastically and correctly, can uncover unexpected learning from even the most mundane of clinical encounters.  It is probably true that if something has been ‘reduced to a tick-box’ then someone thought that box was worth ticking at one point.  By taking the time to understand the theory and background to where the box came from, we might find ourselves using the tools available to us properly, and learning something in the process.

 

I am conflicted…are you?

12 Jun, 15 | by Toby Hillman

via Tambako on Flickr

 

I am conflicted… and it is down to a couple of papers in this May’s PMJ that look at the development of a new tool for assessing the performance of trainees in a key medical task.

Most nights – or at least two a week – I spend a portion of my evening logging into the e-portfolio system for medical trainees, trying to fill in several online forms to reflect the practice and learning of doctors that I have worked with over the past few weeks.

There is an array of choices to make, and choosing the right assessment for each task can be a bit difficult – you must know your SLE from your WPBA, your Mini-CEX (pronounced ‘kehks’ to avoid worrying conversations) from your DOPS, and woe betide anyone who mistakes their MCR for an MSF, or a CBD for an ACAT.  By the way, none of these is made up.

I find it difficult to make time in the day to fill these forms in with the subject of them sitting alongside me, but I do try to make an effort to build at least one or two learning points into each form to make them more useful than just a tick in a box on a virtual piece of paper.

The conflict I have is that these forms often feel like soul-less, mechanistic hoops that trainees simply have to plough through to enable progression to the next level in the platform game that is a training career in medicine in the UK. Some days I would like nothing more than to ditch the whole enterprise, and head back to the good old days where apprentice medics would work alongside me, learn by osmosis and through trial and error.

However, there are other days when the format of an assessment, or the very fact that a trainee has demanded one, provides the opportunity to frame a discussion around an event, an experience, or an interaction that requires more attention – where real learning can take place during a discourse about what went well, what went less than ideally, and what could be improved for the future in someone’s practice.  At these times, I am grateful that I don’t have to make up an assessment on the spot, but that there is a framework to formulate my feedback, provide a breakdown of areas to concentrate on, and give direction for where to find help and resources to improve.

The papers that have provoked my feelings of conflict look at a project in the West Midlands to develop a tool for assessing trainees’ performance in conducting ward rounds in the paediatric department. One describes the creation of the tool, and the other looks at its reliability and practical use.

The end product is a multi-source feedback tool that does what it says on the tin, and reliably so.  It has similarities to other assessments already in use, but crucially focusses on a narrow, but important and ubiquitous part of medical practice – the ward round.

The development of the tool started in response to a realisation that ward rounding is an essential skill, and yet is not usually assessed formally in training.  It is one of those tasks or set-piece rituals that is learned by osmosis.  I think there are other areas that are similarly neglected too… responding to conflict within the MDT, responding to angry patients or complaints, effective handover between shifts, debriefing after significant events – or even after every shift – chairing meetings, reporting to a committee, and so on…

Should we, therefore, have tools for each of these areas, with specific numbers required by trainees in each post, to demonstrate competence?  I can imagine the response if this suggestion were taken up wholeheartedly for each vital part of a consultant job that is not at present explicitly covered in a WPBA (workplace-based assessment).

So no, if we don’t want to be over-burdened by assessments, and end up with a fully tick-boxed CV, we should therefore rely on the education methods of old… in those halcyon days of yore when registrars still knew everything, and would fledge into consultant form without having had to get anything ‘signed off’ on an e-portfolio, but would be vouched for in references and conversations over sherry.

Clearly neither of these scenarios could be considered perfect, but where do we draw the line?  As with targets in all industries – what gets measured gets done, but what gets measured is not always what ought to be measured.

As we become slightly more reductionist in our thinking about medical education, we risk hitting the target but missing the point as we try to encompass all that is important about being a senior clinician in formalised assessments – but I am also convinced that training in the good old days probably wouldn’t be up to the job of training senior physicians and surgeons for the modern world of healthcare – so I remain conflicted…

The tool the authors have developed looks promising, and I intend to use it to help registrars start thinking more objectively about how they conduct their ward rounds – and to improve my own practice – but I can’t help thinking that I might just miss something else if I only stick to the tools available to me in the e-portfolio.

Service, safety and training – a tricky trio.

16 May, 15 | by Toby Hillman

The National Health Service is more than a health service; it is perhaps one of the biggest postgraduate universities in the world.  Within the corridors, operating theatres, and wards of the hospitals in the UK, healthcare professionals are learning.

They are taught by example every day, and increasingly are allocated time out of the service to learn at dedicated teaching days / seminars and courses.

This key role of the state-funded health service in the UK can sometimes be forgotten, or hidden away under the demands to provide a service to the sick and needy who are entering hospitals in ever-increasing numbers.  But ask patients in any of the teaching hospitals in the UK, and I am sure that they will be familiar with the request for a student to practise taking a history, or performing a clinical examination.   Alongside students, there are many more trainees of different levels of seniority who also ask permission to learn from patients: patients consent to procedures where a trainee will be carrying out the procedure, under the supervision of a colleague who is fully qualified.

This type of learning is essential to ensure that the next generation of doctors is suitably skilled and qualified to deal with the problems they are to encounter during their careers.  These procedures might be simple – like inserting a cannula, or a urinary catheter, or far more complex.

Recently there have been pressures on this style of training.  Opinions differ on the relative impact of each development, but the European Working Time Directive, competency-based curricula, formalised workplace-based assessments and streamlining of the training career ladder have all affected how we train the next generation of consultants.

The increasing concern for patient safety, and the increasing awareness of patients about potential complications, have meant that invasive procedures are less often carried out by general teams, and are instead performed by specialists in more controlled environments – conferring undoubted benefits on the individual patient receiving the treatment.

This situation leaves us with a tension – trainees need to train, patients require a service, and patients need to be safe.  To train safely, trainees require willing patients, supervision, and opportunities to learn techniques in a safe, supervised environment. Increasing pressures on services have led to a situation where taking time off the ward to attend such opportunities seems beyond reach, and negatively impacts on the care of other patients within the same service.

BUT – emergencies happen, our trainees are usually the first on the scene, and will need skills usually developed in elective procedures to deal with the emergency confronting them.

So, in the modern world, are we balancing this tension – are we giving trainees the chances to develop the skills we expect of them, whilst ensuring the patients who kindly offer the opportunity to trainees to learn are safe – both electively and in the emergency setting?

A paper published recently online in the PMJ takes a look at this question in one area that sits right in the middle of this conundrum – the insertion of intercostal chest drains.

This core skill for general physicians is increasingly becoming the preserve of respiratory specialists, and even then, is becoming the preserve of sub-specialists.

The paper looked at attitudes, experience, and training in chest drain insertion.  The results are interesting, and pose very important questions for those who train general physicians, or any trainees where procedures are considered a core skill.

Overall, there was consensus that general medical registrars (general physicians) should be able to place chest drains, and that the procedure should not become a specialist only activity.

So – general medical trainees should be trained… but how much did they think was required?

Overall, trainees and consultants agreed that to be considered competent, an individual must place at least 5-10 chest drains, and to maintain this competency, must place 5-10 per year thereafter.

And… how did they do compared with their own standards?

Higher trainees (senior residents), who are most likely to be the ones called on to perform these procedures urgently, had, in the main, inserted the suggested number of drains to be considered competent.

But only 5% of those who weren’t Respiratory trainees had been able to maintain their competency – as defined by their own standards.

So – as the authors conclude, chest drain insertion is a vital procedure for a service to be able to provide, but those we rely on to provide this service cannot, by their own admission, maintain the necessary competence.

This is a worrying admission to make, and should ring alarm bells for those managing acute medical services, and those charged with the education of doctors within the university that is the NHS.

The solution will not be a quick fix, but it seems that the relationship between training, service and safety has changed in recent years.

This tripod is a tricky one to balance, but if one leg grows out of proportion to the others, something is bound to fall over…

Picture by RetSamys

The beauty of the written word?

21 Apr, 15 | by Toby Hillman

New Font "Doctor's Handwriting"

Of the essential skills for doctors, writing has to be up there as one of the most important.  Doctors’ handwriting has been the butt of many jokes over the years – justifiably – and written prescriptions remain a significant source of error in hospitals up and down the land.

The medical notes are another area where the handwriting of doctors is often held up to scrutiny.  In days gone by, the registrar would be the doctor who registered the opinion of the consulting physician, and before that notes were kept mainly for the interest and records of the physician themselves – helping in the discovery of syndromes and new disease entities through meticulous observation, and synthesis of the salient features of cases over time.

And now – what are the medical notes now?  The medical notes are no longer the preserve of physicians, with entries from the whole MDT contributing to the story of admission, recovery, discharge planning, and plans for the future.  But the written history remains a key part of the journey of every patient.  It is often in those initial observations, descriptions of symptoms and signs that the keys to a diagnosis can be found.

The changes in medical training models, staffing models, and working conditions have also increased the importance of the initial history, and the medical notes as a tool for communicating the key aspects of a case to the team that follows after.  Given that a patient may be cared for by a different doctor almost every day of their admission, written notes are more than ever the most reliable repository of clinical information.

For such a key part of the medical system, one might expect that there is a rigorous training scheme, with competencies, certificates, seminars and reflective practice to ensure that all the appropriate details are recorded, communicated and understood by those who come along afterwards to care for the patient and understand the complexities that make up every patient and their story. Unfortunately there is little work in the literature that supports the training of medical students in written communication.

A study published online for the PMJ looked at the development and evaluation of a scheme that aimed to tackle the lack of formal training in constructing written histories, and to support trainers in the evaluation of medical students’ efforts at a ‘clerking’.

They developed a study with three arms – one of standard practice, one with additional training for students in communication, and a final arm with training in communication combined with training for residents on how to give feedback using the RIME tool.  The combined intervention showed positive results, with statistically significant improvement in clerking scores between the start and the end of the study.  There was also a correlation between good handwriting and overall quality of the histories – a correlation that could have been one of the key messages of the paper.

In addition, the approach that the authors took – not simply ‘educating’ students, but working to create an environment where useful feedback is given using a consistent and routinely applied tool – is a good lesson for anyone trying to improve educational interventions, and an important message from this paper.

However, I think we need to look a little more critically at what we as a profession are trying to achieve with the education we offer our students.

I often think that we are training the doctors of the future for the hospitals of yesterday – and we know that a significant proportion of what we teach our students will be wrong by the time they enter the workplace.

So when we look at the direction of movement in the medical world when it comes to written notes, perhaps we need to take a leap forwards in terms of what we are teaching our students.

A few years ago patients were hardly ever copied into their clinic letters – now it is accepted as the norm. This increasing access to medical records is gathering pace in a far more profound way in the US through the open notes movement, where physicians and patients share near equal access to the medical notes, changing the balance of consultations, and written records, and ensuring that patients can play a far more active role in the management of their illnesses.

The other transformation is from handwriting to typing, and in the near future, to voice recognition. Electronic health records are also transforming the skills required to be a physician.  No longer are a trusty fountain pen and a good dictation speed the marks of a skilled communicator in the written form.  Now we shall have to be proficient at forms of communication that are more immediate, direct, and perhaps more open to misinterpretation (tone is notoriously difficult to convey in quick electronic communiqués).

Training medical students in how to construct and communicate a history is vital, but we must keep in mind how our workplaces are changing, and how our communication is no longer necessarily directed to other medical professionals, but in fact towards the subject of the note.  This will require not only the skills encapsulated in the Reporter, Interpreter, Manager, and Educator (RIME) framework, but also those that ensure a doctor can be a partner in managing a chronic disease.

 

Observe, record, tabulate, communicate…

31 Mar, 15 | by Toby Hillman


© CEphoto, Uwe Aranas, via Wikimedia Commons

When I was knee high to a grasshopper, I had a teacher who used to be incredibly irritating.  Instead of getting away with a lucky guess, or a grasp at a faded memory, we had to be able to ‘show our workings.’  This meant we had to understand where our answers came from, from first principles, and learning by rote wasn’t going to cut it.  At the time this was infuriating, and led to a whole load of extra work. However, now I realise that she had started me on a learning journey that continues on a daily basis.

This insistence on understanding the basis for an argument or fact has been a common feature amongst a number of my most inspiring tutors over the years since.

One particular tutor was Dr Alan Stevens. He was a pathologist at my medical school and was assigned to me in my first year as my tutor. Pathology made up quite a significant portion of the syllabus in our first years, and what a bore – hundreds of blobs of pink, blue, and occasionally fluorescent green or yellow. And all of these colours were swimming before my eyes in a lab that seemed a million miles from the wards where the ‘real’ work of a hospital was under way.

So when Dr Stevens took us out for a meal in the week before our yearly finals (another insistence that good wine and good company made for better performance than late nights cramming in an airless library – I still nearly believe this one) and started to explain how pathology is the basis of knowledge of all disease, I was a little upset.  As with most medical students, I was sure I knew best and knew what I wanted to learn, so pathology remained one of those subjects that was somewhat neglected in my revision schedules.

However, once I hit the wards, I rued the day I forgot to ‘show my workings’.  As I encountered diseases I knew the names, and symptoms of, but had a sketchy understanding of the pathology or pathophysiology, I struggled from time to time with working out why a specific treatment might help, and how treatment decisions were being made.

A paper in this month’s PMJ may appear to be one of those that a casual reader would skip entirely owing to the title, or the description. A clinicopathological paper on fulminant amoebic colitis may not have immediate relevance to my work, but the paper is an example of how medical knowledge has expanded over the years: a clinical question, borne out of experience, is subjected to scientific examination and analysis, in an effort to move beyond the empirical approach to disease.

The paper looks at the clinical features, pathological findings and outcomes of patients admitted to an 1800-bed tertiary care centre in Western India who underwent colectomy and were diagnosed with amoebic colitis.  Thirty patients were included in the study, and the mortality rate was 57%.

Various features are explored – with some of the information flying in the face of traditional teaching.  For example, the form of necrosis encountered in the study was not that traditionally associated with the disease – and could lead to a change in practice in the path lab, potentially allowing a more rapid diagnosis. (In the study population the authors found basophilic dirty necrosis with a neutrophil-rich inflammatory exudate, versus the eosinophilic necrosis with little inflammation usually reported in textbooks.)

The authors also pose some interesting questions in their conclusion regarding their observed increase in disease incidence – relating to many of the current woes in clinical medicine.

Overuse of medication is suggested as a contributing factor to the increased incidence of amoebic colitis. The authors postulate that indiscriminate use of antacid medications may be promoting the increased incidence by allowing amoebic cysts to survive transit through the stomach.  This mirrors some of the concerns about the (over)use of PPIs promoting C. diff infections in the UK.  In addition, lifestyle factors are suggested as contributory – a reduction in dietary fibre can increase colonic transit time, increasing opportunities for the amoebae to adhere to the bowel wall – and the organism itself may be changing in virulence.

So whilst I may not have learned a great deal that I will employ next time I am in clinic, this paper is a great example of the value of close observation over time of the population one serves, maintaining an enquiring mind about the pattern of disease encountered, and then subjecting such notions to scientific scrutiny – eliciting new knowledge, new questions for research, and returning this information to the clinical field to improve practice, and hopefully change outcomes for patients of the future. Osler would be proud.

 

 

Our caring profession

16 Mar, 15 | by Toby Hillman


Anatomy of a Junior Doctor – Eoin Kelleher

 

The rigours of life as a junior doctor are well described, both in popular modern classics like The House of God by Samuel Shem and the television series Scrubs, and in lesser-known works, like A Country Doctor’s Notebook by Mikhail Bulgakov.

There are common themes – imposter syndrome, fear of killing patients, bullying seniors, long hours, mental and physical exhaustion.

There is no doubt that working conditions have improved somewhat from those experienced by Bulgakov in revolutionary Russia, but the first months and years of clinical practice remain a difficult time.

A paper in the current issue of the PMJ looks at a group of junior doctors who face additional challenges, and examines their coping strategies.

Dyslexia is considered to have a global prevalence of around 6%, and in the medical profession the rate of declaration of dyslexia amongst medical students is around 2%, and rising.  The paper highlights the difficulties that people with dyslexia face, and the potential impacts these would have on doctors who have just entered into their professional roles.

All of the FY1 grade doctors in Scotland were asked if they would take part in the study, and 9 agreed.  This could represent about 40% of the junior doctors in Scotland who have dyslexia, so the study provides quite an insight into their experiences.

One question that interested me was whether the subjects had disclosed their dyslexia to colleagues.  The report states that only a few had done so.  The reasons for this were varied.

Some felt that to disclose a problem like dyslexia might be considered by others as ‘help-seeking’, or as an excuse for poor performance, which would mark them out as different from the ‘neuro-typical’ house officers, with the attendant problems this might produce.  Shame was a factor in some decisions not to disclose, and there was anxiety amongst the subjects about the impact of dyslexia on their future careers – owing to the difficulties with written exams – and subjects were aware that dyslexia could become a reason for bullying.

Only a minority had actually disclosed their dyslexia to others, and they seemed to have benefited – with a wider range of coping strategies available, particularly in troublesome settings like ward rounds, or presenting cases in MDTs. One subject had made use of a ‘buddy’ system for writing on ward rounds.

The issues that this paper highlights around disclosure of dyslexia throw up questions for us all about how we as a profession treat our colleagues – not only those with dyslexia, but anyone in our profession who might be suffering from an illness that is not immediately obvious.

My most recent blog tried to highlight that doctors remain humans, despite their attempts to control physiology, master illness and manipulate tissue. As such, we are at the mercy of the cognitive biases that have been discovered in other professional groups, but we also need to realise that we are at the mercy of our own biology just as much as those patients we try to help. And yet, as a profession we still take pride in being robust, if not indestructible, and the prevailing opinion is generally that admitting to an illness or a struggle is beyond the pale.  This is reflected in ubiquitous anecdotes about ‘never having had a day off sick in x years’ or ‘the only reason I got any treatment was because I clerked myself in.’

However, when studied objectively, residents in the US reported the feeling that there would be both empathy for colleagues who missed work through illness and a concurrent risk of being ostracised from their peer group.  This tension reflects both the caring nature of our profession and the seemingly excessive expectations we place on ourselves and our colleagues when it comes to stamina and resilience.

I would not advocate moving to a world where the slightest hiccough sends us running for the duvet, but equally, if colleagues in one of the most stressful periods of their careers cannot turn to peers and supervisors for help for fear of being ostracised, then the hidden curriculum has swung the wrong way.

Still only human

13 Feb, 15 | by Toby Hillman

A perfect specimen?

There is something different about medics.  We stand out at university – often forming into a clique that others find difficult to fathom, break into, or tolerate.  We strive to be different in many ways; we learn a huge range of facts and figures, along with new languages (we are taught about everything from the arachnoid mater to xanthelasma, via dysdiadochokinesia) and new ways of behaving – “Hello, my name is…. I’d like to examine your chest if I may?”

This difference has been reinforced over centuries, helped along by the formation of royal colleges, and more recently, by real successes in actually curing some diseases, and managing others so that hospitals are no longer feared as places of death, but instead as places of relative safety for those needing their services.

I think that this paper in the January edition of the PMJ may help to take us back to our roots a little.  The paper is a quality improvement report looking at the impact of a mnemonic device on the completeness of information recorded in the notes in a paediatric department.  The problem was that documentation was of a poor standard, impairing the investigation of complaints and incidents.  The solution used an acrostic to help junior doctors record the important aspects of care that are encompassed within the post-take round.

Results were impressive, showing an increase in completeness of the notes in areas that were previously neglected, including parental concerns, fluid prescriptions, nursing concerns, and investigations.  Understandably there was less increase in areas that had been previously well documented – the final plan, vital signs, presenting problems, and examination findings.

So we can see that, in a time-pressured, complex situation, the junior members of a team find that they are better able to record relevant information when following a set pattern of information recall / record for each case.  This is not perhaps a Nobel-worthy discovery, but it is an important contribution to the ongoing realisation in our profession that there are tools and techniques we can use to enhance our practice, and improve safety and outcomes of the processes we use in our daily work.

Many of the ‘new’ ideas in healthcare like LEAN, six sigma, crisis resource management, human factors training, pitstop handovers, checklists and so on have origins outside of medicine, and in other high-risk, high-reliability, or high value organisations.  The impact of these ideas though can be significant, and in some cases hospitals have been impressed enough to adopt philosophies from industry wholesale – notably the Royal Bolton Hospital.  The medical profession itself though is usually somewhat more reluctant to adopt these concepts, and apply them in practice.

The resistance to checklists, communication methods like SBAR, and other tools that seem to constrain clinical autonomy provides an interesting point to consider.  Is there something inherently wrong in encouraging medics to communicate or work in standardised ways?

Well, no. The ALS algorithm – much maligned by those who have to repeatedly take assessments and refresher courses using the same stock phrases, and act out scenarios that have an uncanny knack of ending in a cardiac arrest – has had great success.  Indeed, when you think of the teams that work in any hospital, the arrest team is one of the most efficient in terms of understanding a common purpose, using a common language, and following a set pattern of actions.  This process even works across language barriers, as Dr Davies showed in this article.

And yet, there is always something uncomfortable about being asked to write / think / talk / communicate in a particular way as a medic.  Is this because we are somehow different from those other human beings working in complex, challenging environments?

My feeling is that perhaps we aren’t entirely to blame for our reluctance to adopt these ‘new’ ideas of working.  The hubris required to enter chaotic, painful, emotional situations, take control, decide on a decisive course of action, and do this within a very short space of time is bred into us from the point at which we decided to become doctors.  As I said at the start – we medics are different – and have been since we started on our journey to the positions we now hold.

And therein lies the rub. When it comes down to it, we aren’t really different from those we try to guide through the challenges of illness, whether acute, long-term or terminal. We have the same brains, the same cognitive biases and the same susceptibility to distraction, and therefore next time you are asked to follow an acrostic, use a checklist, or submit to a protocol – before rejecting the concept out of hand, consider whether you are doing so because the tool really isn’t fit for the job, or whether you need to follow the advice of Naomi Campbell – don’t believe your own hype.

It’s good to talk…

28 Jan, 15 | by Toby Hillman

Image by Uberprutser via wikimedia commons

When I think about my work on the acute medical unit, or in my clinics, the number of interactions I have with other humans is almost mind-boggling – trainees, consultant colleagues, radiographers, radiologists, professionals from other hospitals, biochemists, nurses, physios, therapists, and of course patients.  As Atul Gawande points out in this splendid article, medicine is now more about pit crews than cowboys, and this level of teamworking brings an inherent babble of communication.

The central point of all of this communication is to provide a service to patients – alleviating symptoms, diagnosing and curing disease, or helping patients to manage long term conditions. It would be incredibly difficult to do any of these core activities in healthcare without communicating effectively with patients.

A paper in the current issue of the PMJ reviews the literature relating to the assessment of communication skills within more senior postgraduate trainees (within two years of completion of training) and those who have already become established in practice.

The paper synthesises the evidence on assessment of communication skills, and draws the rather disappointing conclusion that currently there is little in the evidence to demonstrate benefit from educational initiatives, that there is no definitive, validated tool to evaluate communication skills, and that there is no defined standard of what constitutes good communication in the senior postgraduate, or consultant workforce.

The conclusion is disappointing from my point of view, as I consider communication to be such an important part of my day job; but when I think back to my own training, is really not all that surprising.

In my higher training I cannot think of one training session that used any of the methods reported in this paper to evaluate my communication skills.  However, if the evidence is so heterogeneous, and there is no clear basis on which to build educational efforts to improve communication skills in senior clinicians, is there any indication that such training is even required?

If we stick to the published evidence on this front, a mixed picture emerges again, with two of the referenced papers indicating that communication skills increase with increasing experience, whilst two others showed that communication skills worsen with increasing time in postgraduate training.

But if we go outside the published evidence on communication assessments, and look more at the outcomes of healthcare, we see that deficiencies of communication play a major role in almost all categories of incident resulting in death or permanent loss of function investigated by the Joint Commission (an accreditation body in the US). The Joint Commission estimates that breakdowns or errors in communication contributed to over 50% of post-operative complications, around two-thirds of wrong-patient/wrong-site/wrong-procedure events, and 70% of medication error events.

These events are not the well-controlled, OSCE-style scenarios that are traditionally used to evaluate one-on-one communication skills, but real-life incidents that will have involved all of the complexity of current healthcare provision. Communication in these settings includes so much more than the areas traditionally concentrated on in training programmes.

Email, pagers, telephones, written notes, electronic health records, post-it notes – all of these forms of communication are used in real life, and perhaps the reason for the heterogeneity of evidence about what makes good communication, and the lack of a clear path to improved communication skills, is that we aren’t really looking at all the right areas of communication.  Whilst using appropriate non-lexical utterances, empathetic questioning and establishing rapport with patients are very important, we perhaps also need to pay attention to the wider aspects of communication to start to improve outcomes and reduce the number of events where poor communication underpins the error.

There are some recommendations out there about closed-loop communication techniques, standardised communication systems (eg SBAR) and other techniques to improve understanding within and across teams, many of which have their roots in the military and aviation industries. These are often resisted by medical practitioners, but as I sit here watching 24 Hours in A&E, it is clear that at the critical pinchpoints of communication in medical emergencies, we have started to use more structured, team approaches to communication where poor understanding can have an immediate and disastrous impact.

Whilst, as this systematic review shows, the evidence for improving communication skills in senior postgraduate trainees and consultants may be lacking in standardisation and validation, the outcomes of poor communication are often plain to see.

There is undoubtedly a paucity of training around communication skills in the higher grades of training, and, just because there is an absence of evidence, we should not take this as evidence of an absence of benefit of paying attention to what is one of the core activities we all engage in every day.

 

 

I am curious… are you worth your salt?

7 Jan, 15 | by Toby Hillman

Photo by SoraZG on Flickr via Wikimedia Commons

Clinical curiosity is a key trait amongst learners, and in clinical practice, curiosity is necessary to reach a diagnosis of even the most simple nature, but particularly so to diagnose cases that do not readily fit the heuristics that one brings to bear in everyday clinical work.

However, clinical curiosity can be suppressed by the requirement to learn a huge volume of ‘facts’, the time pressures of work, and a culture where certainty is admired and rewarded, and uncertainty often frowned upon.  Indeed, being able to see a case from many different perspectives, rather than picking a line and sticking to it, can be very inefficient; but curiosity is vital to ensure that the first diagnosis for any given presentation isn’t adhered to just because it was the first one made, but is constantly re-evaluated, and tested against new information as it is acquired.

These ruminations on the subject of curiosity were prompted by a chat with a colleague about her often random questions to me on a diverse range of medical subjects.  Her contention was that curiosity should be the driving force in clinical medicine, to avoid clinicians becoming protocol-driven drones.

A recent paper in the PMJ also got me wondering a little bit about curiosity – wondering if in fact we have lost a bit of this wonderful character trait in medicine, and left ourselves satisfied all too easily by diagnoses and treatments that seem right, but don’t quite cut the mustard.

The paper reports a retrospective observational study across three Trusts in London, examining the investigation and management of hyponatraemia in all patients in whom the condition was identified.  Laboratory data were monitored to identify cases, and once 100 cases had been identified, the study stopped. The seriousness of hyponatraemia was highlighted by an inpatient mortality rate of 16% (I hasten to point out that there is no claim of causation), and 9% of the patients required ITU admission.

However, what was the response of medical teams at the three centres? Well, it could be described as a little disappointing – with a diagnosis recorded in the notes of only 42% of patients.  And these weren’t just low sodiums one might explain away; to be included in the study, the serum sodium had to be ≤128 mmol/L.

What was actually done for the patients?  To fully evaluate a patient with hyponatraemia and reach a rational diagnosis, and hence a management plan, the authors considered that a full work-up comprised: volume status, paired serum and urine osmolalities, urinary sodium, thyroid function tests, and cortisol.  A complete work-up was performed in just 18% of patients across the three centres.
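To make that completeness figure concrete, here is a minimal, illustrative sketch in Python (written for this post, not drawn from the paper; the field names are hypothetical) of how one might flag whether a patient’s hyponatraemia work-up contains all five elements listed above:

# Illustrative sketch only: checks whether a hyponatraemia work-up is 'complete'
# in the sense used above (all five elements documented).
# Field names are hypothetical, not taken from the paper.

REQUIRED_WORKUP = {
    "volume_status",
    "paired_serum_and_urine_osmolalities",
    "urinary_sodium",
    "thyroid_function_tests",
    "cortisol",
}

def missing_workup_items(documented):
    """Return the elements of the full work-up that have not been documented."""
    return REQUIRED_WORKUP - set(documented)

def workup_is_complete(documented):
    return not missing_workup_items(documented)

# Example: a patient with only a urinary sodium and thyroid function tests recorded
example = {"urinary_sodium", "thyroid_function_tests"}
print(workup_is_complete(example))            # False
print(sorted(missing_workup_items(example)))  # the three missing elements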

And the management – even if a diagnosis wasn’t achieved, what was actually done?

37% of patients did not have any specific therapy at all, and those who were treated predominantly received isotonic saline.  Cessation of potentially causative drugs was the next most utilised therapy, followed by fluid restriction to various degrees.

Treatment failure was recorded in 15% of those treated with isotonic saline and in 80% of patients undergoing fluid restriction, and 63% of patients were discharged with persisting hyponatraemia – and as the authors indicate, this is perhaps not surprising given the lack of diagnosis and treatment in many cases.

So what is going on?  The most common electrolyte disturbance seen in hospitalised patients is easily diagnosed (try being admitted to hospital without having a U&E sent…) and yet is poorly investigated, diagnosed, and treated.  Is this a reflection of a lack of guidelines, education and therapeutic options as the authors suggest?

I would point out that a simple internet search on any smartphone or computer for ‘hyponatraemia algorithm’ will generate a few options of how to assess and manage patients with hyponatraemia – so availability of guidance wouldn’t necessarily be a major barrier.  However, I agree that there is perhaps not quite enough education on clinical chemistry in the medical curriculum.

But perhaps it is down to a diminution of the curiosity of clinicians – whether a result of the way we educate and train, or of the efficiency we expect of our doctors – which curbs the desire to seek the truth in complex cases, and leads to satisfaction with first-pass diagnoses rather than cradling diagnostic uncertainty and going through the full work-up that our patients need to manage their conditions.

 

 

 

The great game…

10 Dec, 14 | by Toby Hillman

The great game… Image via wikimedia commons. CC 2.0

The PMJ editors met recently, and it was a pleasure to meet up with a range of engaged, eloquent, educated and motivated individuals who all share a passion for Postgraduate Medical Education.  It was therefore a little bit of a surprise when a reference to an article on the gamification of medical education proved to be a little contentious.

My colleagues thought that gamification was not necessarily a ‘thing’ and that for the PMJ to publish a paper with such a term in the title might be a bit wayward.  However, fears were allayed by the fact that I had heard of gamification, and in fact it is a technique in learning that has been in recognised use in other fields for really quite some time.  There is an excellent “Do Lecture” from the 2009 lecture series on the subject, and within patient education, there is quite an industry dedicated to themed ‘games’ – from nutrition to disease management for example – from Channel 4 and from SurgerySquad.

Other than the lecture above, I also heard about ‘gamification’ of learning at a Society of Acute Medicine conference where a team from the Netherlands presented their simulation game – ABCDESim.  This is a serious game that allows players to gain skills and learning around resuscitation of the acutely unwell patient.

So there are real ‘games’ and their use in education has been examined in the educational literature – highlighting the engagement with subject matter that can be achieved through games, even if the longer term benefits of gaming within education are not fully defined.

The paper that raised an eyebrow analyses the effect not so much of a ‘game’ as of the application of the principles of gamification – namely:

1) voluntary participation
2) explicit rules of competition for each user
3) immediate feedback on performance
4) participation in competing teams
5) the ability to improve in terms of rank (eg being awarded a badge or prize for specified achievements)

The game was really a bank of MCQs that addressed core knowledge expected of the residents on an internal medicine residency programme. The ‘play’ element was in the competition associated with answering the questions and comparing oneself, or one’s team, to the performance of others, the ability to see real-time positions on a leaderboard, and the chance to earn badges for good performance and for answering certain numbers of questions.
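For readers who like to see the mechanics spelled out, here is a minimal sketch of that points, leaderboard and badges machinery, written in Python purely for illustration; the scoring rule, thresholds and badge names are invented for this example and are not taken from the study:

# Illustrative sketch of the gamification mechanics described above:
# count questions answered, rank players on a leaderboard by correct answers,
# and award badges at arbitrary (invented) thresholds.

from collections import defaultdict

BADGE_THRESHOLDS = {50: "Bronze", 150: "Silver", 400: "Gold"}  # questions answered

class QuizGame:
    def __init__(self):
        self.answered = defaultdict(int)   # total questions answered per player
        self.correct = defaultdict(int)    # correct answers per player

    def record_answer(self, player, is_correct):
        self.answered[player] += 1
        self.correct[player] += 1 if is_correct else 0

    def leaderboard(self):
        """Players ranked by number of correct answers, best first."""
        return sorted(self.correct.items(), key=lambda kv: kv[1], reverse=True)

    def badges(self, player):
        """Badges earned for total questions answered, regardless of accuracy."""
        return [name for threshold, name in sorted(BADGE_THRESHOLDS.items())
                if self.answered[player] >= threshold]

game = QuizGame()
game.record_answer("resident_a", True)
game.record_answer("resident_b", False)
print(game.leaderboard())         # [('resident_a', 1), ('resident_b', 0)]
print(game.badges("resident_a"))  # [] (no badge until 50 questions answered)

The real system described in the paper no doubt did more (personalised leaderboards, scheduled state-of-play emails), but the core loop of answer, score, rank and reward is as simple as this.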

The researchers found that residents did engage well with the game, and were often found to be answering questions in their own time; that some of the techniques employed to maintain motivation were well founded (eg regular state-of-play emails, personalised leaderboards highlighting potential ‘opponents’ that could be overtaken with a few more questions, and the earning of badges for good performance); and that there were qualitative and quantitative benefits – particularly with regard to retention of knowledge over time.

So it seems that millennials are open to the gamification of education.  And perhaps millennials are going to be the first generation whose minds have been changed by the internet.  Research from Columbia University in 2011 indicated that there could be a preference to recall where to find information, rather than actually retain the factual content.  This combination presents medical educators with an intriguing challenge – our younger colleagues are happy to engage with technology in novel ways to improve their education, but that very engagement with technology might be eroding what have been seen as key attributes of effective clinicians in the past.

However, how new are these features really?  The gamification of medical knowledge is hardly new.  Although the rules weren’t exactly software-derived, and universally applied, I can still recall my housemates jousting with medical facts as we approached finals – indeed, the only reason I recall the fluorescence of amyloid being apple green after staining with Congo red is down to a housemate trying to ‘psych out the opposition’ on the morning of a medical finals paper.  The stimulus to learning that such ‘games’ provided probably contributed to my success, and to a certain extent still does.  An older example is the teaching ward round, when the consultant questions students in turn to tease out facts in ever-increasing detail – ultimately reaching the registrar, who traditionally answers with aplomb.

And the other feature of millennial learning – the ability to find knowledge, rather than retain or analyse it?  As we are now deep into Advent, it is perhaps appropriate to turn to the motto of the King William’s College Christmas Quiz:

Scire ubi aliquid invenire possis, ea demum maxima pars eruditionis est

“To know where you can find anything is, after all, the greatest part of erudition”

So the features of learning elicited in this study are certainly worth noting, and employing them to maintain interest, and enhance postgraduate education for the emerging generation of clinicians is important, but we shouldn’t be fooled that learning itself, or the competitive nature of learners has changed too much – history teaches us that medics have always been competitive, and that when it comes to knowledge seeking – our forefathers already knew that knowing everything wasn’t always the be all and end all – but knowing where to find out was almost as important.
