
It’s good to talk…

28 Jan, 15 | by Toby Hillman

Image by Uberprutser via Wikimedia Commons

When I think about my work on the acute medical unit, or in my clinics, the number of interactions I have with other humans is almost mind-boggling – trainees, consultant colleagues, radiographers, radiologists, professionals from other hospitals, biochemists, nurses, physios, therapists, and of course – patients.  As Atul Gawande points out in this splendid article, medicine is now more about pit crews than cowboys, and this level of teamworking brings an inherent babble of communication.

The central point of all of this communication is to provide a service to patients – alleviating symptoms, diagnosing and curing disease, or helping patients to manage long term conditions. It would be incredibly difficult to do any of these core activities in healthcare without communicating effectively with patients.

A paper in the current issue of the PMJ reviews the literature relating to the assessment of communication skills within more senior postgraduate trainees (within two years of completion of training) and those who have already become established in practice.

The paper synthesises the evidence on assessment of communication skills, and draws the rather disappointing conclusion that currently there is little in the evidence to demonstrate benefit from educational initiatives, that there is no definitive, validated tool to evaluate communication skills, and that there is no defined standard of what constitutes good communication in the senior postgraduate, or consultant workforce.

The conclusion is disappointing from my point of view, as I consider communication to be such an important part of my day job; but when I think back to my own training, is really not all that surprising.

In my higher training I cannot think of one training session that used any of the methods reported in this paper to evaluate my communication skills.  However, if the evidence is so heterogenous, and there is no clear basis on which to build educational efforts to improve communication skills in senior clinicians, is there any indication that such training is even required?

If we stick to the published evidence on this front, a mixed picture emerges again, with two of the referenced papers indicating that communication skills increase with increasing experience, whilst two others showed that communication skills worsen with increasing time in postgraduate training.

But if we go outside the published evidence on communication assessments, and look more at the outcomes of healthcare, we see that deficiencies of communication play a major role in almost all categories of incident that resulted in death or permanent loss of function investigated by the Joint Commission (an accreditation body in the US). The Joint Commission estimates that breakdowns or errors in communication contributed to over 50% of post-operative complications, around 2/3 of wrong-patient/wrong-site/wrong-procedure events, and 70% of medication error events.

These events are not the well controlled OSCE-style scenarios that are traditionally used to evaluate one-on-one communication skills, but real-life incidents that will have involved all of the complexity of current healthcare provision. Communication in these areas includes so much more than the areas traditionally concentrated on in training programmes.

Email, pager, telephone, written notes, electronic health records, post-it notes – all of these forms of communication are used in real life, and perhaps the reason for the heterogeneity of evidence about what makes good communication, and the lack of a clear path to improved communication skills, is that we aren’t really looking at all the right areas of communication.  Whilst using appropriate non-lexical utterances, empathetic questioning and establishing rapport with patients are very important, we perhaps also need to pay attention to the wider aspects of communication if we are to improve outcomes and reduce the number of events where poor communication underpins the error.

There are some recommendations out there about closed-loop communication techniques, standardised communication systems (eg SBAR) and other techniques to improve understanding within and across teams, many of which have their roots in the military and aviation industries. These are often resisted by medical practitioners, but as I sit here watching 24 Hours in A&E, it is clear that at the critical pinch points of communication in medical emergencies we have started to use more structured, team approaches to communication – in situations where poor understanding can have an immediate and disastrous impact.

Whilst, as this systematic review shows, the evidence for improving communication skills in senior postgraduate trainees and consultants may be lacking in standardisation and validation, the outcomes of poor communication are often plain to see.

There is undoubtedly a paucity of training around communication skills in the higher grades of training, and just because there is an absence of evidence, we should not take this as evidence of an absence of benefit in paying attention to what is one of the core activities we all engage in every day.



I am curious… are you worth your salt?

7 Jan, 15 | by Toby Hillman

Photo by SoraZG on Flickr via Wikimedia Commons

Clinical curiosity is a key trait amongst learners, and in clinical practice, curiosity is necessary to reach a diagnosis of even the most simple nature, but particularly so to diagnose cases that do not readily fit the heuristics that one brings to bear in everyday clinical work.

However, clinical curiosity can be suppressed by the requirement to learn a huge volume of ‘facts’, the time pressures of work, and a culture where certainty is admired and rewarded, and uncertainty often frowned upon.  Indeed, being able to see a case from many different perspectives, rather than picking a line and sticking to it, can be very inefficient, but curiosity is vital to ensure that the first diagnosis for any given presentation isn’t adhered to just because it was the first one made, but is constantly re-evaluated and tested against new information as it is acquired.

These ruminations on the subject of curiosity were prompted by a chat with a colleague about her often random questions to me about a diverse range of medical subjects.  Her contention to a colleague was that curiosity should be the driving force in clinical medicine, to avoid clinicians becoming protocol-driven drones.

A recent paper in the PMJ also got me wondering a little bit about curiosity, and wondering whether in fact we have lost a bit of this wonderful character trait in medicine, and left ourselves satisfied all too easily by diagnoses and treatments that seem right, but don’t quite cut the mustard.

The paper reports a retrospective observational study across three Trusts in London, examining the investigation and management of hyponatraemia in all patients in whom the condition was identified.  Laboratory data were monitored to identify cases, and once 100 cases were identified, the study stopped. The seriousness of hyponatraemia was highlighted by an inpatient mortality rate of 16% (I hasten to point out that there is no claim of causation), and 9% of the patients required ITU admission.

However, what was the response of medical teams at the three centres? Well, it could be described as a little disappointing – with a diagnosis recorded in the notes of only 42% of patients.  And these weren’t just low sodiums one might explain away; to be included in the study, the serum sodium had to be ≤128 mmol/L.

What was actually done for the patients?  To fully evaluate a patient with hyponatraemia and reach a rational diagnosis, and hence management plan, the authors considered that a full work-up required: assessment of volume status, paired serum and urine osmolalities, urinary sodium, thyroid function tests, and cortisol.  A complete work-up was performed in just 18% of patients across the three centres.

And the management – even if a diagnosis wasn’t achieved, what was actually done?

37% of patients did not receive any specific therapy at all; of those who did, most received isotonic saline.  Cessation of potentially causative drugs was the next most utilised therapy, followed by fluid restriction to various degrees.

Treatment failure was recorded in 15% of those treated with isotonic saline and 80% of patients undergoing fluid restriction, and 63% of patients were discharged with persisting hyponatraemia – as the authors indicate, this is perhaps not surprising given the lack of diagnosis and treatment in many cases.

So what is going on?  The most common electrolyte disturbance seen in hospitalised patients is easily diagnosed (try being admitted to hospital without having a U&E sent…) and yet is poorly investigated, diagnosed, and treated.  Is this a reflection of a lack of guidelines, education and therapeutic options as the authors suggest?

I would point out that a simple internet search on any smartphone or computer for ‘hyponatraemia algorithm’ will generate a few options of how to assess and manage patients with hyponatraemia – so availability of guidance wouldn’t necessarily be a major barrier.  However, I agree that there is perhaps not quite enough education on clinical chemistry in the medical curriculum.

But perhaps it is down to a diminution of clinicians’ curiosity – be that a result of the way we educate and train, or of the efficiency we expect of our doctors – curbing the desire to seek the truth in complex cases, and leading to satisfaction with first-pass diagnoses rather than cradling diagnostic uncertainty and going through the full work-up that our patients need to manage their conditions.




The great game…

10 Dec, 14 | by Toby Hillman

The great game… Image via wikimedia commons. CC 2.0

The PMJ editors met recently, and it was a pleasure to meet up with a range of engaged, eloquent, educated and motivated individuals who all share a passion for Postgraduate Medical Education.  It was therefore a little bit of a surprise when a reference to an article on the gamification of medical education proved to be a little contentious.

My colleagues thought that gamification was not necessarily a ‘thing’ and that for the PMJ to publish a paper with such a term in the title might be a bit wayward.  However, fears were allayed by the fact that I had heard of gamification, and in fact it is a technique in learning that has been in recognised use in other fields for really quite some time.  There is an excellent “Do Lecture” from the 2009 lecture series on the subject, and within patient education, there is quite an industry dedicated to themed ‘games’ – from nutrition to disease management for example – from Channel 4 and from SurgerySquad.

Other than the lecture above, I also heard about ‘gamification’ of learning at a Society of Acute Medicine conference where a team from the Netherlands presented their simulation game – ABCDESim.  This is a serious game that allows players to gain skills and learning around resuscitation of the acutely unwell patient.

So there are real ‘games’ and their use in education has been examined in the educational literature – highlighting the engagement with subject matter that can be achieved through games, even if the longer term benefits of gaming within education are not fully defined.

The paper that raised an eyebrow analyses the effect not so much of a ‘game’ as of the application of the principles of gamification – namely:
1) voluntary participation
2) explicit rules of competition for each user
3) immediate feedback on performance
4) participation in competing teams
5) the ability to improve in terms of rank (eg being awarded a badge or prize for specified achievements)

The game was really a bank of MCQs that addressed core knowledge expected of the residents on an internal medicine residency programme. The ‘play’ element of this was in the competition associated with answering the questions and comparing oneself, or one’s team, to the performance of others, the ability to see real-time positions on a leaderboard, and earning badges for good performance and for answering certain numbers of questions.

The researchers found that residents did engage well with the game, and were often found answering questions in their own time; that some of the techniques employed to maintain motivation were well founded (eg regular state-of-play emails, personalised leaderboards highlighting potential ‘opponents’ that could be overtaken with a few more questions, and the earning of badges for good performance); and that there were qualitative and quantitative benefits – particularly with regards to retention of knowledge over time.

So it seems that millennials are open to the gamification of education.  And perhaps millennials are going to be the first generation whose minds have been changed by the internet.  Research from Columbia University in 2011 indicated that there could be a preference to recall where to find information, rather than actually retain the factual content.  This combination presents medical educators with an intriguing challenge – our younger colleagues are happy to engage with technology in novel ways to improve their education, but that very engagement with technology might be eroding what have been seen as key attributes of effective clinicians in the past.

However, how new are these features really?  The gamification of medical knowledge is hardly new.  Although the rules weren’t exactly software-derived, and universally applied, I can still recall my housemates jousting with medical facts as we approached finals – indeed, the only reason I recall the fluorescence of amyloid being apple green after staining with Congo red is down to a housemate trying to ‘psych out the opposition’ on the morning of a medical finals paper.  The stimulus to learning that such ‘games’ provided probably contributed to my success, and to a certain extent still does.  An older example is the teaching ward round where the consultant questions students in turn to tease out facts in ever increasing detail – ultimately reaching the registrar, who traditionally answered with aplomb.

And the other feature of millennial learning – the ability to find knowledge, rather than retain or analyse it?  As we are now deep into advent, it is perhaps appropriate to turn to the motto of King William’s College Christmas Quiz:

Scire ubi aliquid invenire possis, ea demum maxima pars eruditionis est

“To know where you can find anything is, after all, the greatest part of erudition”

So the features of learning elicited in this study are certainly worth noting, and employing them to maintain interest and enhance postgraduate education for the emerging generation of clinicians is important. But we shouldn’t be fooled that learning itself, or the competitive nature of learners, has changed too much – history teaches us that medics have always been competitive, and that when it comes to knowledge seeking, our forefathers already knew that knowing everything wasn’t the be all and end all – knowing where to find out was almost as important.

I took the road less traveled by…

23 Nov, 14 | by Toby Hillman

I took the road less traveled by And that has made all the difference. aia.fernandez111 / CC BY-SA 2.0

Picture the scene – it’s the wee small hours, say around 0330, when the energy really ebbs on a night shift – it is still pitch black and the gentle lightening in the east is still at least a couple of hours away. You’ve been on the go since you started your shift at 2030 the night before. The last patient with chest pain has settled nicely with the gaviscon you prescribed, and you are heading back to the team office for a well deserved sit-down.

The vending machine starts calling at you from down the corridor – a bright light – like a guiding star, constant, ever present – a reassuring island in the maelstrom of a night shift. The bright colours seem to warm you as you approach, and the chocolate offers the prospect of an immediate relief from the doldrums of the night shift, a swift rush of dopamine, with just the right amount of caffeine to get the shift back on track. And anyway, calories on call don’t count, right?

A recent editorial in the PMJ sets out the argument for a greater degree of control over the context in which NHS employees make choices about the food that they eat when they are at work – and how this could have wider benefits to society as NHS workers become advocates for improved diet in their communities.

This proposal is a public health intervention on a bold scale. As Malhotra indicates in the article, effective public health measures, particularly those related to perceived choices in lifestyle, are often directed not only at educating individuals to empower them to make better choices, but at altering the context in which those choices are made. That is, moving from an obesogenic food environment to a salutogenic environment that positively encourages healthy choices.  This proposal is audacious in view of the powerful companies that have so much to lose should healthy choices start to become the norm.

Prominent libertarians often protest against public health interventions that seem to curb the choices of individuals – indeed this is central to libertarian philosophy… so how much choice does the individual above really have when it comes to what they are going to eat to get through the shift and carry on delivering care over the next few hours? And how much has this choice already been made for them? The canteen is shut, the crash bleep chains the subject to the hospital grounds, and Abel and Cole don’t do late night take out. The choices really are limited.

But is the consumption of a high sugar, high salt diet the only arena where an illusion of choice exists in medicine?

It may have been an unlucky stretch recently, but of late I have noticed a few other arenas where the medical profession might be peddling a ‘choice’ but really presenting more of a Hobson’s choice.  I have met, and heard of, patients who, having looked at the options and weighed up their beliefs and opinions on the value of a course of treatment, opted for supportive, rather than disease-specific, care – both early on in the course of a disease, and in the latter, more desperate stages.

As a result, some of these patients have appeared to be cut off from their treating teams, and left to generalists to deliver appropriate, but not expert, care.  And what have these patients done, except exercise their choice – more insistently and bravely than we do daily when faced with some of the more mundane choices of life in 21st century Western society? And so, for swimming against the current, and declining to go along with the conventional rounds of treatments and escalations to ever more invasive therapies, these patients seem somehow to be treated as if they have personally rejected the physicians making the offer, and are therefore given the cold shoulder.

But as a profession we recognise that the evidence is there that outcomes can be better with less treatment, and that the well informed often take a more conservative approach to management at the end of life.

So whilst I agree that we should support efforts to improve the ability of individuals to make sensible healthy choices about their diets – and any change in the food landscape that makes these choices less one-sided would be welcome…  We must also hold these arguments up to our profession and the ways in which we both propose courses of treatment, and how we react to the choices patients make.

We should not be found guilty of skewing these decisions through a sense of altruism that tends towards paternalism, but instead should ensure that patients have the opportunity to make truly informed choices, and after they have made them, make certain that such pastoral and medical support is available to them as would be had they chosen another option.

Uncomfortable truths.

2 Nov, 14 | by Toby Hillman

Simulation is an educational tool that is almost ubiquitous in postgraduate medical training – with diverse examples of implementation – ranging from video recording of consultations with actors, to full immersion scenarios allowing trainees to test their skills and mettle in managing medical emergencies.  Indeed, it is so established in some fields that there are contests to show off medical skills being practised under pressure to draw out lessons. SIMwars plays out at the SMACC conference each year to great fanfare.

But what if you aren’t planning to demonstrate the perfect RSI on stage or in a video for dissemination around the world? What if you are just doing your day job – how would you feel, and would it be any use, to suddenly find Resusci Annie in the side room you really need for a patient with C. difficile, and be expected to resuscitate her from a life-threatening condition?

A paper in the current issue of the PMJ looks at just this – the perception and impact of unannounced simulation events on a labour ward in Denmark.

The research team had planned to carry out 10 in situ simulations (ISS), but only managed 5 owing to workload issues in the target department.  The response rate to questionnaires before and after the ISS events was strong.  Within the questionnaire were items concerning the experience of participating in an unannounced ISS – namely the perceived unpleasantness of taking part in an unannounced ISS, and anxiety about participation in the same.

One third of the respondents reported that, even after participating in an unannounced ISS, they found the experience stressful and unpleasant; however, 75% of them reported that participating in an ISS would prepare them better for future real-life emergencies.  The corresponding numbers for non-participants were one third thinking the experience would be stressful and unpleasant, but, interestingly, only one third thinking that participating would be beneficial to them.

These results made me think about the experience of learning, and whether the experience is ever relaxing and pleasant if it is truly effective.

I can’t think of many learning environments where I have felt completely at ease that have also provided really deep learning, food for thought, or opportunities for development.  Indeed, a great number of my most profound learning experiences – those that have taught me lessons I carry with me today – have been truly unpleasant.  These rich educational experiences tend to have involved challenge: the requirement to justify a course of action, the challenge of making a decision that is later judged through clinical outcomes, or a challenge to my strongly held beliefs – requiring an exploration of opinions, morals or prejudices.

Now, not all of these experiences have been publicly unpleasant – observed by others – but all have been relatively uncomfortable in different ways.  And perhaps this is key to deep learning: it requires examination, challenge and reflection, not just sitting passively in a lecture theatre being told facts, or actions to take in a particular scenario.

So when we look at the educational interventions employed in postgraduate medical education nowadays, have we lost a little of the challenge that used to be such a prominent part of the ‘learning by humiliation’ approach? We perhaps don’t need to return to the days of Sir Lancelot Spratt.

But equally we shouldn’t shy too far away from the idea that learners require a degree of challenge, discomfort, and even unpleasantness to gain insights into how their knowledge is being put into action, and it is far better to receive that challenge within the simulated environment than to have to face those challenges in real life, without the chance to re-run if things don’t go so well.



#SoMe and #MedEd – don’t forget to head for the bed

12 Oct, 14 | by Toby Hillman


Medical education is a major concern of the Postgraduate Medical Journal.  Indeed the origins of the journal are in the need to provide medical graduates with a source of education after graduation that would keep them in touch with the goings on in the major centres of medical progress.  A paper in the current issue of the journal highlights one of the more cutting edge aspects of medical education today.  Social media is a huge, huge resource of information, interaction, teaching, debate and conversation about medicine, and medical education that is starting to influence the medical world in powerful ways.

From the social movements to improve care like NHS Change Day or the #hellomynameis campaign, or the fantastic blog with up-to-date critique of new evidence at St Emlyn’s, to the full blown, immersive experiences of a conference like SMACC, it is clear that social media are here to stay and will play an increasing role in how medical education is delivered and backed up.  The paper explores some of the debate around social education in medicine, and encourages educators not to view social media as a way to avoid engaging with students; in particular, readers are reminded that a cornerstone of medical education remains very much in the real world: the face-to-face encounter with a patient.

If you have come to this blog via a route that doesn’t involve social media of some sort, then you should definitely read and explore the resources mentioned by the authors, and those I have highlighted above. But if you are – and I think most reaching this page are – familiar with SoMe, it would be worth thinking a bit more about that very intimate time: the patient encounter.

At our hospital we have recently had our first year clinical students go through a series of first experiences – first day on the wards, first examination of a patient’s respiratory system / abdomen / cardiovascular system and so on…

One of my colleagues had to borrow my stethoscope to demonstrate examination skills that he no longer uses on a regular basis… and the familiar debate ensued about why we teach medical students inaccurate techniques to try to elicit signs that were often first described in end-stage disease, before modern medicine had really got going. Des Spence put forward this view in 2012 – and lots of heat was generated thereafter.

My feeling on the subject – especially in the context of medical education – is that the actual examination skills, while important (the dawning realisation among a group of medical students when discussing the physics of the generation of breath sounds, the pathology underlying pneumonia and pleural effusion, and hence the clinical signs just observed, always gives me a buzz), are probably secondary to all of the other learning that goes on in that encounter – the meta-learning, if you like.

The act of examining the respiratory system, for example is very intimate compared with everyday life.  ‘Just remove your shirt sir’ ‘May I see your hands?’ ‘I’m just going to feel in your neck – it may be uncomfortable’ ‘I’m just going to tap you on your chest’   Seriously? In what other setting could you do this without serious consequences?  To be able to inspire confidence, gain trust, demonstrate caring, understanding, and yet reach even a vaguely logical conclusion from the information gained from poking and prodding another human is quite something, and develops far more than just medical knowledge.

You may learn nothing from examining the chest that can’t be garnered in other ways, you may get more from an echocardiogram than you will ever get from examining endlessly for a slow rising pulse BUT the process of meeting hundreds of patients, gently asking them to bend to your touch, follow your instructions, and the very act of connecting physically with many different people from many different walks of life, I am convinced, play a huge role in instilling the values that underpin medical practice.

The knowledge that underneath all the trappings of life we are all contained within the same fragile human frame, the human connection made through physical touch, and the wild variation in styles of communication that are necessary to achieve any sort of progress in a consultation – not just turn out some well worn lines…  All of these cannot be achieved by simply filling in a request for a chest x-ray or a CT scan.

Physical examination – like textbooks of old – may seem like a relic of another age when compared with current trends in medical technology and diagnosis, but the unrecognised learning from these encounters is hugely valuable, and we should be careful to ensure that, in our rush to get technology into every aspect of medicine, we don’t ignore the greatest resource we have for learning medicine – as @SirBill would have had it, we still need to head for the bedside.


Professionalism – a team game

29 Sep, 14 | by Toby Hillman

From LibAmanda on Flickr. CC BY 2.0

Professionalism is one of those persistent themes that run through medical education, and through the comments that are passed whenever there are concerns about clinical performance – be that the perceived clock-watching engendered by the EWTD, or the failings at Mid Staffs.

Very often the term is used to highlight either a failing, or an upstanding quality in an individual and when I think of examples of high levels of professionalism, I tend to think of individuals and their reactions to particular situations, how they conduct themselves, or their dedication or discipline within the workplace – be that on the international sporting arena, at work, or in the headlines for other reasons.

What I hadn’t really considered (and perhaps this is a failing unique to me) is that professionalism is a team game.  However, I am having my eyes opened to the concept of professionalism as more of an active team game.  A paper published in the current PMJ discusses the results of an experimental series of 90 minute group discussions about professional matters in a safe environment or ‘legitimate space’ where talk of professionalism was deemed to be valid.

The paper is an exploration of the themes that the discussion groups generated over the course of 6 months, and their impact on the participants.  Key findings the authors draw out of the data are that the ‘storying’ of experiences related to professionalism within a legitimate space may help to foster professionalism within organisations; that the act of discussing the nature of professionalism can encourage the development of a form of professionalism that considers not just the individual, but the team, work, and culture of an organisation; and that simply having a group to focus on professionalism enables discussion and learning about the subject that simply isn’t possible in the normal routine of daily work.

The ideas that caught my imagination within this paper, though, were those of professionalism as a collective practice.  This may seem so obvious as to not warrant comment, but I think a little further consideration is due.  There is an interesting tension within the common definition of professionalism as listed in dictionaries, eg ‘the competence or skill expected of a professional’ – that is, the expectation of a standard of behaviour defined, or set, by a group, but expected of the individual.

This paper highlights the success of the intervention to foster a feeling that professionalism can be more than just individual actions/conduct, but is a collective venture. Professionalism is one of those ideas that when a group comes together, and discusses it intently, can glow brightly like coals in a fire, but when the individuals are taken out of that context, it falls to the background as the slightly nebulous concept that characterises a certain approach to situations, and the glow fades a little.  By bringing the implicit presence of professionalism into the legitimate space created by these groups, the concept of professionalism becomes more valid, and the trials and tribulations that everyone faces on a day to day basis can be used to learn lessons, share experiences and plan for the future.

The connections between individuals that are generated by this recognition of a professional basis to practice could well hold the key to starting to change the culture of organisations – the professionalism of one individual is a stimulant to professional behaviours and attitudes in others, and so the ripples continue (I have blogged elsewhere on positive conversations in organisations).

So what next – what will this do for me and my practice?  I don’t necessarily have the resource and time to start up a series of group events to foster professionalism within my teams, but I have been reminded of my potential impact as a role model for junior members of my team, the positive and negative ripples that I can generate, the interconnectedness of modern medical practice, and the need to sometimes bring slightly hidden concepts of professionalism to the fore – as the main subject for discussion.  With this, I hope I will be more effective in developing and fostering professionalism within my sphere of influence.

Too much medicine…

10 Sep, 14 | by Toby Hillman



A famous quote from the eminent paediatrician Sir Cyril Chantler was published in the BMJ in 1998:

“Medicine used to be simple, ineffective, and relatively safe. It is now complex, effective, and potentially dangerous.”

As medicine progresses, it is worth keeping this in mind.  The complexity of modern medicine is one of the challenges that has led to a degree of disillusionment with the Evidence Based Medicine movement – and the recent calls for renewal of the principles behind EBM from Trish Greenhalgh and colleagues highlight the importance of relating evidence to the individual being cared for, rather than to the guidelines that relate to the ‘perfect’ patient.

A paper recently published in the PMJ on the risk factors and features of non-variceal upper GI bleeding (NVGIB) in inpatients, and its relation to antithrombotic drugs, made me think again about my own practice, and probably the practice of a great many colleagues of mine up and down the country.

The paper examined cases of NVGIB at the University Hospital Crosshouse in South West Scotland.  The investigators looked at all cases of NVGIB in their hospital over a period of 12 months, to understand the risk factors associated with this condition, and in particular the role that antithrombotic drugs play.  The investigators split the patients into two groups – those developing NVGIB as inpatients, and those presenting to hospital with bleeding symptoms and signs.  The data were collected as part of an ongoing prospective examination of the epidemiology and management of upper GI bleeding.

The two groups showed some interesting differences – those developing bleeds as inpatients tended to be older, were more likely to be female, were on more antithrombotic medication (particularly non-aspirin drugs), had more cardiovascular disease, and had higher Rockall scores than those presenting to hospital with bleeding.

The authors conclude that secondary care physicians looking after the older female population that suffers with cardiovascular disease should consider more strongly the need for prophylactic anti-ulcer therapy.

This advice would seem to be borne out by the evidence, and is a practical solution.  The paper did not examine the appropriateness of the use of anti-thrombotics in the first place – it would probably be beyond the scope of an observational study such as this.

However, as I read the paper and the conclusion – that more medicine is probably where the answer to this conundrum lies, I wondered how many of these elderly ladies derived significant benefit from the additional anti-thrombotic medicines they were prescribed.  This is pure supposition, but I wonder how many were given their new drugs in response to an admitting complaint that perhaps didn’t completely justify the use of powerful, complex, dangerous medicines?

I can easily imagine a patient presenting with some atypical-sounding chest pain, with some accompanying breathlessness, who is written up for “ACS protocol” medications on admission, and spends a little time awaiting investigations to rule significant cardiac disease in or out.  After a couple of days the patient may develop their bleeding complication, and on the story goes.  The patient has probably had great protocolised medicine – risk factors assessed, symptoms noted and reacted to – but perhaps their whole situation hasn’t been weighed up.  For example, applying the “ACS protocol” to patients who don’t fit the evidence base (eg those with a suggestive history, but without ECG changes or enzyme elevation, who were excluded from the CURE trial after the first 3000 patients) may not be great evidence based medicine – yet it is often applied to patients presenting to the acute medical unit with cardiac-sounding chest pain before the full information is available, and therefore before a full estimation of benefits and harms can be made.

When we then consider that the solution to this conundrum seems to be to add in further medications to offset the harms of those potentially initiated on a less than optimal basis, I wonder if we aren’t just ending up chasing our tails.

Maybe we need to come back to Sir Cyril again, and finish off his quote:

“Medicine used to be simple, ineffective, and relatively safe. It is now complex, effective, and potentially dangerous. The mystical authority of the doctor used to be essential for practice. Now we need to be open and work in partnership with our colleagues in health care and with our patients.”

It is being open, and working in partnership with our patients, that will deliver the better results.  Let’s be honest – if a story doesn’t sound quite like a high-risk ACS then perhaps we could wait a bit for the evidence to back up our proposed management plan, and avoid over-treating, over-medicating, and harming those at highest risk of both ‘natural’ and ‘iatrogenic’ disease.







Are you safely socialised?

26 Aug, 14 | by Toby Hillman

Social Animals


Changes in role within the medical profession are times of great upheaval.  One of the most challenging is the change from being a medical student to a fully qualified doctor.  A cohort of medical students qualifies every year around June/July time, and members of this cohort take their first steps on the wards and in clinics as junior doctors each August.  Recent guidance has enforced a period of shadowing and induction for all newly qualified junior doctors in the UK before they start their first jobs – in recognition of the fact that schemes with targeted teaching and shadowing can reduce safety incidents by a significant margin.

At other points in the medical hierarchy, there tends to be less focus on the doctors changing from one grade to another.  However, at Consultant level, the stakes rise, and organisations often spend a little more time considering how to smooth the transition from trainee to fully independent practitioner.  Mentoring – a two way learning process between a senior and junior member of a team, organisation, or even healthcare system – is a concept that many organisations have identified as being beneficial to new consultants, and mentoring programmes exist in a fair number of hospitals.  Like many relationship-based exercises, there is a deal of trial and error involved, and mentoring relationships don’t always work out perfectly.  A paper in the PMJ recently examined what makes mentoring work for new consultants.

The authors interviewed new consultants and senior leaders within acute hospitals in the Yorkshire and Humber region of England and through thematic analysis, six major themes were identified.  These included the protective nature of mentoring – both protective of patients under the care of new consultants, and of the consultants themselves; the mechanics of the process of mentoring (variability in expectations, informal and multiple mentors, the importance of personality in the mentoring relationship) and the prominence of mentoring as part of professional identity.

This last point struck me, and led me to wonder about how different specialties socialise trainees, both in their approach to interpersonal relationships at consultant level, and potentially to much wider aspects of care.

Professional socialisation is a fascinating concept – it has been studied in a diverse range of professions, from the clergy to the military – and within the medical world it plays a huge role in setting the culture of different departments, and probably specialties.  One teaching hospital training scheme I know of had a throwaway line at the back of the trainee handbook that spoke volumes about the culture of the specialty: ‘remember, you are an xxxxx-ist: keep it cocky!’  This encapsulated perfectly the culture of the trainees in that particular specialty within the region, and I now recognise it as one part of what is commonly held to be the ‘Hidden Curriculum’ of medical education.

The paper examining mentoring schemes mentions that three specialties in particular may lend themselves to more natural mentoring relationships – surgical specialties in general, gastroenterology, and anaesthetics.  I wonder if the craft nature of these specialties demands closer supervision during training – where consultants are less willing to let trainees gain experience on their patients unsupervised, and therefore engage in more hands-on training, engendering close working relationships?  Or perhaps it is less high-brow than this, and the downtime between cases in these procedure- and list-based specialties offers the opportunity for trainees and seniors to develop more meaningful relationships than in other specialties, where the clinic room or set-piece ward round is the main arena of interaction – affording less opportunity for relationship-building chats and debates.

So, if certain specialties prepare new consultants better for mentoring relationships, and mentoring is thought to be a positive influence on patient and employee safety, do some specialties socialise their workforce to be unsafe, to reject a collegiate approach to work, and impair the personal development of their practitioners?  The hidden curriculum is at play in all spheres of medical life, and it pays to look around from time to time to ensure that you aren’t sleepwalking into a culture that is detrimental to the safe conduct of healthcare, but are an active participant in a culture that promotes sharing of lessons, and fosters and develops individuals as they climb the greasy pole of their medical careers.



Too much information?

15 Jul, 14 | by Toby Hillman

Information overload – by BigCBigC


Medicine is an ever changing discipline.

One field that continues to change the face of clinical practice, and throw up new challenges is that of radiology.

The body no longer hides its secrets beneath skin that requires a surgeon’s skills to open up and explore, but can be encouraged to give them up through the various modalities of imaging that have been developed over the past few decades.

I remember the importance of imaging from my days as a house officer – being instructed to go and ‘lie in the scanner until they agree to do the CT’.
Although perhaps I should reflect on how central I really was to the team if I could be spared for long periods of time essentially obstructing other people’s work…

The role of radiologist has changed over time too – from gatekeeper to service provider in the eyes of one US-based specialist.  The close working relationship I have with my radiology colleagues, and my adventures into the world of imaging with my portable ultrasound remind me on a regular basis the pivotal role imaging plays in the work I do.

But, the advances of radiology throw up new challenges…

Incidental findings are both the blessing and the curse of anyone involved in the requesting, interpretation and communication of scan findings.  The report that lands in one’s inbox with just the reassuring answer one was looking for, only to have another three or four lines highlighting a completely unforeseen abnormality, is the start of a challenging clinical problem.  The issue tends to be outwith the usual scope of the requester’s clinical practice, and therefore usually requires the involvement of another team to assist or advise on the next steps in investigation, and to bring matters to a satisfactory conclusion for patient and clinician.

For the patient, the finding can be a distressing bolt from the blue – delivered by someone who is similarly surprised – and the news tends to herald a new round of investigations, referrals and appointments.   However, incidental findings can also be a huge positive in the end – especially if the incidentaloma turns out to be something that would have progressed unchecked had it not been noticed.  Screening programmes – a notoriously difficult area of medicine – almost rely on generating incidentalomas in asymptomatic patients for this very reason.  The debates that rage around screening are well covered elsewhere, and I think I can leave you to look them out for yourself.

The paper that prompted my thinking about incidentalomas was this one on the management of adrenal incidentalomas in British district general hospitals.  The authors looked at the reports of 4028 abdominal CT scans in Northumbria, and found that management of adrenal incidentalomas was poor.  There were 75 patients with adrenal incidentalomas; only 13 were referred for specialist assessment, and sadly, of the 62 not referred, 26 were found to have inoperable metastatic malignancy.  The authors discuss the potential implications for patients who aren’t investigated, and also note that a significant proportion of the abdominal CT requests were for the staging of other malignancies.  The finding of an adrenal tumour in this context is a particular dilemma – the consequences of delaying for biochemical evaluation can be significant, as can the consequences of not identifying a functioning adenoma, or even a phaeochromocytoma.

This dilemma encapsulates one of the challenges that modern doctors have to face – how to interpret a finding, adhere to guidance that is appropriate, and yet progress the care of the patient in a timely fashion that leads to the best outcome in that individual case.   As the role of the doctor changes over time, one key aspect of the duties we have is to take the information available to us, and advocate effectively for our patients – in partnership with them.  Therefore we owe it to our patients not to shrug our shoulders, or hang our heads when we come across the unexpected ‘gift’ of an incidental finding, but instead should try to embrace the opportunity to guide our patients through the often bewildering pathways that lead to a diagnosis.

So no, our imaging colleagues aren’t giving us too much information – it is up to us to use this immense resource wisely, and then, when unexpected findings are thrown up – it is up to us to manage them appropriately for each individual patient in their own individual context.
