An important aspect of my job as a clinical director was acting as a buffer between the ever-increasing number of new policy initiatives gathering in the system at large and the clinical staff I managed. I took the view that front line staff do work that is demanding—intellectually and emotionally—and need to be distracted from it as little as possible. Of course, there were new initiatives that I felt would directly or indirectly improve patient care, and these we discussed and implemented as best we could; but the majority were peripheral to the task in hand and, despite their tone of urgency, many would fade away if I ignored them for long enough. Indeed, many of them originated from different parts of the system and directly contradicted each other—for example, the attempt to dictate the percentage of time staff spent in direct contact with patients and the ever-increasing expectation to attend new mandatory trainings and other meetings.
Unfortunately, as control-freakery and micromanagement became ever more prevalent, this stance became harder to sustain. A "must do" culture took over. People appointed to implement a policy behaved as if they were carrying out an order, desperate to tick the box to show they had succeeded. They appeared scathing about the need for discussion, for thinking through unwanted consequences, for exploring the applicability of the policy to the particular locality or specialty, and oblivious to the need to win the argument and inspire ownership of the new policy in the teams who would have to make it work. The pressure on me—a highly paid, experienced clinical manager—to succumb, even when a matter of conscience was at issue, was enormous.
Stanley Milgram is famous for a series of experiments that explored our relationship to authority. In these, the participants—ordinary people—were told that the experimenters were investigating the effects of punishment on learning. They were instructed to apply increasingly powerful electric shocks, rising to 450 volts, to apparent "students" each time the students failed to learn a task. Despite witnessing the physical distress caused by the shocks (in fact feigned by the "students" but believed to be real by the participants), most participants continued to do as they were told when sternly instructed to go on, even after questioning the experimenter and, in some cases, becoming upset and protesting. The observed distress escalated from the apparent victims banging on the walls, to complaints about a heart condition, and eventually to complete collapse.
Variations on the experiment have been carried out in many different countries and cultures, with the percentage of participants prepared to inflict fatal voltages remaining remarkably constant at 61-66%, according to a meta-analysis (Blass 2000). In general, compliance decreased as the victim's physical immediacy increased. Compliance also decreased when the authority's physical immediacy decreased, for example when instructions were given over the telephone. In 2009, a version of the experiment was repeated as part of a television documentary entitled "How violent are you?" (Horizon, BBC2). Of the 12 participants, only three refused to continue to the end of the experiment.
It seems that we are much more likely to succumb to malignant authority than we like to think. This makes a "must do" culture, in which we become habituated to carrying out orders unthinkingly, very dangerous. Perhaps of most relevance to those of us working in the NHS, compliance was highest in experiments where the task of administering the shocks was divided up and the participants presumably felt they were only small cogs in the system.
Penny Campling is a psychiatrist and psychotherapist and was a clinical director for many years. She has recently co-written a book entitled “Intelligent Kindness: reforming the culture of healthcare.”