The medical world is abuzz with the potential of new technologies, but to translate this into better healthcare, medical training has to keep pace, says Zack Hassan
History often shows us the importance of adopting new developments in healthcare. In the 19th century, for example, surgeons who did not use the new antiseptic techniques had higher rates of infection than those who did. Today, the newest medical tools are coming from computer science, but are junior doctors being adequately prepared for this next technological leap?
We are only beginning to exploit the potential applications of automation, machine learning, and big data in medicine. Yet few doctors have any programming skills, and many workplaces are still waiting to upgrade to Windows 10, let alone to adopt electronic notes or prescribing. I’ve been fascinated by how recent issues of The BMJ have covered social media healthcare influencers, developments with the health secretary’s favoured online consultation app GP at Hand, and the use of machine learning to detect adverse events from vaccination. However, I am continually struck by how these articles all suggest that the medical mainstream doesn’t yet fully understand the implications of today’s technological landscape or know what to do with the new capabilities it grants us.
This presents a strategic problem. Doctors have a much bigger part to play in guiding technological development and in pushing new systems to become more user friendly, patient centred, and efficient for frontline healthcare delivery. As things stand, however, much of this work is being left to the commercial and political forces that shape the NHS procurement process. Many doctors will understand the dangers of this in light of the Babylon saga. I worry that the medical profession, and patients, will be at the mercy of these forces until a greater number of doctors can combine their clinical acumen with advanced software and programming skills.
Is it really for doctors to tackle this? While some may feel that a doctor’s remit should not include such skills, there are good reasons why it should. As doctors become more senior, they are expected to engage more closely with the evidence base behind the treatments and interventions they use, contributing to it and shaping its development so as to better understand the context in which we care for our patients. This understanding is also vital to our quality improvement activities. How will my generation do this effectively if we lack the skills to understand research generated by machine learning? Should we not be able to interrogate the large datasets we have already collected, to judge whether a particular research question is worth pursuing?
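To make that concrete, here is a minimal sketch of the kind of first-pass scoping a clinician with basic Python could run. The file name, column names, and age bands are hypothetical, invented for illustration rather than taken from any real system:

```python
import pandas as pd

# Hypothetical export from an electronic notes system: one row per admission.
# "admissions_export.csv" and its columns (age, readmitted_30d) are invented
# for illustration; any real dataset would need information governance approval.
admissions = pd.read_csv("admissions_export.csv")

# First questions when scoping a study of 30 day readmissions:
# how much data is there, and is the event common enough to analyse?
print(admissions.shape)
print(admissions["readmitted_30d"].mean())

# Crude stratification by age band, to see whether an association
# looks plausible before committing time to a formal project.
admissions["age_band"] = pd.cut(admissions["age"], bins=[0, 40, 65, 80, 120])
print(admissions.groupby("age_band", observed=True)["readmitted_30d"].mean())
```

An afternoon with a script like this can tell a team whether a question merits a formal protocol, long before a statistician or data analyst is involved.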
Additionally, specialists trying to improve their unit’s outcomes could learn from how other sectors use big data. For example, the US retailer Target received press coverage in 2012 for predicting when customers had become pregnant from data on their shopping habits. Whereas Target used this information to try to sell baby cribs, within healthcare similar techniques could be used to diagnose colon cancer earlier, potentially improving five-year survival rates, one area where the UK lags behind the European average. It is difficult to see who would be better placed to implement such initiatives than doctors themselves.
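As an illustration of the underlying technique, the sketch below trains a simple logistic regression classifier on synthetic data. Everything here is invented: the features, labels, and task are stand-ins, and any real clinical model would need curated, governed data and proper validation before it went anywhere near a patient:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for routinely collected features (for example age,
# haemoglobin, number of GP attendances). No real data is used here.
rng = np.random.default_rng(0)
n = 5_000
X = rng.normal(size=(n, 3))
true_risk = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
y = rng.random(n) < true_risk  # synthetic "warrants earlier investigation" label

# Fit on one portion of the data and check discrimination on held-out cases:
# the basic sanity check behind any risk prediction model.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("Held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The point is not the model, which is deliberately trivial, but that the whole workflow from data to a validated risk score sits well within reach of a doctor with modest programming training.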
How do we get started? In reflecting on the digital challenges we face, it seems to me that our system of medical education and training could be doing more to prepare us. We could start by offering medics protected time throughout their careers to pursue the many unstructured opportunities already available. At medical school, student-selected components and intercalated degrees could be offered in applied computer science. At postgraduate level, doctors could broaden their portfolios to include programming skills in the same way that we already do for education, leadership, and academia. Being a clinician-programmer could bring career advantages, making it easier for doctors to meet existing training requirements, such as quality improvement projects, on a larger scale. Consultants and medical directors could receive assistance or training to apply digital tools to their own projects of interest.
The medical profession seems to be lagging behind other industries in its digital knowhow. At a time when budgets are stretched ever tighter, this is to our detriment: considerable time and money could be saved by exploiting big data, automation, and machine learning. Medical education and training in the 21st century needs to promote higher levels of digital literacy to keep pace with these trends. After all, it is the medical profession that is best placed to recognise the potential applications of new technology in medicine and to decide how they should be implemented.
Zack Hassan is an FY1 doctor currently working in colorectal surgery in Edinburgh’s Western General Hospital.
Twitter: @MontereyZack
Competing interests: None declared