The uncomfortable truth about our digital reality
The recent “letter” from NHS England’s CCIO regarding Ambient Voice Technologies (AVTs) served as a wake-up call about practitioners’ responsibilities when deploying digital health technologies and AI in clinical settings[1], and it exposed an uncomfortable truth: many clinicians were unaware of the legal obligations surrounding digital tools.
In our clinical and commercial work, we see a recurring pattern — well-intentioned healthcare professionals deploying technologies without adequate understanding of clinical safety requirements, compliance frameworks, or legal responsibilities[2–4].
This represents more than a compliance gap: it is a leadership challenge that could derail the NHS’s digital future. As the NHS stands on the brink of its most ambitious digital transformation, we risk repeating these same mistakes on an unprecedented scale.
This isn’t about perfectionism or box-ticking. These standards exist because digital technologies, like pharmaceuticals, can cause harm. The thalidomide scandal led to the Medicines Act 1968[5]. Fifty years later, despite multiple examples of harm arising from the use of digital health technologies, we’ve convinced ourselves that the same rigour isn’t required. There is a widespread false perception that software is simply less risky.
Digital health’s scale, unsupervised use, and perceived infallibility[6] should prompt caution. Poorly designed IT systems have already harmed patients—from the National Breast Screening system failure affecting 122,000 women[7] to QRisk2 code mapping errors impacting 270,000 individuals[8]. That reflects only known cases; other instances, like the 2018 Docman error[9] requiring manual reviews of thousands of documents, may have caused undetected harm.
Despite this, many remain unconvinced of the risks, believing good intentions provide sufficient protection. They don’t.
Three critical transitions, one common failure
The NHS 10 Year Plan, aptly named “Fit for the Future”[10], articulates three ambitious transitions that will define the health service’s future:
- analogue to digital
- hospital to community
- sickness to prevention.
Each represents a transformative shift in how we deliver care, amplifying the risks of inadequate digital and AI governance.
Analogue to digital: As we digitise everything from patient records to clinical decision-making, we’re fundamentally altering how clinical information flows, how decisions are made, and how errors can propagate. Without proper clinical safety assurance, we risk creating digital systems that are less safe than their analogue predecessors, at far greater scale.
Hospital to community: Hospitals, whilst far from perfect, typically have dedicated IT departments and governance structures. Community settings often lack these resources, yet the 10 Year Plan expects them to adopt sophisticated AI-driven tools without commensurate support for implementation or ongoing assurance.
Sickness to prevention: This transition mirrors the shift we must make in digital governance, from reactively managing consequences to proactive, safety-by-design approaches. Prevention means getting the foundations right from the start.
The familiar tune of unfunded responsibility
As a former GP and a current local authority Public Health consultant respectively, we recognise this melody all too well. Throughout our careers, services outside secondary care have repeatedly been handed new responsibilities, such as enhanced services, quality frameworks, and complex reporting requirements, without corresponding resources.
We’re seeing this exact pattern with digital health adoption. GP practices, local authorities, and social care providers are expected to evaluate, procure, implement, and monitor sophisticated AI-enabled systems whilst managing existing pressures without expert support. It’s a recipe for both failure and harm.
Beyond tick-box compliance
Better resourcing alone will not solve this problem. We need to revisit how we approach digital health implementation, recognising that compliance is an active process requiring expertise.
Manufacturers and suppliers must be required to provide assurance against the NHS standard DCB0129. Some appear to treat clinical safety documentation as an afterthought, rushing products to market without adequate assurance. Healthcare leaders must demand evidence of clinical safety work before procurement, not after deployment.
Service providers need foundational training in digital health and AI governance. Clinical leaders cannot effectively oversee what they don’t understand, and safety officers need dedicated time and resources to upskill. We need systematic education programmes that equip all staff with the knowledge to ask the right questions, recognise inadequate assurance, and report incidents.
Better liaison mechanisms are essential between vendors and service providers. The current approach, in which manufacturers provide minimal documentation and deploying organisations are left to figure out the rest, is unsustainable and unsafe; deploying organisations carry their own clinical risk management duties under the companion standard DCB0160[3].
Learning from other industries
Other high-risk sectors have solved similar challenges. Aviation safety didn’t improve through good intentions, but through rigorous systems, mandatory training, and adequate resourcing of safety functions. Pharmaceutical regulation works because we properly fund both industry oversight and healthcare provider education about drug safety. In each case, these improvements were driven by incidents of harm.
Digital health and AI deserve the same commitment, but we should not wait for another scandal before providing it. Ambient scribes, online consultation tools, and administrative automations all have the potential for both tremendous benefit and significant harm. Treating them as low-risk consumer technologies is dangerously naive.
A call for leadership
We have a choice.
Healthcare leaders can continue the current trajectory—enthusiastically adopting digital solutions whilst inadequately governing them—and face the consequences when things go wrong. Or we can demonstrate genuine leadership by demanding the resources and expertise necessary to do this properly, whatever the setting.
We must insist on proper clinical safety documentation before signing contracts, and continue to monitor systems after deployment. We must act when we see something going wrong, and invest in training programmes that build digital health literacy for all.
Most importantly, we have to recognise that digital transformation isn’t just about technology—it’s about people, processes, and the systems that keep patients safe.
The NHS 10 Year Plan’s ambitions are admirable, but good intentions won’t prevent harm. If we’re serious about digital transformation, we must be equally serious about doing it safely. The alternative, maintaining the status quo where many of our digital tools lack basic safety assurance, isn’t acceptable.
The good news is that we have the opportunity to get this right, and to exceed our ambitions. This is only possible if we recognise that proper digital health and AI governance isn’t a luxury we can defer. It’s the foundation upon which successful transformation depends.
References
1. Clover B. Exclusive: NHSE orders trusts to halt ‘safety risk’ AI projects. Health Service Journal. Available from: https://www.hsj.co.uk/technology-and-innovation/exclusive-nhse-orders-trusts-to-halt-safety-risk-ai-projects/7039515.article [accessed Sept 14, 2025]
2. DCB0129: Clinical Risk Management: its Application in the Manufacture of Health IT Systems. NHS England Digital. Available from: https://digital.nhs.uk/data-and-information/information-standards/information-standards-and-data-collections-including-extractions/publications-and-notifications/standards-and-collections/dcb0129-clinical-risk-management-its-application-in-the-manufacture-of-health-it-systems [accessed Feb 9, 2025]
3. DCB0160: Clinical Risk Management: its Application in the Deployment and Use of Health IT Systems. NHS England Digital. Available from: https://digital.nhs.uk/data-and-information/information-standards/information-standards-and-data-collections-including-extractions/publications-and-notifications/standards-and-collections/dcb0160-clinical-risk-management-its-application-in-the-deployment-and-use-of-health-it-systems [accessed Mar 15, 2025]
4. Health and Social Care Act 2012, section 250. Statute Law Database. Available from: https://www.legislation.gov.uk/ukpga/2012/7/section/250 [accessed Feb 9, 2025]
5. Ferner RE, Aronson JK. Medicines legislation and regulation in the United Kingdom 1500-2020. Br J Clin Pharmacol 2023;89(1):80–92. doi: 10.1111/bcp.15497
6. Cross M. IT experts call for review of ‘computer is always right’. Law Society Gazette. Available from: https://www.lawgazette.co.uk/law/it-experts-call-for-review-of-computer-is-always-right/5118414.article [accessed Sept 14, 2025]
7. Independent Breast Screening Review report. GOV.UK. Available from: https://www.gov.uk/government/publications/independent-breast-screening-review-report [accessed Sept 11, 2025]
8. Heather B. QRisk2 in TPP “fixed” but up to 270,000 patients affected. Digital Health. 2016. Available from: https://www.digitalhealth.net/2016/06/qrisk2-in-tpp-fixed-but-up-to-270000-patients-affected/ [accessed Feb 10, 2025]
9. Some GP practices having to review ‘over 1,000’ records due to Docman error. Pulse Today. 2018. Available from: https://www.pulsetoday.co.uk/news/technology/some-gp-practices-having-to-review-over-1000-records-due-to-docman-error/ [accessed Sept 14, 2025]
10. 10 Year Health Plan for England: fit for the future. GOV.UK. 2025. Available from: https://www.gov.uk/government/publications/10-year-health-plan-for-england-fit-for-the-future [accessed Sept 12, 2025]
Authors
Dr. Keith Grimes is a former GP with over 20 years of experience in digital health implementation, Honorary Lecturer in AI & Digital Health at Warwick University & Bayes Business School, and founder of Curistica, a clinical AI compliance consultancy. He specialises in clinical safety and digital health & AI governance.
Dr. Youssof Oskrochi is a Public Health consultant at the London Borough of Sutton, Honorary Lecturer in Digital Health at the UCL Global Business School for Health and Head of Safety & Data Protection at Curistica. He specialises in clinical safety, data protection and digital health governance with a focus on generative AI technologies.
Declaration of interests
We have read and understood the BMJ Group policy on declaration of interests and declare the following interests: The authors both offer professional digital health and AI compliance services through Curistica, of which KG is the founder.