This week’s blog is written by Gerry Bolger (@DigitalGerry). Gerry is both a nurse and an independent digital clinical safety officer, working with the NHS and with health and social care manufacturers to ensure products meet NHS digital clinical safety requirements.
Artificial Intelligence (AI) is no longer a futuristic concept – it is rapidly reshaping healthcare delivery across the UK. From enhancing diagnostic accuracy to optimising hospital workflows, AI offers a wide range of benefits. However, its implementation also brings unique challenges, particularly around digital clinical safety. This blog outlines the opportunities AI presents, then focuses on the responsibilities of healthcare staff, especially in identifying and responding to anomalies.
Benefits and Opportunities of AI in UK Healthcare
AI is already demonstrating tangible value in several areas:
- Improved diagnostics: AI tools are helping clinicians diagnose conditions faster and with greater accuracy. For example, researchers at the University of Cambridge developed an AI tool that matches pathologist-level accuracy in diagnosing coeliac disease by analysing biopsy images, significantly speeding up the diagnostic process (The Guardian, 2025a).
- Personalised treatments: AI is aiding the tailoring of treatment plans. A 2025 study showed that AI could predict which men with high-risk, non-metastatic prostate cancer would benefit from abiraterone, reducing unnecessary exposure to side effects (The Guardian, 2025b).
- Operational efficiency: A compelling example of AI improving operational efficiency is the AI-powered scheduling and rostering system implemented by NHS Greater Glasgow and Clyde. Developed in collaboration with Microsoft and NHS Scotland, the system uses AI to align staff rotas more closely with patient demand, significantly reducing agency staff costs and improving the reliability of shift coverage (NHS Scotland, 2023). This real-world deployment shows how AI can streamline workforce management, delivering both financial and care delivery benefits.
- Reducing administrative burden: AI-enabled ambient scribing tools are starting to alleviate documentation burdens for clinicians, giving them more time for direct patient care (NHS England, 2024).
These innovations contribute to better patient outcomes, more efficient care, and potential cost savings. Yet, the widespread adoption of AI requires careful oversight to ensure that these technologies do not compromise safety.
Challenges and responsibilities for healthcare staff
While the benefits are compelling, AI also introduces significant challenges, particularly around clinical safety, ethical considerations, and regulatory compliance. One of the most critical areas is the detection and management of anomalies in AI behaviour.
Anomalies and unexpected outcomes
AI tools, particularly those using machine learning, are not always transparent in their decision-making. This can result in:
- Unintended biases in clinical decision support
- Errors in pattern recognition (e.g., false positives/negatives)
- Data drift leading to unreliable outputs over time
Healthcare staff are often the first to detect these issues in real-world settings. It is essential that they understand how to identify, report and respond to these anomalies.
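To make the idea of data drift concrete: one common statistical check used by technical teams monitoring a deployed model is the Population Stability Index (PSI), which compares the distribution of an input feature at training time with its live distribution. The sketch below is illustrative only – the function name and the 0.2 alert threshold are general industry conventions, not part of any NHS standard or a specific vendor's monitoring tool:

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare a feature's training-time distribution (expected) with its
    live distribution (actual). A PSI above ~0.2 is a common rule of thumb
    for drift worth investigating."""
    # Fix the bin edges from the training-time distribution
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_counts, _ = np.histogram(expected, bins=edges)
    act_counts, _ = np.histogram(actual, bins=edges)
    # Convert counts to proportions, clipping to avoid division by zero
    exp_pct = np.clip(exp_counts / len(expected), 1e-6, None)
    act_pct = np.clip(act_counts / len(actual), 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))
```

A check like this runs automatically in the background; the point for clinical staff is that it only catches statistical shifts – the clinically meaningful anomalies described above still depend on human vigilance and reporting.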
What to do if you spot an anomaly
If you notice unexpected or erroneous behaviour from an AI system, the following steps should be taken:
- Document and report: Record the anomaly in detail and report it through your organisation’s safety reporting system.
- Contact your Clinical Safety Officer (CSO): The CSO is trained in clinical risk assessment and will evaluate the issue, initiate a risk review, and decide on any immediate actions required.
- Collaborate with technical teams: Your input is vital to help developers understand the context and implications of the anomaly.
- Monitor patient safety: Ensure that any decisions influenced by AI are reviewed, especially if an anomaly may have affected care.
- Review and update: Help update internal procedures and safety documentation to reflect the learnings from the incident.
The Role of Regulation: AMLAS, DCB0129 and DCB0160
To support AI safety in healthcare, the UK has established a framework of standards and guidance:
- AMLAS (Assurance of Machine Learning in Autonomous Systems) provides a structured process for justifying the safety of AI. NHS England has issued healthcare-specific supplementary guidance to adapt AMLAS to clinical environments (NHS Digital, 2022).
- DCB0129 and DCB0160 are mandatory standards under the Health and Social Care Act 2012. DCB0129 applies to system manufacturers, while DCB0160 governs how healthcare providers implement and manage these systems safely (NHS Digital, 2024).
- Clinical Safety Officers (CSOs) are registered clinicians responsible for overseeing adherence to these standards. They play a key role in investigating anomalies and ensuring ongoing compliance.
Building a culture of digital clinical safety
Effective AI integration requires a strong safety culture:
- Training and awareness: All staff should be trained in recognising and managing digital clinical safety risks, including anomaly detection.
- Leadership support: Clinical leaders must prioritise safety and invest in robust governance structures.
- Continuous monitoring: Healthcare organisations should implement systems for monitoring AI performance post-deployment.
Conclusion
AI has enormous potential to enhance healthcare in the UK, but it is not without risk. Clinical staff are at the frontline of these changes and play a vital role in safeguarding patient safety. By remaining vigilant, reporting anomalies, and collaborating with safety officers and developers, healthcare workers can ensure that AI is used safely, ethically, and effectively.
Author
Gerry Bolger, RN MHM
Gerry is both a nurse and an independent digital clinical safety officer, working with the NHS and with health and social care manufacturers to ensure products meet NHS digital clinical safety requirements.
References
NHS England (2024) Guidance on the use of AI-enabled ambient scribing products in health and care settings. Available at: https://www.england.nhs.uk/long-read/guidance-on-the-use-of-ai-enabled-ambient-scribing-products-in-health-and-care-settings/
NHS Digital (2022) Healthcare Supplementary Guidance for AMLAS. Available at: https://digital.nhs.uk/services/clinical-safety/documentation/healthcare-supplementary-guidance-for-amlas
NHS Digital (2024) Clinical Risk Management Standards: DCB0129 and DCB0160. Available at: https://digital.nhs.uk/services/clinical-safety/clinical-risk-management-standards
The Guardian (2025a) Researchers develop AI tool that could speed up coeliac disease diagnosis. Available at: https://www.theguardian.com/science/2025/mar/27/coeliac-disease-diagnosis-ai-tool
The Guardian (2025b) New AI test can predict which men will benefit from prostate cancer drug. Available at: https://www.theguardian.com/society/2025/may/30/new-ai-test-can-predict-which-men-will-benefit-from-prostate-cancer-drug
NIHR (2023) New report on 10 promising AI interventions for healthcare. Available at: https://www.nihr.ac.uk/news/new-report-on-10-promising-ai-interventions-for-healthcare/34020
NHS Scotland (2023) AI-enhanced staff rostering improves efficiency at NHS Greater Glasgow and Clyde. Available at: https://www.gov.scot/publications/artificial-intelligence-in-scotland—case-studies/pages/nhs-rota/