Alexa, does this look infected? – We need to talk about safely regulating the digitisation of healthcare, now.

By Catriona McMillan.

The sale of health technologies for personal use has boomed in the past few years. At-home access to health information, and the means to track one's own health stats, have been criticised for unnecessarily increasing pressure on NHS services and, in some cases, for risking user safety. Perhaps surprisingly, however, most of these technologies are barely covered by regulation unless they claim to be diagnostic.

Until now, NHS involvement in this trend has been relatively small-scale, but in early July the Department for Health and Social Care (DHSC) announced the roll-out of a new partnership between NHS England and Amazon's Alexa, a cloud-based, voice-controlled virtual assistant. The partnership was initiated by NHSX, a joint unit of the DHSC that exists to 'drive forward the digital transformation' of health and social care as part of the NHS's long-term plan. It has been presented as a tool for increasing access to NHS.UK content via an API: Alexa will reportedly use algorithms to search information on the NHS website in order to answer questions about the symptoms of, and treatments for, common illnesses ('how do I treat a migraine?'), in much the same vein as questions one can already ask Alexa, such as 'how do you make pancakes?'. While there is undoubted potential here for increasing the accessibility of NHS information, concerns have been raised, after the fact, about user privacy and about the need for evidence of the initiative's safety and efficacy.
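
To make the reported architecture concrete, here is a minimal sketch of the pattern the announcement implies: a voice assistant takes a transcribed question, queries a content search backend for matching NHS.UK material, and reads a short answer back. The function names, response shape and stubbed backend below are hypothetical illustrations, not the actual NHS.UK API or Amazon's Alexa skill interface.

```python
from typing import Callable

def answer_health_question(
    question: str,
    search: Callable[[str], dict],
) -> str:
    """Turn a transcribed voice question into a short spoken-style answer
    by querying a health-content search backend."""
    results = search(question).get("results", [])
    if not results:
        # Fail safe: direct the user to a clinician rather than guessing.
        return "I couldn't find NHS guidance on that. Please speak to your GP."
    # A real voice assistant would hand this string to its text-to-speech layer.
    return f"According to the NHS website: {results[0]['summary']}"

# Stubbed backend so the sketch runs without network access; in a real
# integration this would be an authenticated call to a content API.
def stub_search(query: str) -> dict:
    return {
        "results": [
            {
                "title": "Migraine",
                "summary": "most people can treat a migraine with rest, "
                           "fluids and over-the-counter painkillers.",
            }
        ]
    }

if __name__ == "__main__":
    print(answer_health_question("how do I treat a migraine?", stub_search))
```

Even in this toy form, the safety worry discussed below is visible: the spoken answer compresses a whole web page into a single retrieved summary, and its usefulness depends entirely on the user asking the right question.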

Yet these issues should have been raised before now; amid the NHS's continued 'digital transformation', vital, rigorous, open and public discussions surrounding ethics and policy have arguably been lost.

For one, concerns over data sharing and user privacy have shot to the forefront of the debate. The DHSC quickly responded by 'categorically' stating that 'no patient data is being provided to Alexa or Amazon.' Personal and health data in the UK is protected by the GDPR, but we also know that Alexa records and stores conversations unless asked to delete them. Unfortunately, data is not the only concern that has not yet been allayed.

Another concern is that there has been little evidence to support the partnership's aims: to allow more users to access information in order to take more control of their health and care at home and, therefore, to alleviate pressure on GPs and hospitals. However, it is not unlikely that its main users will be those who already have access to the internet and an Alexa, both of which require financial means and confidence in using technology (amongst other privileges). Moreover, even if we find a way to widen access to voice-activated technology, in an age when online health information has increased unnecessary GP and hospital visits, the initiative has the potential to increase, rather than alleviate, pressure on the NHS.

Third, we still require independent research to ensure that the advice given is safe. While NHS.UK content is clinically validated, and is thus probably one of the safest online sources of health and care information that one can access, to make a trite point, clinical validation does not by itself guarantee patient safety. Not only do voice-activated searches raise concerns about accessibility and resources, but it is also unclear whether Alexa can provide the depth of information and warning signs that a website can. The initiative relies on users being able to ask the right questions, and on their knowing when to seek help even when it has not been recommended.

In the wake of concerns about developments such as these, practitioners and the public alike often look to the law for protection. Here, however, we have seen the emergence of a legal grey area, if not an outright 'gap'. In the UK, healthcare technologies that provide diagnoses are regulated by the Medicines and Healthcare products Regulatory Agency (MHRA). Most health technologies for personal use, however, including fitness trackers and Alexa, do not fall under the MHRA's ambit because they do not provide a diagnosis. This area of technology thus continues to proliferate with little in the way of regulation (beyond more general frameworks such as the GDPR) built on a consideration of the above issues in context.

All of this is not to say that the digitisation of healthcare cannot be a force for good, but we must not continue to chase the 'shiny new thing' without proper medical, ethical and legal deliberation. The ethical concerns that have arisen out of this announcement highlight the urgent need for an informed and open debate about how we should regulate health technologies for personal use. While regulation is not always the answer, conversations surrounding whether, and how, to regulate are essential to safeguarding against the risks these technologies pose.

Author: Catriona McMillan

Affiliation: Senior Research Fellow in Medical Law and Ethics at the University of Edinburgh.

Social media account: @katy_mcmillan
