Recommending Privately Developed FemTech in Healthcare Part 2: Understanding Healthcare Professionals’ Responsibilities

By Anna Nelson, Maria Tzanou and Tsachi Keren-Paz

In the previous blog, we introduced the issues associated with healthcare professionals (HCPs) recommending privately-developed FemTech apps. In this second blog, we turn our attention to the regulatory considerations associated with this practice. Two distinct questions could be asked here: (1) whether HCPs should recommend these apps at all; and (2) where HCPs do recommend these apps, what their associated responsibilities are. Given the popularity of FemTech apps, and the realities of their use for healthcare purposes, for pragmatic reasons we will address the latter question.

Data Protection 

Ensuring that patient information is kept confidential and secure is an important component of medical practice – and is necessary for maintaining trust in the healthcare system. Given the privacy concerns associated with FemTech apps, one might assume that data protection law would give rise to relevant obligations. However, under the UK General Data Protection Regulation (the primary source of such data protection obligations), HCPs recommending FemTech apps to patients are not controllers: they are not deemed to ‘determine the purposes and means of the processing of personal data’, nor do they process the personal data themselves. As such, data protection law imposes no obligations upon them. Nonetheless, the UK GDPR provides useful guidance when considering what to assess in terms of the privacy and security of FemTech apps. For example, it includes several data protection principles which could aid in evaluating the risks to (individual) patients which might arise from the use of a FemTech app.

Tort Law 

A second avenue of potential liability arises from tort law, a branch of civil law focussed on wrongdoing between two private persons (as opposed to between a person and the state). Could a HCP be liable in tort for harms arising from the use of a recommended app? Three distinctions are relevant here: (1) between physical harms (relating to functionality) and privacy harms; (2) between primary and secondary liability (establishing the latter depends on finding a third party liable as a primary defendant, the defendant’s liability being accessory); and (3) between different understandings of the right analogy for the app’s controller – as the manufacturer of a defective product or as a contractor.

Negligent reliance by a HCP on the data gathered by FemTech apps to make diagnostic and treatment decisions might be a breach of a duty of care and lead to liability in negligence, given concerns about the apps’ staggering lack of accuracy. The more interesting, and contested, question is liability for a data breach by a recommended app. If the doctor recommends a defective product whose use causes a physical injury, they should be liable, notwithstanding the possible liability of the manufacturer. This is so provided that a competent doctor would have been aware of the risk, and so would either have avoided recommending the product or, at least, have warned against that risk, or against a certain way of using the product. For data breaches, however, the problem is that privacy harms are not actionable in negligence.

Though doctors are not data controllers for the purposes of data protection law, they do owe their patients obligations of confidentiality and privacy in equity and at common law. This includes an obligation to keep the information secure. If the controller is conceptualised as a contractor helping the doctor to discharge their duty to diagnose, the doctrine of non-delegable duty may be applicable, so that the controller’s data breach will be attributed to the doctor. Alternatively, where the app is recommended by the doctor as a diagnostic tool and is insecure, this may amount to a direct breach of confidence. The counter-argument would be that the information on the app is not a medical record possessed by the doctor, so a direct breach cannot be established (see Case Study A).


Case Study A

Bee’s doctor suspects that she suffers from Premenstrual Dysphoric Disorder (PMDD). Obtaining a PMDD diagnosis involves the patient keeping a diary of their symptoms for a number of months. Bee’s doctor suggests that it might be easiest for her to download a menstrual tracking app from the App Store in order to keep this diary – noting that a number of commercially available trackers have a function allowing the user to produce a ‘doctor’s report’ detailing the inputted data.

One could argue that the app is being used as a diagnostic tool by the doctor, or that it is merely a convenient means for the patient to collate data which the doctor will then use for diagnosis. Arguably, on the latter interpretation, the doctor has no professional obligations regarding the data in the app, at least not for breach of confidence.


 

Professional Regulation  

Finally, all healthcare professionals practising in the NHS are subject to professional standards – failure to conform to which can result in disciplinary action being taken by the relevant regulator (for doctors, the General Medical Council). Matters pertaining to data protection, privacy and standard of care are also of relevance in the context of professional regulation.

Doctors are required to take the same care when recommending digital health tools “as they do when issuing traditional prescriptions.” The requirement that doctors “propose…effective treatments based on the best available evidence” requires ensuring that any digital health tool “is safe, indicated, effective, and regulated so that any risks are mitigated”. Similarly, doctors must use their clinical expertise in determining “suitable” diagnostic investigations, ensuring these “meet the needs of the patient”. This raises an important question: what level of knowledge and evidence about, and understanding of, a particular app is (or should be) required before it can appropriately be recommended by a healthcare professional?

Good Medical Practice also makes clear that medical professionals have a duty to “protect patients’ personal information from improper disclosure.” Likewise, the Nursing and Midwifery Council Code emphasises the need to “respect a person’s right to privacy in all aspects of their care”. Arguably, recommending that a patient uses an app without first assessing the adequacy of its privacy policy may result in a failure to fulfil this duty to protect patient information.

Whose Responsibility: The Clinician’s or the NHS’s?

In considering how to respond to the different legal and regulatory issues identified in these blogs, the following question presents itself: does the responsibility for vetting apps recommended by individual clinicians lie with the relevant NHS body, or with the clinicians themselves? Greater clarity on this issue is needed to protect the interests of both patients and clinicians. It bears on the distinction between direct and vicarious liability, and on the proper identification of the individuals whose acts and omissions form the basis for imposing responsibility on the NHS body and who might be subject to disciplinary proceedings.

Conclusion 

The privacy policies associated with FemTech apps and wearables are often long and complex, as our review of 45 apps demonstrated. Given that healthcare services are increasingly under-resourced, careful consideration needs to be given to how to create usable and efficient mechanisms for ensuring that HCPs are appropriately informed about the data practices of specific privately-developed FemTech apps. Furthermore, HCPs’ responsibilities when recommending privately-developed FemTech apps need to be made clear, for both their own sake and the sake of their patients.


About the Authors 

Anna Nelson (she / her) is currently a Postdoctoral Research Associate on the Leverhulme-funded “FemTech Surveillance: Gendered Digital Harms and Regulatory Approaches” project at the University of Sheffield. She holds a PhD in Bioethics and Medical Jurisprudence from the University of Manchester, and has wider research interests in law and childbirth, reproductive technology and gendered experiences of healthcare.  

Tsachi Keren-Paz is Professor of Private Law in the School of Law at the University of Sheffield. He is a Co-Investigator of the Leverhulme FemTech Surveillance Project, and the author of three monographs: Torts, Egalitarianism and Distributive Justice (Ashgate, 2007), Trafficking: A Private Law Response (Routledge, 2013) and Egalitarian Digital Privacy: Image Based Abuse and Beyond (BUP, 2023).  

Maria Tzanou is a Senior Lecturer in Law at the University of Sheffield, UK. Her research focuses on privacy, data protection, AI, surveillance and the regulation of emerging technologies. She is the Principal Investigator of the Leverhulme FemTech Surveillance Project.

More information about the FemTech project can be found here: https://www.sheffield.ac.uk/law/research/centres-and-institutes/sciel/projects/femtech-surveillance-gendered-digital-harms-and-regulatory-approaches
