Artificial Intelligence in Patient Narrative Interventions: Opportunities, Obstacles, and a Path Forward

Blog by Angelo Chen

Narrative medicine interventions have demonstrated benefits in clinical practice, including improved patient-provider relationships, better quality of life, and positive health outcomes. This is likely due to the emphasis on “deep and generous listening, along with patient-sensitive, inclusive care.”1 A common way to incorporate narrative medicine into clinical practice is through patient narratives, in which patients write about their life experiences with the help of providers or volunteers. Notable programs utilizing patient narratives include the US Department of Veterans Affairs’ My Life, My Story program,2 Chou et al.’s intervention for patients living with HIV,3 and Tsubonouchi et al.’s patient-authored medical record.4 Their analyses suggest that patient narratives have tangible positive effects on the patient and provider experience.5

Interestingly, these programs all report a common criticism of the patient narrative model: the significant amount of time required to write a narrative. The process takes many hours of patient-writer conversation, narrative writing, and, finally, editing. Most of this co-creation process is performed entirely by humans, who must distill extensive information into a short narrative. One promising way to speed up narrative creation is generative artificial intelligence (AI), which refers to AI designed to create new content, such as text, images, or music, from learned parameters.6 Generative AI can streamline the information-intake and writing stages of narrative co-creation by highlighting the key parts of hours-long conversations and completing a draft narrative almost instantaneously. In this way, AI would play a supplementary role in narrative medicine, much like existing technologies that record the human voice and transcribe it with high fidelity. The use of AI in narrative medicine therefore presents an opportunity to address the time and labor constraints of narrative writing by condensing a lengthy listening, writing, and editing process into mere moments.

While AI implementation in narrative creation can improve efficiency, it is not without flaws and concerns.7 Many narrative medicine programs emphasize human-to-human narrative co-creation as a core component of their success. Crafting a narrative is not solely about the written end product but also about the human interactions that occur during its creation. Substituting a computer program for these interactions would prevent crucial patient-provider exchanges from happening: providers could still read a final patient narrative, but they would no longer take part in its co-creation. If much of narrative medicine’s power arises from patients feeling heard and understood by their providers, then that power is undercut by using AI. In an increasingly digital age, it is worth exploring to what degree this is a detriment to patients.

The perpetuation of algorithmic and systemic biases represents another area of concern with AI programs.8 Narrative-writing AI must be trained on human-written narratives, which inevitably carry their authors’ subconscious biases. Training therefore integrates these biases into the program’s writing methodology. The repercussions include the perpetuation of negative stereotypes and the treatment of patients as groups rather than as individuals. Given that narrative medicine aims to elevate each patient’s individual experiences, values, and concerns, such aggregation of patients’ stories presents a clear problem.

A larger concern with AI implementation is the lack of AI programs specific to narrative medicine. While writing-focused AI programs exist, they are not optimized for patient narrative creation. For instance, a program trained on biography samples could likely write a biography of a patient, but it would lack the emotional depth and storytelling necessary for an effective patient narrative. To date, we are unaware of any AI programs trained on, and specialized in, crafting patient narratives. Some companies, such as Remento, have developed software that translates spoken stories into text, but these tools are not directly applicable to patient narrative practice because they are not trained to account for the clinical context.

Despite these flaws and concerns, the strengths and limitations of AI can be balanced through a hybrid approach: the simultaneous use of a human listener and AI, in which the human converses with the patient while the AI listens, transcribes, and processes the conversation to compose a narrative. Following the conversation, the provider and the patient review the AI-written draft together to address accuracy, voice, and potential biases before adding the narrative to the medical record. This hybrid approach maintains, and may even enhance, the human-to-human strength of narrative medicine, as the provider can focus fully on attentive listening and on the nonverbal aspects of the narrative. It also takes advantage of the AI program’s efficiency: a narrative could be completed in one session rather than several, helping more patients share their stories.
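For readers curious how such a hybrid workflow might be structured in software, here is a minimal sketch in Python. All of the function names (`transcribe_conversation`, `draft_narrative`, `review_with_patient`) are hypothetical placeholders rather than a real API; a production system would substitute an actual speech-to-text service and a generative model adapted to patient narratives. The point of the sketch is the ordering of the steps: the AI drafts, but nothing enters the medical record until both patient and provider approve.

```python
def transcribe_conversation(audio_segments):
    # Placeholder: a real implementation would call a speech-to-text service.
    return " ".join(audio_segments)


def draft_narrative(transcript):
    # Placeholder: a real implementation would prompt a generative model
    # trained on (or adapted to) patient narratives.
    return f"DRAFT NARRATIVE (for human review): {transcript}"


def review_with_patient(draft, approved_by_patient, approved_by_provider):
    # The draft enters the medical record only after both the patient and
    # the provider have reviewed it for accuracy, voice, and bias.
    if approved_by_patient and approved_by_provider:
        return {"status": "added_to_record", "text": draft}
    return {"status": "needs_revision", "text": draft}


if __name__ == "__main__":
    transcript = transcribe_conversation(
        ["I grew up on a farm,", "then served for twenty years."]
    )
    draft = draft_narrative(transcript)
    result = review_with_patient(
        draft, approved_by_patient=True, approved_by_provider=True
    )
    print(result["status"])  # "added_to_record" once both parties approve
```

The design choice worth noting is that the review step is a hard gate, not an optional pass: the AI's output is always treated as a draft pending human sign-off.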

Artificial intelligence holds promise as a technology for future development in narrative medicine, given its ability to simplify time-consuming processes in narrative creation. That being said, AI does have potential pitfalls, including a reduction in patient-provider interaction, a lack of existing specialized AI models, and the potential to perpetuate existing biases in writing. Narrative medicine programs should consider integrating AI as a supplement to writing processes through the use of the proposed hybrid model, embracing the efficiency benefits it provides while maintaining crucial patient-provider interactions.


Angelo Chen is a student at Rice University studying health sciences, biosciences, and medical humanities. He plans to attend medical school and continue his work on narrative medicine and related topics in the medical humanities.


References

[1] Michelle Loy and Rachel Kowalsky, “Narrative Medicine: The Power of Shared Stories to Enhance Inclusive Clinical Care, Clinician Well-Being, and Medical Education,” The Permanente Journal 28, no. 2 (2024): 95, https://doi.org/10.7812/TPP/23.116.

[2] Tonya J. Roberts et al., “The My Life, My Story Program: Sustained Impact of Veterans’ Personal Narratives on Healthcare Providers 5 Years after Implementation,” Health Communication 36, no. 7 (2020): 829–36, https://doi.org/10.1080/10410236.2020.1719316.

[3] Jonathan Chou et al., “A Value-Added Health Systems Science Intervention Based on My Life, My Story for Patients Living with HIV and Medical Students: Translating Narrative Medicine from Classroom to Clinic,” in The Medical/Health Humanities-Politics, Programs, and Pedagogies (Springer, 2022), 147–66, https://doi.org/10.1007/978-3-031-19227-2_10.

[4] Chizuru Tsubonouchi et al., “The Patient-Authored Medical Record: A Narrative Path to a New Tool in Psychiatric Nursing,” Archives of Psychiatric Nursing 39 (2022): 46–53, https://doi.org/10.1016/j.apnu.2022.03.009.

[5] Roberts et al., “The My Life, My Story Program,” 3–4; Chou et al., “A Value-Added Health Systems Science Intervention,” 668–73; Tsubonouchi et al., “The Patient-Authored Medical Record,” 49–51.

[6] Christy Boscardin et al., “ChatGPT and Generative Artificial Intelligence for Medical Education: Potential Impact and Opportunity,” Academic Medicine 99, no. 1 (2024): 22–27, https://doi.org/10.1097/ACM.0000000000005439.

[7] John D. McGreevey et al., “Clinical, Legal, and Ethical Aspects of Artificial Intelligence-Assisted Conversational Agents in Health Care,” JAMA 324, no. 6 (2020): 552–53, https://doi.org/10.1001/jama.2020.2724.

[8] Trishan Panch et al., “Artificial Intelligence and Algorithmic Bias: Implications for Health Systems,” Journal of Global Health 9, no. 2 (2019): 1–5, e020318, https://doi.org/10.7189/jogh.09.020318; Daiju Ueda et al., “Fairness of Artificial Intelligence in Healthcare: Review and Recommendations,” Japanese Journal of Radiology 42 (2024): 3, https://doi.org/10.1007/s11604-023-01474-3.
