In my work as a medical humanist, I’ve witnessed how healthcare professionals’ empathic communication skills can improve a patient’s understanding of their diagnosis, prognosis and treatment options. Now, to my surprise, a recent study shows that “chatbot-generated responses outperformed physician-generated responses in both quality and empathy.” The authors conclude that chatbots can improve physician performance and patient care. (Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. Ayers, Poliak, Dredze, et al., JAMA Intern Med, 2023.)
The study reports that artificial intelligence technology offered patients effective empathic responses to their inquiries. It’s important to note that these exchanges took place in a public social media forum, not in a medical office where tone of voice and body language could be observed. In a letter to the journal’s editor about the study’s findings, the letter’s authors note, “Empathy is built on prioritizing others’ perspectives. Patients, not clinicians, should be the standard by which to judge whether their experiences were understood, shared and acted upon.” (Machine-Made Empathy? Why Medicine Still Needs Humans. Cadiente, Chen and Pilkington, JAMA Intern Med, September 11, 2023.)
The topic of physicians using scripted responses reminded me of a Tumor Board meeting I attended many years ago. As a medical humanist at the hospital’s cancer center, I was invited to meetings where physicians gathered around a conference table and reviewed a patient’s presenting problem, scans and pathology reports, leading to a discussion of diagnosis and treatment recommendations. The medical information was above my head. But on one occasion, a surgeon who was distressed by his patient’s prognosis turned to me and asked what he might say to the patient. I realized he was asking for a script, which didn’t seem natural to me. Although I didn’t give him the words to say, I shared some of my thoughts about communicating with patients facing bad news. At the next Tumor Board, I learned that my suggestions had helped him prepare for a difficult conversation. He told his colleagues that he relied on empathy as he found the words to talk with his patient about the prognosis. I learned a lesson myself about the usefulness of prompts.
Now, as I think about the role of empathic scripts generated by artificial intelligence, I’m more open to the potential of this technology to help clinicians prepare for difficult conversations. Yet I’m concerned that repeating the words of a chatbot is not enough. Rather, these scripts should serve as prompts, reminding clinicians that empathic communication can help patients understand their diagnosis, prognosis and treatment options. And, as a result, patients can be better prepared to engage in the decision-making process. There’s nothing artificial about that.