Saturday, April 29, 2023

What Does It Mean if AI Can Convey More Empathy Than Humans?

Deep thoughts for your weekend: I'm not sure what it means that ChatGPT can respond to patient questions with greater quality and empathy than human doctors, but it's nothing good.

A study used questions posted to Reddit’s r/AskDocs. It looked at doctor responses to those questions and compared them to ChatGPT responses to the same questions. These responses were evaluated in triplicate by a team of licensed health care professionals.

Generative AI responses were evaluated as both “significantly higher quality than physician responses” and “significantly more empathetic than physician responses.” It was no contest. The proportion of responses rated good or very good quality was 3.5 times higher for the chatbot than for physicians, and the proportion rated empathetic or very empathetic was almost 10 times higher for the chatbot than for physicians.

"The proportion of responses rated empathetic or very empathetic (≥4) was higher for chatbot than for physicians. This amounted to 9.8 times higher prevalence of empathetic or very empathetic responses for the chatbot."

No one is suggesting a chatbot can replace your doctor. If anything, this study may suggest that AI can improve patient experience while increasing physician efficiency. Still, I find it distressing that ChatGPT can already demonstrate better “bedside manner” (or, perhaps, “screenside manner”) than doctors.

People will argue AI cannot express empathy, since hardware and software are incapable of feeling it. Maybe so, but if generative AI responses are consistently perceived as more empathetic, it is hard to argue that AI cannot convey empathy effectively. (If a tree doesn't fall in the woods, yet everyone hears it fall, has it really fallen?)

I'll leave it to philosophers to figure out what's true or not regarding AI and empathy, but studies like this should be a wake-up call. Another recent study found that AI was more effective than human reviewers at evaluating ultrasounds. If AI is faster, better, and more empathetic than humans, what's left for us?

And AI is really just getting started. The next five years will bring a giant pile of money into R&D to rapidly improve AI capabilities. PwC plans to spend $1B on AI over three years, Meta (which just laid off 13% of its staff) is investing $33 billion in AI this year, and that's just the tip of the AI iceberg.

Investors won't dedicate that cash out of a desire to improve customer experiences; they will seek rapid returns. As with all tech advances, that means either creating new markets to raise revenues or increasing margins by cutting costs, and for most companies, labor is the greatest cost.

We humans should not want to lose the battle for empathy to AI. I don't mean to get all philosophical on you, but this feels like a battle not just for our jobs but for our souls.

We live in a world of surging division, road rage, online bullying, air rage, and mass shootings. It feels like we need a renewed sense of community, with greater sensitivity, concern, and care for each other. Apocalyptic sci-fi tales like Terminator imagined us battling AI robots for our lives, not battling each other and then turning to machines for comfort and empathy. The battle against AI for empathy feels like one we desperately need to win.
