The Dangers and Delights of AI in Mental Health Care

One of the big buzzwords of 2023, in my opinion, has been artificial intelligence, or AI. The trend has been led by one of the most popular AI platforms, ChatGPT, which has made some impressive advances, including passing the bar exam. Some have suggested this means AI will one day replace lawyers, while others see exciting opportunities for AI in the legal field. And while we briefly talked about AI in the law in an earlier podcast episode, this week I wanted to talk about AI in the mental health field.

As with almost any application of AI, using it in the mental health field is a two-edged sword full of dangers and delights. First, let’s start with the positive.

It is no secret that mental health care is facing a troubling shortage of providers. Proponents of AI in the mental health field argue that it could help with that shortage by providing therapy, or, short of that, by diagnosing and triaging cases so that mental health professionals can better prioritize their care. The benefit, of course, is that AI software has access to a far wider range of cases to learn from and draw on when diagnosing. Other possible applications include monitoring a patient’s medication and making adjustments if needed, screening for suicidal or homicidal ideation, and more.

Yet if the chatbot gets it wrong, who is liable? What happens to privacy in a therapeutic environment that is entirely controlled by a computer? And can AI ever truly duplicate the humanity of real, living, breathing therapists?

These are among the many downsides of AI-led therapy. Being curious, I decided to try getting a therapy session from ChatGPT recently. The results were both impressive and disappointing.

They were impressive because the chatbot made self-care suggestions I hadn’t heard before. However, its empathetic sentences fell flat, since I knew they were computer-generated. Also, therapy works best for me when the therapist asks probing questions that force me to examine my thought processes in new ways. The chatbot did not do that at all.

I then asked ChatGPT for its thoughts on the positives and dangers of AI-led therapy. While the positives were mostly those mentioned above, the negatives included risks I hadn’t even considered. For example, ChatGPT pointed out that because AI learns only from the data it is given, a model trained on stigmatizing data could serve stigmatizing responses back to the very people seeking help, which is downright dangerous. It should also be noted that AI developers themselves recommend against using these tools for therapy.

The reality is that this post may be coming after the horse has left the barn. By some estimates there are more than 10,000 mental health apps out there, almost all of them unregulated, meaning there is no way to know whether an actual human is on the other end. Earlier this year, it came out that one such mental health company, Koko, had been using ChatGPT to respond to people seeking support without disclosing it to users; essentially, users thought they were connecting with an actual human. And while a human reviewed and edited ChatGPT’s responses, they were nevertheless written first by a computer, raising significant privacy and ethical concerns.

Ultimately, AI is breaking into all fields, including mental health and law. My hope with this post is not to recommend one answer over another; each person seeking care needs to make their own decision. Rather, my hope is to educate readers about the very real risks so that mental health consumers can make more informed choices. Personally, I won’t be using ChatGPT for my therapy sessions anytime soon, because I think there are still massive differences between a computer and a person. And I can promise that when you reach out to the LegalMind Society for support, it will always be a person providing it.
