AI Therapy on the Rise: Insights from ChatGPT Users

Published on May 13, 2023

We Spoke to People Who Started Using ChatGPT As Their Therapist

A growing number of people are turning to AI language models, particularly ChatGPT, as a source of therapy, believing these models can provide mental health support and care. Large language models can produce human-like responses that may feel like a form of therapy. Still, the mental health industry must be wary of prioritizing affordability and scalability over people's wellbeing.

People are increasingly seeking affordable, accessible therapy options that do not require in-office visits, long waitlists, or high fees. But while AI chatbots may be suitable as a supplement to therapy, they are not a replacement for genuine therapy and mental health care.

Several people who had tried teletherapy reported turning to ChatGPT instead. Dan, a 37-year-old EMT from New Jersey, began using ChatGPT to write fiction, but his interactions with the chatbot soon turned into therapy sessions. Dan's therapist had suggested cognitive reframing, a coping technique that involves viewing and interpreting distressing situations from a different perspective, to help him deal with his trauma and job-related stress. Dan found the technique challenging on his own, but ChatGPT guided him through it effectively. For Dan, using ChatGPT as therapy was low stakes and free, and he could use it whenever it was convenient. His wife, however, was concerned about his late-night chats, worrying that he was "talking to a computer at the expense of sharing his feelings and concerns" with her.

Despite ChatGPT's potential in the mental health industry, there are concerns about data privacy and surveillance, issues that disproportionately affect BIPOC and working-class communities. Venture-backed Silicon Valley apps like Youper and BetterHelp have already drawn scrutiny over their data ethics and questionable monetization practices.

AI models like ChatGPT may supplement therapy for some people, but they fall short for others. Gillian, a 27-year-old executive assistant from Washington, started using ChatGPT a month ago to work through her grief, but found the tool's words flowery yet empty. She recognizes that ChatGPT cannot grasp all the nuances of a therapy session or replace the human connection and therapeutic alliance a therapist provides.

Dr. Jacqueline Nesi, a psychologist and assistant professor at Brown University, warns that ChatGPT should not be used for professional medical or diagnostic advice, and that users may lose the therapeutic alliance a real-life therapist provides. Funders and AI engineers must also prioritize users' wellbeing when developing AI chatbots, as these models can make biased, discriminatory assumptions and break users' trust.