Is your therapist AI? ChatGPT goes viral on social media for its role...
AI chatbots are stepping into the therapist's chair – and not everyone is thrilled about it. In March alone, TikTok users published 16.7 million posts discussing the use of ChatGPT as a therapist, and mental health professionals are raising red flags over the growing trend of artificial intelligence tools being used in their place to address anxiety, depression, and other mental health challenges.
Users' Experiences
"ChatGPT singlehandedly has made me a less anxious person when it comes to dating, when it comes to health, when it comes to career," user @christinazozulya shared in a TikTok video posted to her profile last month. She mentioned how ChatGPT has helped her handle anxiety and provide immediate relief.
Others, like user @karly.bailey, see the platform as a valuable resource for "free therapy." Sharing her experience, she described how ChatGPT offers advice and journaling prompts, acting as a supportive tool in her life.
Public Perception
A study from Tebra, an operating system for independent healthcare providers, found that "1 in 4 Americans are more likely to talk to an AI chatbot instead of attending therapy." This shift toward AI chatbots is also observed in the U.K., where individuals opt for AI mental health tools because of long NHS wait times and the cost of private counseling.
Despite the convenience and accessibility of AI chatbots, concerns have been raised about their lack of human empathy and personalized care. Dr. Kojo Sarfo emphasized that while these tools can provide support, they should not replace professional therapy when dealing with complex mental health issues.
Potential Risks
Dr. Sarfo warned against relying solely on AI for mental health advice, especially when it comes to serious conditions requiring professional intervention. While AI chatbots can aid in symptom articulation and preparation for medical appointments, they should not be seen as substitutes for trained specialists.
Experts, including Dr. Christine Yu Moutier, Chief Medical Officer at the American Foundation for Suicide Prevention, caution about the limitations of AI chatbots in handling mental health crises, particularly suicide risk assessment. Experts are calling for more research and regulation to ensure these technologies are used safely and effectively.