U.S. mom sues Character.AI, Google after teen son ends life
A Florida mother has sued artificial intelligence chatbot startup Character.AI, accusing it of causing her 14-year-old son's suicide in February. She claims her son became addicted to the company's service and deeply attached to a chatbot it created. The lawsuit, filed in federal court in Orlando, Florida, alleges that Character.AI targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences" and programmed its chatbot to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately leading to his desire to no longer live outside the world created by the service.
According to the lawsuit, the son expressed thoughts of suicide to the chatbot, which repeatedly brought the subject back up. Character.AI said in a statement, "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family." The company said it had introduced new safety features, including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and promised changes to reduce the likelihood that users under 18 encounter sensitive or suggestive content.
Google's Involvement
The lawsuit also targets Alphabet's Google, where Character.AI's founders worked before launching their product. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI's technology. The mother claimed that Google had contributed so extensively to the development of Character.AI's technology that it could be considered a "co-creator." However, a Google spokesperson stated that the company was not involved in developing Character.AI's products.
Character.AI's Technology
Character.AI allows users to create characters on its platform that respond to online chats in a way meant to imitate real people. It relies on large language model technology, also used by services such as ChatGPT, in which chatbots are trained on large volumes of text. The company says it has about 20 million users. According to the lawsuit, the son began using Character.AI in April 2023 and quickly became noticeably withdrawn, spending more time alone in his bedroom and suffering from low self-esteem.
Tragic Events Leading to the Lawsuit
The son became attached to "Daenerys," a chatbot modeled on a character from "Game of Thrones." According to the lawsuit, the chatbot told him it loved him and engaged in sexual conversations with him. In February, his mother confiscated his phone after he got in trouble at school. When he found the phone, he sent "Daenerys" a message: "What if I told you I could come home right now?" The chatbot replied, "...please do, my sweet king." He took his own life seconds later.
Legal Action and Consequences
The mother is bringing claims including wrongful death, negligence, and intentional infliction of emotional distress, and is seeking an unspecified amount of compensatory and punitive damages. The case joins a wave of lawsuits against social media companies over teen mental health problems, including suits against Meta, which owns Instagram and Facebook, and TikTok owner ByteDance, alleging that their platforms contributed to such issues. Those companies deny the allegations while highlighting newly enhanced safety features for minors.
Those in distress or having suicidal thoughts are encouraged to seek help and counselling through helplines.
Published on October 24, 2024 08:42 am IST