The boy, 14, spent months talking to an AI character on Character.AI before becoming depressed. His mother blames the company, saying the technology shouldn't be accessible to minors.
Character AI, a Menlo Park, California-based startup, describes itself as having a mission to "empower everyone globally with personalized AI." Its system lets users chat with character-based AI personas spanning genres from anime to traditional "digital assistants" to old-school private eyes. And if you don't like what's on offer, you can create your own custom chatbot, choosing its "voice, conversation starts, their tone" and more. The company is now in the spotlight over one of these user-generated characters, named after the Game of Thrones character Daenerys Targaryen, which is linked to a 14-year-old from Florida who died by suicide after talking with the artificial persona for several months.
ABC7 News reports that the boy, Sewell Setzer III, had been talking with the chatbot for some time, and his mother said that even though he knew it was not a real person, he "became emotionally attached" to the digital personality and then "sank into isolation and depression before taking his own life." The New York Times reports that Setzer had been chatting with the bot dozens of times a day, and that their interactions had escalated to include romantic and sexual content. He was talking with the bot moments before his death and had previously indicated he'd had suicidal thoughts, the Times says.
The boy's mother, Megan L. Garcia, is now suing Character AI. Garcia is reportedly seeking to hold the chatbot maker and its founders, Noam Shazeer and Daniel De Freitas, responsible for her son's suicide and is asking for unspecified damages. News site Decrypt.co explains that the suit alleges Character AI "chose to support, create, launch, and target at minors a technology they knew to be dangerous and unsafe." The suit also names Google and its parent company, Alphabet. Google rehired Character's two founders (who had left the tech giant in 2021 to start Character) in August as part of a deal, reportedly worth $2.7 billion, under which Google licensed the startup's chatbot technology.
What complicates the legal case over Character AI's technology is that, unlike on social media, there are no other human users involved, and Character's system may not rely on the same allegedly addictive algorithmic tricks that other social platforms use to keep users engaged.
Character made a statement on the matter in a post on X, noting, "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family." As a company, it says, it takes "the safety of our users very seriously and we are continuing to add new safety features," and it linked to a blog post stating that "our policies do not allow non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide." The company says it is also "continually training the large language model (LLM) that powers the characters on the platform to adhere to these policies."
Setzer’s death is a tragedy, but all questions of legal responsibility will have to wait until they are thrashed out in court—the technology in question here is, after all, very new.
It is worth remembering that chat and sharing platforms, chiefly traditional social media services like Meta's Instagram, have been in the headlines for years over alleged links to the ongoing teenage mental health crisis. In mid-October, for example, it emerged that Meta was set to face two lawsuits over its alleged impact on young users' mental well-being.
The story isn't a cautionary tale for every AI company, or for third parties that use or license AI technology. But it is a reminder that chatbot technology and similar AI tools are in their infancy, and complications and mistakes are inevitable. It is also a reminder that if your company's services are accessible to younger people, the systems you rely on to protect these vulnerable users may come under intense scrutiny.
If you or someone you know is experiencing suicidal thoughts or is in crisis, know that you can seek help from the 988 Suicide and Crisis Lifeline by dialing 988. It’s open 24 hours a day and is free and confidential.