TALLAHASSEE, Fla. — In the final moments before he took his own life, 14-year-old Sewell Setzer III took out his phone and messaged the chatbot that had become his closest friend.
For months, Sewell had become increasingly isolated from his real life as he engaged in highly sexualized conversations with the bot, according to a wrongful death lawsuit filed in a federal court in Orlando this week.
The legal filing states that the teen openly discussed his suicidal thoughts and shared his wishes for a pain-free death with the bot, named after the fictional character Daenerys Targaryen from the television show “Game of Thrones.”
EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988.
On Feb. 28, Sewell told the bot he was “coming home” — and it encouraged him to do so, the lawsuit says.
“I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.
“I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”
“What if I told you I could come home right now?” he asked.
“Please do, my sweet king,” the bot messaged back.
Just seconds after the Character.AI bot told him to “come home,” the teen took his own life, according to the lawsuit, filed this week by Sewell’s mother, Megan Garcia, of Orlando, against Character Technologies Inc.
Character Technologies is the company behind Character.AI, an app that allows users to create customizable characters or interact with those generated by others, spanning experiences from imaginative play to mock job interviews. The company says the artificial personas are designed to “feel alive” and “human-like.”
“Imagine speaking to super intelligent and life-like chat bot Characters that hear you, understand you and remember you,” reads a description for the app on Google Play. “We encourage you to push the frontier of what’s possible with this innovative technology.”
Garcia’s attorneys allege the company engineered a highly addictive and dangerous product targeted specifically to kids, “actively exploiting and abusing those children as a matter of product design,” and pulling Sewell into an emotionally and sexually abusive relationship that led to his suicide.
“We believe that if Sewell Setzer had not been on Character.AI, he would be alive today,” said Matthew Bergman, founder of the Social…