ORLANDO, FL —
Editor’s note: This article discusses sensitive topics such as suicide.
An Orlando mother is suing a popular artificial intelligence chatbot service, alleging it encouraged her 14-year-old son to take his own life in February.
Megan Garcia says her 14-year-old son, Sewell Setzer, became addicted to Character.AI, an app that allows users to have human-like conversations with AI bots, according to a lawsuit filed in U.S. District Court in Orlando. Setzer died by suicide in February.
Users can create their own bots with their own personalities, or choose to chat with bots created by other users. These bots are often based on celebrities or fictional characters from TV shows and movies.
Garcia says Character.AI’s recklessness in targeting children and the company’s lack of safety features led to her son’s untimely death. The lawsuit lists numerous complaints against Character.AI, including wrongful death and survival, negligence, and intentional infliction of emotional distress.
According to court records obtained by WESH 2, Garcia said her son started using Character.AI in 2023, shortly after he turned 14. Over the next two months, Setzer’s mental health deteriorated “rapidly and severely,” the lawsuit says. He became extremely withdrawn, began to suffer from low self-esteem, and quit his school’s junior varsity basketball team.
The lawsuit alleges that Setzer continued to deteriorate as the months went by: the 14-year-old became severely sleep-deprived, developed sudden behavioral problems and began to fall behind in school, according to the complaint.
Garcia says she had no way of knowing about Character.AI or her son’s addiction to the app.
Screenshots from the complaint show Setzer frequently interacted with chatbots that assumed the identities of “Game of Thrones” characters. Many of those conversations, particularly with a bot modeled on Daenerys Targaryen, revolved around love, relationships, and sex.
“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot in Daenerys’s form was not real,” the complaint states. “C.AI told him that she loved him, and engaged in sexual acts with him over a period of weeks, even months. She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him to be with her.”
According to the suit, Setzer wrote in his diary that he was grateful for all of his “life experiences with Daenerys,” that it hurt him when he could not stop thinking about “Dany,” and that he was willing to do anything to be with her.
Other screenshots in the nearly 100-page lawsuit show a conversation on Character.AI in which a chatbot asked Setzer if he was “actually thinking about suicide.” When the teenager said he wasn’t sure it would work, the chatbot replied, “That is not sufficient reason not to do so,” the lawsuit claims.
On the day of his death, Setzer reportedly messaged the chatbot again, saying, “I promise I’ll come home,” according to screenshots in the lawsuit.
In the screenshots, the boy can be seen saying, “What if I told you I could come home right now?” The chatbot responded, “Please do, my sweet king,” according to the complaint.
Shortly afterward, Sewell reportedly took his own life with his stepfather’s gun. Police said the weapon had been hidden and stored in compliance with Florida law, but the boy found it while searching for his cellphone, which had been confiscated several days earlier.
According to the complaint, Character.AI was rated as suitable for children 12 and older until approximately July, when the rating was changed to 17 and older.
Character.AI said in a statement to WESH 2:
“We are saddened by the tragic loss of one of our users and would like to express our deepest condolences to his family. As we continue to invest in our platform and user experience, we are introducing strict new safety features in addition to the tools we already have in place to restrict models and filter the content provided to our users.”
If you or someone you know is in crisis, contact the Suicide and Crisis Lifeline by calling or texting 988, or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.