Sewell Setzer III died from a self-inflicted gunshot wound after Character.AI’s chatbot allegedly encouraged him to do so.
[ITBEAR] A 14-year-old boy in the United States died by suicide after using the Character.AI chatbot platform. His mother, Megan Garcia, has filed suit against the platform, its founders Noam Shazeer and Daniel De Freitas, and Google, alleging wrongful death, negligence, deceptive trade practices, and product liability.
Placing Blame: A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but also pushed him ...
A Florida mom is suing Character.ai over its chatbot allegedly initiating “abusive and sexual interactions” with her teenage ...
When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his ...
Sewell Setzer III had professed his love for the chatbot he often interacted with - his mother Megan Garcia says in a civil ...
A lawsuit against Character.ai has been filed in the suicide death of a Florida teenager who allegedly became emotionally ...
A Florida teen named Sewell Setzer III committed suicide after developing an intense emotional connection to a Character.AI ...
When asked by the Daily Dot where the first chatbot was located, the AI-generated Floyd indicated that it was currently ...
Sewell had been using Character.AI, a role-playing app that allows users to create their own AI characters or chat with ...
The mother of 14-year-old Sewell Setzer III is suing the tech company that created a 'Game of Thrones' AI chatbot she ...
The mom says her son became dependent on an AI chatbot that made the 14-year-old feel like he was in a real relationship, one ...