10-28-2024, 06:54 AM
Here's the FOX version of the same story
Florida mother sues AI company over allegedly causing death of teen son
A Florida mother is suing the artificial intelligence company Character.AI for allegedly causing the suicide of her 14-year-old son.
The mother filed a lawsuit against the company, claiming her son became addicted to its service and to a chatbot it created.
Megan Garcia says Character.AI targeted her son, Sewell Setzer, with "anthropomorphic, hypersexualized, and frighteningly realistic experiences."
Setzer began having conversations with various chatbots on Character.AI starting in April 2023, according to the lawsuit. The conversations were often text-based romantic and sexual interactions.
Garcia claims in the lawsuit that the chatbot "misrepresented itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.
The allegation that the bot responded as a "licensed psychotherapist" is a game-changer... no doubt that will be directly challenged...