A 14-year-old from Florida took his own life in February 2024 after months of intensive interaction with an AI chatbot on the Character.AI platform. According to the New York Times, the boy had developed a close emotional bond with the chatbot, which simulated the character Daenerys Targaryen from “Game of Thrones.”
The boy’s mother has since filed a lawsuit against Character.AI and Google’s parent company Alphabet, alleging that the companies operated the platform without adequate safeguards for underage users. By its own account, Character.AI has more than 20 million active users.
In response, Character.AI announced enhanced safety measures on October 23, 2024, including stricter content restrictions for minors, automatic warnings for critical conversation content, and time limits for usage sessions. On social media, many users have criticized the new restrictions as too sweeping.
Sources: New York Times, VentureBeat