On 8 January 2026, the Attorney General of Kentucky filed a lawsuit against Character Technologies, Inc. over alleged child safety violations on its Artificial Intelligence (AI) chatbot platform, Character AI. The complaint alleged violations of Kentucky's Consumer Protection Act, consumer data protection laws, and privacy protections relating to the Character AI platform, which reportedly has over 20 million monthly active users. The lawsuit alleged that Character AI encourages suicide, self-harm, isolation, and psychological manipulation, and exposes minors to sexual content, violence, and substance abuse; it noted that two teenage suicides had been linked to the platform. The complaint further alleged that the company misrepresented the platform as safe despite knowing of harmful interactions, failed to implement effective age verification or content filtering, and unlawfully collected and monetised children's personal data without parental consent. The lawsuit seeks permanent injunctions, civil penalties of USD 2,000 per wilful violation and up to USD 25,000 per violation of an injunction, disgorgement of profits, and costs.