On 11 September 2025, the Federal Trade Commission (FTC) launched an inquiry into seven companies providing consumer-facing AI-powered chatbots. The FTC issued orders to Alphabet, Character Technologies, Instagram, Meta Platforms, OpenAI, Snap, and X.AI, requesting information on how these companies assess, test, and monitor potential negative impacts of their technology on children and teenagers.

The companies are required to disclose their revenue models, including subscriptions, advertising, and data-related income, as well as user statistics by age group, namely children (under 13), teens (13–17), minors (under 18), and young adults (18–24), covering daily and monthly active users, session duration, and engagement metrics.

The inquiry also examines safety and content moderation, including pre-deployment testing, mitigation of potential harms, and systems for detecting and preventing harmful outputs, such as sexually themed content involving minors. Companies must provide information on complaint-handling procedures and user report statistics.

Age verification and protection measures are reviewed as well, including age-gating, monitoring of underage users, and parental control features, with details on how child users are identified and safeguards activated. Finally, data management practices must be reported, covering collection, retention, access, sharing, deletion tools, and privacy assessments.