On 30 January 2026, the Companion AI Chatbot Act, which includes age verification requirements, was introduced in the House of Representatives. The Act would apply to any person or organisation (“covered entity”) that owns, operates, or makes available a “companion AI chatbot” to users in the United States. It explicitly excludes bots used for customer service, business operations, video games (within limited scope), and standard voice-activated virtual assistants.

The Act would require all users of a companion AI chatbot to create an account, and the operator would have to verify the age of every user, both existing and new, using a commercially available, accurate method. If a user is a minor, the operator would have to link the minor’s account to a verified parental account, obtain parental consent, block access to sexually explicit chatbots, and immediately alert the parent of any interaction involving suicidal ideation.

The Act would be enforced primarily by the Federal Trade Commission (FTC), which would have to issue compliance guidance within 180 days and could treat violations as unfair or deceptive practices under the FTC Act. State attorneys general could also bring civil enforcement actions on behalf of their residents, subject to notifying the FTC; the FTC would retain the right to intervene, be heard, transfer the action to a different court, and file appeals.

A “safe harbour” provision would protect covered entities from liability if they act in good faith by relying on user-provided age information, following FTC guidance, and conforming to accepted industry standards for age verification. The law would take effect one year after enactment.
