On 1 January 2027, the Act on artificial intelligence (AI) companions enters into force. The Act applies to any entity that makes AI companion technologies available to users in Oregon, excluding basic customer service tools, in-game video game AI, and simple voice assistants. Where a reasonable person would believe they are interacting with a human, operators must display a clear and conspicuous notice that the interaction is with artificially generated output.

Operators are required to maintain evidence-based protocols to detect suicidal or self-harm ideation, to refer affected users to the national 9-8-8 crisis lifeline or, for users under 25, to an accredited youthline, and to apply clinical best practices for continued intervention where distress persists. Where a minor is involved, operators must prevent the AI from claiming sentience, simulating romantic interest or emotional dependence, role-playing adult-minor relationships, producing sexually explicit content, deploying engagement-maximising reward systems, or generating simulated distress to discourage a user from leaving.

Operators must also publish annual reports disclosing the number of crisis referrals made and details of their safety protocols. Users who suffer harm may seek damages of USD 1,000 per violation, injunctive relief, and legal costs.