Brazil: Bill to establish protections for children in digital environments and responsibilities for platforms (PL 3910/2025), including content moderation requirements, introduced to the Chamber of Deputies

Description

On 12 August 2025, a Bill to establish protections for children in digital environments and responsibilities for platforms (PL 3910/2025), including content moderation regulation, was introduced to the Chamber of Deputies.

The Bill would require providers of information-technology products or services aimed at or accessible to children and adolescents to conduct risk assessments to protect minors’ health and safety, ensure content matches age classifications, prevent access to illegal or harmful material, including content promoting adultisation or sexualisation, and block the monetisation of such material. Providers would have to report detected child sexual exploitation or abuse material to the relevant national and international authorities and retain the associated data, including uploaded content and the details of responsible users, for a period defined by regulation. They would also have to promptly remove content violating children’s rights, including material promoting adultisation or sexualisation, upon notification and without a court order.

Services with over one million underage users in Brazil would be required to publish semi-annual transparency reports detailing complaint channels, reports received, moderation actions, measures to detect underage accounts and illicit acts, and improvements to data protection and parental-consent verification.

Penalties for non-compliance would include warnings, fines of up to 10% of revenue or per-user fines capped at BRL 50 million, temporary suspension, or prohibition of activities. Foreign companies would be jointly liable with local entities for fine payments, and collected fines would go to the National Fund for Children and Adolescents.

Original source

Scope

Policy Area
Content moderation
Policy Instrument
Content moderation regulation
Regulated Economic Activity
online advertising provider, platform intermediary: user-generated content, streaming service provider, software provider: app stores, search service provider, software provider: other software
Implementation Level
national
Government Branch
legislature
Government Body
parliament

Complete timeline of this policy change

2025-08-12
under deliberation
