Comparison with a different regulatory event:
On 3 January 2023, the "bill to protect the safety of children on the internet" (S. 3663) lapsed because the 117th Congress ended its legislative session without adopting it. The Bill would have required every internet platform to mitigate the risks of harm posed to minors by the materials on the platform. To that end, platforms would have to provide settings to limit the ability of other individuals to contact a minor, protect the visibility of the minor's personal data, limit features that incentivise a minor's use of the platform (automatic media playback, gamification features and notifications), allow opting out of algorithmic recommendation systems that use a minor's personal data, and limit the use of or remove the minor's personal data from the platform. If a platform knows that a user is a minor, it must enable the most protective options by default. Each platform would also have to provide parental controls to supervise a minor's activities, including the ability to manage privacy settings, restrict purchases and track the time the minor spends on the platform, and establish an electronic point of contact for submitting reports of harm to a minor. At least once a year, platforms would have to publish a report identifying the risks of harm to minors, based on an independent, third-party audit. In addition, platforms would have to grant selected researchers access to their data for independent research on potential harms to minors. Finally, the Federal Trade Commission would establish guidelines for platforms' market research on minors, and the National Institute of Standards and Technology would conduct a study to assess the most feasible device technologies for age verification.