On 1 January 2018, the Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act, NetzDG), which regulates content moderation, entered into force. The Act requires online platforms that display user-generated content and have more than 2 million registered users in the Federal Republic of Germany to implement a flagging mechanism through which users can report unlawful content, and to review reported content and assess whether it is unlawful and must therefore be removed or have access to it blocked. Content that constitutes an offence under the Criminal Code, such as the dissemination of propaganda material and symbols of unconstitutional and terrorist organisations, the preparation of serious violent offences endangering the state, depictions of violence, and the defamation of religious faiths, is considered unlawful.

Platforms must remove or block access to manifestly unlawful content within 24 hours of receiving a complaint, unless a longer period has been agreed with the competent law enforcement authority; other unlawful content must generally be removed or blocked within 7 days. Platforms must store removed content for evidentiary purposes and must notify both the user who filed the complaint and the user whose content was removed or blocked of their decision. Platforms that receive more than 100 complaints per calendar year must publish a report every 6 months on how they have handled unlawful content. Finally, platforms must appoint a contact person authorised to receive requests from law enforcement authorities, who is required to respond within 48 hours.
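The Act's takedown deadlines reduce to a simple decision rule: 24 hours for manifestly unlawful content (extendable only by agreement with law enforcement) and 7 days for other unlawful content. The Python sketch below is purely illustrative of that rule as described above; NetzDG imposes legal duties on platforms, not an API, and every name in the sketch (Complaint, removal_deadline, and so on) is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum
from typing import Optional


class Assessment(Enum):
    LAWFUL = "lawful"
    UNLAWFUL = "unlawful"                      # unlawful, but not manifestly so
    MANIFESTLY_UNLAWFUL = "manifestly unlawful"


@dataclass
class Complaint:
    content_id: str
    received_at: datetime
    assessment: Assessment
    # Longer period agreed with the competent law enforcement authority, if any.
    agreed_extension: Optional[timedelta] = None


def removal_deadline(c: Complaint) -> Optional[datetime]:
    """Deadline for removing or blocking the content, per the rules described above."""
    if c.assessment is Assessment.LAWFUL:
        return None  # no obligation to act
    if c.assessment is Assessment.MANIFESTLY_UNLAWFUL:
        # 24 hours from receipt, unless a longer period was agreed with law enforcement.
        return c.received_at + (c.agreed_extension or timedelta(hours=24))
    # Other unlawful content: generally 7 days from receipt.
    return c.received_at + timedelta(days=7)


def must_publish_report(complaints_this_year: int) -> bool:
    """Half-yearly public reporting applies above 100 complaints per calendar year."""
    return complaints_this_year > 100


if __name__ == "__main__":
    c = Complaint("post-42", datetime(2018, 3, 1, 12, 0), Assessment.MANIFESTLY_UNLAWFUL)
    print(removal_deadline(c))       # 2018-03-02 12:00:00
    print(must_publish_report(250))  # True
```

As the demo shows, a complaint about manifestly unlawful content received at noon on 1 March must, absent an agreed extension, be acted on by noon on 2 March.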