On 4 December 2024, Nigeria's National Information Technology Development Agency (NITDA) released an analysis of how social media platforms comply with the law and manage misinformation. The Code of Practice for Interactive Computer Service Platforms and Internet Intermediaries requires platforms with more than one million users to submit annual compliance reports under Part II, Section 10 of the Code. The regulation promotes transparency and adherence to regulatory standards, with the aim of enhancing user safety and accountability in Nigeria's digital environment.

Compliance measures, including account deactivation, content removal, and the handling of user complaints, are integral to this framework. By ensuring that platforms follow these processes, the Code seeks to safeguard user interests and reduce exposure to harmful content.

NITDA's analysis of the 2023 compliance reports highlights the differing approaches platforms have adopted. Google emphasised proactive content moderation and responsive complaint handling. LinkedIn maintained strict professional guidelines and promptly removed content that violated its policies. TikTok collaborated with stakeholders and implemented stringent moderation policies. X prioritised transparency, policy enforcement, and user engagement. Together, these efforts led to the deactivation of over 12 million accounts and the removal of 65 million pieces of harmful content.

The report nonetheless notes that further improvements are needed in content moderation and user protection. Its recommendations include establishing a collaborative task force between the government and platforms, implementing crisis-response protocols, enhancing AI-driven content moderation, launching public awareness campaigns, and supporting local digital safety initiatives.