Requiring companies to obtain licenses is seen as a direct attempt to exert control over social media platforms.
This statement was originally published on the CIJ Facebook page on 29 July 2024.
ARTICLE 19 and the Centre for Independent Journalism (CIJ) are deeply concerned about the Malaysian government’s recent announcement that social media companies will be required to obtain licences under the Communications and Multimedia Act (CMA) 1998. The new regulatory framework will be introduced on 1 August 2024, with enforcement effective 1 January 2025. This development is seen as a direct attempt to exert control over social media platforms, which could have far-reaching implications for freedom of expression, as guaranteed in the Federal Constitution of Malaysia.
Furthermore, there is growing apprehension that such regulatory measures could pose a significant threat to the fundamental democratic values that underpin the nation’s governance, as well as to the underlying principle of CMA Section 3(3), which states that “nothing in the CMA shall be construed as permitting the censorship of the internet”. Civil society organisations (CSOs), including ARTICLE 19 and CIJ, which were previously engaged in consultations, have expressed concerns to the government regarding the potential imposition of licensing on social media platforms as a way to moderate harmful content. We have advised the government against hasty decision-making and emphasised the need for thorough consideration of the implications and the stakeholders involved.
Additionally, on 27 June 2024, the CSOs issued a letter to the Prime Minister urging the government to prioritise increased collaboration and consultation with civil society organisations and other relevant stakeholders. The letter highlighted the importance of inclusive and transparent processes in shaping policies related to social media regulation.
Overreach of licensing framework
The licensing system for network and application services faces significant challenges: difficulties in anticipating future needs and developments, and a notable lack of independent oversight, both of which can undermine the fairness and transparency of the licensing process. This absence of clear guidelines and oversight has created uncertainties for social media platforms.
Consequently, these platforms may need to meet specific regulatory requirements and adhere to standards set by regulatory authorities as part of the licence renewal process. This would involve a closer working relationship between the platforms and the regulatory bodies, ensuring that they operate according to the requirements outlined in the licensing framework. As a result, platforms could become more compliant and consent to more removal requests from the government, instead of focusing on effective and timely content moderation.
It is important to note that a lack of transparency in the compliance process gives large platforms even more power to police what we see, say, and share online, with disastrous consequences for public debate, the free flow of information, and democracy. Social media networks are a vital space for us to connect, share, and access information.
ARTICLE 19, in a legal analysis of the CMA, has repeatedly warned that some of its provisions are problematic and not in line with international human rights standards. The more new regulations are put in place, the more power the Malaysian Communications and Multimedia Commission (MCMC) gains to regulate content and social media companies. We have repeatedly raised concerns about the use of Sections 211 and 233 of the CMA to define harmful content; at the same time, these provisions have been abused over the years to restrict freedom of expression. In principle, we reiterate that Sections 211 and 233 of the CMA should be repealed, as they have an expansive scope and are open to vague interpretation. The provisions also fail to meet international freedom of expression standards, in particular the three-part test, which requires that any restriction be provided by law, pursue a legitimate aim, and be necessary and proportionate.
Platform accountability
ARTICLE 19 and CIJ understand the government’s intent to hold social media platforms and messaging applications accountable as a means of tackling online abuse, hate speech, and other problematic content, including scams and fraud, that targets children in particular, as well as others using online platforms.
One important step is to ensure that social media platforms (i) enhance their community standards and guidelines to meet international human rights standards, including on data protection, privacy, and transparency in the use of artificial intelligence (AI); and (ii) ensure that their content moderation and removal policies and actions are effective and timely, carried out in transparent and systematic ways, without personal, political, or business biases. Social media platforms will have to invest in adequate human and language-detection resources to go beyond automated flagging or the use of AI to detect harmful content.
Thus, the government will have to adopt innovative and alternative means of holding these platforms accountable, as any attempts to incorporate these platforms into a more traditional regulatory regime are unlikely to be effective and may have unforeseen implications given the rapidly growing nature of technology and the global reach of these platforms. Any attempts to hold the platforms accountable must ensure that there is meaningful protection of the rights of the public, including by not infringing on the users’ freedom of expression.
Way forward
It is essential to address the lack of transparency regarding the specific requests the MCMC or other government entities make of the platforms, and the platforms’ responses to these requests. The government should avoid unnecessarily regulating online content moderation and licensing social media platforms. Any regulatory framework for social media platforms must be based on principles of transparency, accountability, and the protection of human rights. This should include requirements to enhance transparency in content moderation decisions and to improve systems for resolving disputes arising from these decisions.
It is recommended that the government adopt the following:
- Establish a social media council that would promote a multi-stakeholder independent regulatory framework;
- Set up an independent committee to review the root causes of hate speech and cyberbullying, and relatedly develop a comprehensive plan of action, using the Rabat Plan of Action as the framework; and
- Enhance its education and awareness programmes aimed at building a resilient society guided by ethical and responsible content-creating standards, and with adequate digital literacy to combat the dangers of harmful content.
In conclusion, to achieve better results in countering harmful social media content and protecting users, the government must reconsider its current plan and consult more comprehensively with CSOs. This is necessary because effectively addressing harmful content goes beyond just content moderation; it also entails addressing the root causes of issues such as hate speech, cyberbullying, and gender-based violence. Engaging with CSOs can provide insights into the broader societal and systemic problems that contribute to harmful content and help develop more holistic and effective strategies for mitigating these challenges.