
The Online Safety Bill 2024 (the “Bill”) sets out to do exactly one thing: to enhance and promote online safety in Malaysia. In our earlier article, “Key Impacts of the Online Safety Bill 2024”, we discussed the key changes brought about by the Bill. As highlighted in that article, one of the mechanisms under the Bill for ensuring online safety in Malaysia is the introduction of an online harmful content reporting system.
Once the Bill is in force, licensed applications service providers (“ASPs”) and licensed content applications service providers (“CASPs”) in Malaysia will be under a statutory obligation to provide a means for users of their services to report any content on their services which the users believe to be harmful. Upon receiving such a report, the ASP or CASP will have to deal with it strictly in the manner prescribed under the Bill, failing which it may be exposed to statutory fines.
In this article, we examine the duties of ASPs and CASPs in maintaining and observing the online harmful content reporting mechanism established under the Bill.
Online Harmful Content Reporting Mechanism
1. Receipt of an Online Harmful Content Report
When a user reports a piece of content as harmful, the ASP or CASP will have to assess the report within the timeline prescribed under the Bill. If the subject matter of the report is assessed as not harmful, or if it is or has already been the subject of another report, the newly received report will be dismissed by the ASP or CASP. If, however, the subject matter of the report is determined to be potentially harmful, the ASP or CASP will then have to assess whether the reported content constitutes priority harmful content or harmful content.
2. Assessment of Online Harmful Content
Where the subject matter of a report is determined by an ASP or CASP to be priority harmful content, the content will have to be immediately disabled or made inaccessible on the services operated by the ASP or CASP for a prescribed period of time. If, on the other hand, the content is only “harmful content” rather than “priority harmful content”, the ASP or CASP generally has discretion as to whether the content should be made inaccessible, likely based on the extent of harm that the ASP or CASP perceives the content may cause. If the ASP or CASP does decide to disable access to the harmful content, it will have to do so for the period prescribed under the Bill.
At present, the Bill has categorised “harmful content” and “priority harmful content” as follows:
(a) Harmful Content
• Content on child sexual abuse material as provided for under section 4 of the Sexual Offences against Children Act 2017;
• Content on financial fraud;
• Obscene content, including content that may give rise to a feeling of disgust due to lewd portrayal and which may offend a person’s sense of decency and modesty;
• Indecent content, including content which is profane in nature, improper and against generally accepted behaviour or culture;
• Content that may cause harassment, distress, fear or alarm by way of threatening, abusive or insulting words, communication or acts;
• Content that may incite violence or terrorism;
• Content that may induce a child to cause harm to himself;
• Content that may promote feelings of ill-will or hostility amongst the public at large or may disturb public tranquillity; and
• Content that promotes the use or sale of dangerous drugs.
(b) Priority Harmful Content
• Content on child sexual abuse material as provided for under section 4 of the Sexual Offences against Children Act 2017; and
• Content on financial fraud.
In the event an ASP or CASP does proceed to disable access to priority harmful content or harmful content for the period prescribed under the Bill, the ASP or CASP will have to re-evaluate or reaffirm its decision pertaining to the disabled content during that prescribed period. If, following the re-evaluation, the ASP or CASP maintains that the content is either priority harmful content or harmful content, it will have to permanently disable access to that content on its service. On the other hand, if the ASP or CASP determines that the content is neither priority harmful content nor harmful content, it will have to restore access to the content on its service.
3. Request for Inquiry
The maker of the report or the maker of the content (as the case may be) who is aggrieved by the decision of an ASP or CASP can formally request that the relevant ASP or CASP inquire into its action. Essentially, in the face of such a request, the ASP or CASP will have to review its decision on the reported content and determine whether it wishes to change that decision. The ASP’s or CASP’s decision at this stage is final. If a user wishes to further challenge the decision of the ASP or CASP, the user will have to report the matter to the Malaysian Communications and Multimedia Commission.
For a flowchart illustration of the online harmful content reporting mechanism, please see below:
[Flowchart: online harmful content reporting mechanism]
Assessing an Online Harmful Content Report
As elaborated in the earlier section of this article, assessing whether a piece of content is priority harmful content or harmful content is not an entirely straightforward exercise. While the Bill sets out the categories of content considered harmful, ASPs and CASPs will still have to exercise their own discretion to assess whether reported content falls within those categories. Given that ASPs and CASPs are required to act on an online harmful content report swiftly and within the timeline prescribed under the Bill, it would be crucial to have a predetermined guiding principle or internal policy document that can aid the content moderation teams of the ASPs and CASPs in determining whether content is harmful. The guiding principle or internal policy document should also set out the relevant timeframes that the content moderation teams should adhere to when dealing with an online harmful content report so that compliance with the Bill can be achieved.
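Purely by way of illustration, and on the assumption that a provider chooses to operationalise such an internal policy in software (the Bill itself does not prescribe any tooling), the triage steps described above could be encoded in a simple decision routine along the lines of the following Python sketch. The category names, function name and decision strings are hypothetical and would need to track the Bill and any subordinate instruments.

```python
# Illustrative sketch only: hypothetical category names and decision strings,
# not a statement of what the Bill requires.
PRIORITY_HARMFUL_CATEGORIES = {"child_sexual_abuse_material", "financial_fraud"}
HARMFUL_CATEGORIES = PRIORITY_HARMFUL_CATEGORIES | {
    "obscene_content",
    "indecent_content",
    "harassment_distress_fear_or_alarm",
    "incitement_to_violence_or_terrorism",
    "inducing_child_self_harm",
    "ill_will_or_hostility_among_the_public",
    "dangerous_drugs_promotion",
}

def triage_report(category: str, already_reported: bool) -> str:
    """Mirror the triage steps described in this article (illustrative only)."""
    # Step 1: dismiss reports whose subject matter is not harmful or has
    # already been the subject of another report.
    if already_reported or category not in HARMFUL_CATEGORIES:
        return "dismiss the report"
    # Step 2: priority harmful content must be made inaccessible immediately
    # for the prescribed period, then re-evaluated within that period.
    if category in PRIORITY_HARMFUL_CATEGORIES:
        return "disable access immediately for the prescribed period, then re-evaluate"
    # Other harmful content: the provider exercises its discretion, based on
    # the perceived extent of harm, on whether to disable access.
    return "assess extent of harm; optionally disable access for the prescribed period"

# Example usage
print(triage_report("financial_fraud", already_reported=False))
print(triage_report("obscene_content", already_reported=False))
```

However a provider chooses to implement its workflow, the substantive determinations and the prescribed timelines remain governed by the Bill itself.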
If you would like to know more about the Online Safety Bill 2024, you may reach out to the partners at the Technology Practice Group of Halim Hong & Quek for further enquiries. The Technology Practice Group frequently works with software and tech companies on their compliance matters, deployments, projects and regulatory affairs. The team is well equipped with the skill set and expertise to assist on your next initiative.
About the authors
Lo Khai Yi
Partner
Co-Head of Technology Practice Group
Technology, Media & Telecommunications (“TMT”), Technology Acquisition and Outsourcing, Telecommunication Licensing and Acquisition, Cybersecurity
ky.lo@hhq.com.my
Ong Johnson
Partner
Head of Technology Practice Group
Technology, Media & Telecommunications (“TMT”), Fintech, TMT Disputes, TMT Competition, Regulatory and Compliance
johnson.ong@hhq.com.my