Dangers of notice-and-takedown as liability, not a safe harbor, in Indonesia

Jul 26, 2020 | Free Speech, Open Blog

* I was invited to speak at an event on Safe Harbour Policies held by the Indonesian E-Commerce Association back on August 30, 2018. This is a write-up of my speech there.

 

The Internet’s value to people relies on the intermediary liability safe harbor, an international human rights rule that intermediaries shall not be held liable for content they do not know about. Without it, intermediaries would engage in prior censorship or “general monitoring,” leaving online only the content they explicitly or implicitly approve, and turning the online space into a gated community controlled by gatekeepers. In this setting, it may look innocuous to hold intermediaries liable for content they DO know about, or are notified of by private parties or government bodies. However, if such liability is not limited to content whose illegality, and not merely existence, they know of, it will create many false positives, because intermediaries will err on the side of deleting regardless of the content’s ultimate legality.  The international standard is a liability-exempting rule that affirmatively shields intermediaries from liability for unknown content, and its purpose of protecting the online space will be frustrated by a liability-imposing rule that, on its face, imposes liability for content intermediaries merely know exists.  If such a liability-imposing rule is to exist at all, it should be limited to content whose existence AND illegality the intermediary both knows.

On December 30, 2016, the Ministry of Communication and Informatics released Circular Letter 5 of 2016 on the Limitations and Responsibilities of Trade Platform Providers and Merchants through Electronic Commerce Systems in the Form of User-Generated Content, its first attempt at an intermediary safe harbor.[1]  The ministry dubbed the circular Indonesia’s safe harbor policy for e-commerce platforms, so it does not apply to all intermediaries but only to those mediating the sale of goods and services. The ministry intends to follow the circular with a Ministerial Regulation on Safe Harbor Policy for User Generated Content, which is expected to be more detailed than the December circular.  That regulation has not yet been released; prior to its release, the Ministry of Communication and Informatics has indicated it will hold public consultations to collect input.

The circular aims to clearly distinguish the roles and responsibilities of e-commerce platforms, users, and all other parties in the e-commerce ecosystem. It seeks to establish safety and reporting protocols for e-commerce platforms, and to define restricted content for both users and platforms, which includes (but is not limited to) the following:

- negative content (eg pornography, gambling, violence, and goods/services deemed illegal by other legislation);
- intimidating content (eg goods/services depicting gore, blood, horrific accidents, and torture);
- violation of intellectual property rights;
- hacking and illegal access to electronic systems;
- provision of and/or access to drugs, addictive substances, and hallucinogenic substances;
- illegal weapons;
- trafficking of people and organs;
- protected flora and fauna.[2]

The circular obligates e-commerce platforms to include a mechanism that allows users to report illegal goods and services they discover. When a platform is alerted to illegal goods or services, it must take them down within one, seven, or fourteen days, depending on the severity of the content. Content deemed harmful to national security or human health must be taken down within one day, pornography within seven days, and goods/services that infringe intellectual property rights within fourteen days.[3]
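The tiered deadlines can be pictured as a simple lookup. The following is only an illustrative sketch; the category labels are my own, not terms defined in the circular:

```python
from datetime import date, timedelta

# Hypothetical category labels standing in for the circular's prose
# definitions of "urgent" Prohibited Content.
URGENT = {"national_security", "health_hazard", "human_trafficking", "terrorism"}


def takedown_deadline(category: str, reported_on: date) -> date:
    """Latest removal date under the circular's tiers: 1 calendar day for
    urgent content, 14 for IP infringement, 7 for other prohibited content."""
    if category in URGENT:
        days = 1
    elif category == "ip_infringement":
        days = 14
    else:
        days = 7
    return reported_on + timedelta(days=days)


print(takedown_deadline("terrorism", date(2018, 8, 30)))        # urgent tier
print(takedown_deadline("pornography", date(2018, 8, 30)))      # general tier
print(takedown_deadline("ip_infringement", date(2018, 8, 30)))  # IP tier
```

The sketch makes the structural point visible: the rule turns entirely on how a report is categorized, which is exactly the judgment the circular delegates to the platform.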

E-commerce platform providers are also obligated under the circular to provide clear information on which items are illegal and forbidden from being traded on their systems.[4]  Users and merchants must enter into an agreement (accepting terms and conditions) with the platform provider before using its services.  Concerns about overuse or criminalization of the Circular Letter’s definitions of illegal content appear to be somewhat limited, since it sets no sanctions or penalties for noncompliance, and e-commerce platform providers are not obligated to report illegal activities to law enforcement. However, illegal activities by merchants and/or users will very likely be dealt with through the criminal sanctions detailed in the Information and Electronic Transactions Law.[5]

One anomaly is that intermediaries are obliged to actively evaluate and monitor merchants’ activities on their platforms.  It is not clear whether this obligation will require providers to establish advanced content-filtering systems on their platforms.[6]

That said, the Indonesian ‘safe harbor’ also suffers from confusion over the distinction between a liability-imposing rule and a liability-exempting rule.  Section V of the Circular Letter, titled Limitation and Responsibility of Platform Providers or Electronic System Providers and Merchants in Trade Through Electronic Systems (Electronic Commerce) in the Form of User Generated Content, states as follows:

C. Obligations and Responsibilities of UGC Platform Providers

1. The obligations of the UGC Platform Provider include:

a. Presenting the terms and conditions for using the UGC Platform, which at a minimum contain the following:

1) the obligations and rights of Merchants or Users in using the UGC Platform services;

2) the obligations and rights of the Platform Provider in carrying out the UGC Platform business activities;

3) provisions regarding accountability for uploaded content.

b. Providing Reporting Facilities that can be used to submit complaints regarding Prohibited Content on the UGC Platform it manages, collecting at least the following information:

1) specific links leading to the Prohibited Content;

2) the reasons/basis for the report of Prohibited Content;

3) supporting evidence for the report, such as screenshots, statements, brand certificates, or a power of attorney.

c. Acting on complaints or reports on content, including:

1) examining the truth of the report and asking the reporter to complete the requirements and/or include other additional information related to the complaint and/or report where needed;

2) taking action to remove and/or block the Prohibited Content;

3) notifying the Merchant that the uploaded content is Prohibited Content;

4) providing a means for Merchants to argue that the content they uploaded is not Prohibited Content;

5) rejecting the complaint and/or report if the reported content is not Prohibited Content.

d. Observing the period for removal and/or blocking of reported Prohibited Content:

1) For urgent Prohibited Content, no later than 1 (one) calendar day from receipt of the report by the UGC Platform Provider. Urgent Prohibited Content includes, but is not limited to:

i) products or services that are harmful to health;

ii) products/services that threaten state security;

iii) trafficking of humans and/or human organs;

iv) terrorism; and/or

v) other content determined by laws and regulations.

2) For Prohibited Content listed in Roman Numeral V Letter B other than urgent Prohibited Content, no later than 7 (seven) calendar days from receipt of the report by the UGC Platform Provider;

3) For Prohibited Content listed in Roman Numeral V Letter B number 1 letter e, that is, content related to goods and/or services that violates intellectual property rights, no later than 14 (fourteen) calendar days from receipt of the complaint and/or report, with required supporting evidence, by the UGC Platform Provider.

e. Evaluating and/or actively monitoring the activities of Merchants on the UGC Platform.

f. Complying with other obligations established under the provisions of the legislation.

2. The responsibilities of the UGC Platform Provider include:

a. being responsible for operating electronic systems and content on a platform that is reliable, safe, and responsible.

b. The provision in letter (a) above does not apply if there is an error and/or negligence by the merchant or platform user.[7]

The impact of Paragraph V.C.2 of the Circular Letter is not clear.  Paragraph V.C.1 states the ‘obligations’ of the platform provider, while Paragraph V.C.2 states its ‘responsibilities,’ which seem to refer to the technical reliability of the platform.  If so, the notice-and-takedown regime in Paragraph V.C.1, standing alone, reads as liability-imposing: platform operators must comply with the notice-and-takedown process, and their failure to remove prohibited content upon notification will directly translate into liability.  This is evinced by the requirement that the platform ‘conduct examination of the truth of the complaint. . . and reject the complaint if it finds the content not prohibited.’[8]  The circular requires the platform provider to get it right. Again, the risk of a liability-imposing regime is that much lawful content will be taken down due to intermediaries’ tendency to err on the side of deleting.

 

[1] See Indonesian Ministry of Communication, Circular Letter 5 of 2016 on the Limitations and Responsibilities of Trade Platform Providers and Merchants through Electronic Commerce Systems in the Form of User-Generated Content (2016) <https://jdih.kominfo.go.id/produk_hukum/unduh/id/558/t/surat+edaran+menteri++komunikasi+dan+informatika+nomor+5+tahun+2016+tanggal+30+desember+2016> (Indonesian only) (hereinafter Circular Letter).

[2] ibid

[3] ibid

[4] ibid

[5] See Law no 11 of 2008, Information and Electronic Transactions Law.

[6] See Kristo Molina, ‘Indonesia Implements a Safe Harbor Policy for E-Commerce (Marketplace) Platforms’ (White and Case Blog, 13 March 2017) <https://www.whitecase.com/publications/alert/indonesia-implements-safe-harbor-policy-e-commerce-marketplace-platforms>.

[7] Circular Letter § V.C.1-2 (emphasis added).

[8] ibid § V.C.1.c(1-5).
