Open Net’s contribution to a Freedom Online Coalition seminar on UNESCO Internet for Trust recommendations

Jun 22, 2023 | Free Speech, Open Blog, Press Release

The Freedom Online Coalition held an Advisory Network meeting on February 13, 2023 (EST), at which members of the Advisory Network advised the Canadian government, the Coalition's chair country, on UNESCO's recent "Internet for Trust" recommendations.

K.S. Park, Director of Open Net, an organizational member of the Advisory Network, spoke to an audience of about 30 government representatives and AN members as follows:

UNESCO's recent recommendations seem to condone and encourage not only NetzDG-style mandatory notice and takedown but also administrative censorship, whereby administrative bodies can order platforms to take down content. This is not surprising, given that UNESCO skipped Asia in its regional consultations, when Asia, and especially Southeast Asia, is the region that suffers most from mandatory notice and takedown and administrative censorship. Viet Nam, Myanmar, and Indonesia were inspired by NetzDG to institute mandatory notice and takedown, but added administrative agencies as the notice givers.

Notice and takedown was originally part of a safe harbor granted to intermediaries, designed to protect them from liability for user content they are not aware of: as long as a platform, upon receiving notice of illegal content, immediately takes it down, the platform will not be liable for that content or for any other content it was not aware of. Mandatory notice and takedown, by contrast, is a rule under which a platform WILL BE liable if it fails to take down content upon notice. The former is liability-exempting; the latter is liability-imposing.

Technically, mandatory notice and takedown does not directly violate the intermediary liability safe harbor, since it imposes liability only for content of which the platform has been notified. The problem is that, even if the takedown obligation is imposed only on illegal content (for instance, NetzDG applies only to information mediating the acts condemned in the German Criminal Code), it incentivizes platforms to err on the side of taking down rather than retaining the noticed content, and they end up taking down lawful content because of an asymmetry in incentives: a platform can be held liable for carrying content but is hardly ever held liable for not carrying it.

Many argue that, as long as content is illegal and has been noticed, there is nothing wrong with holding platforms jointly liable for intentionally refusing to take it down. However, general tort liability or criminal accessory principles are sufficient to force platforms to act. What the added statutory imposition of intermediary liability does is strengthen the asymmetric incentive to err on the side of taking down. Intermediary liability originally arises from the harm or culpability of the content and the intermediary's role in allowing that harm or culpability to materialize.

A mandatory notice-and-takedown law adds another layer of liability arising from the intermediary's own act of refusing to take down. For instance, without such a law, an intermediary might be held jointly liable, as a contributory infringer, for modest damages for allowing the posting of a simple photo of marginal economic consequence. With the law, the intermediary will additionally be held liable, in unpredictable amounts and severity, for the failure to take down. What is more, mandatory notice-and-takedown laws do not distinguish between knowledge of the existence of content and knowledge of its illegality. General tort liability usually requires knowledge of illegality; the mandatory notice-and-takedown system does not.

Administrative censorship causes a "chilling effect" on free speech. The state does have sovereign power to favor certain policies and execute them, but it must remain neutral toward the people's discourse on those policies. Administrative censorship opens up new breadths of censorship. Moreover, administrative censorship as actually practiced comes with additional problems on top of the theoretical problem just mentioned. The standard applied in administrative censorship is usually very broad: South Korea has had mandatory notice and takedown since 2008 under a mandate to do "what is necessary for nurturing sound communication ethics". Even where the standard is clearly defined in the text, as in NetzDG, the administrative agency's decision dominates discourse, and its discretion can be controlled only by post-publication judicial review, again ending in chilling effects.
