Replacing Big Techs with Big Brother: IIC Webinar 2/3/2021

Presentation Points of Kyung Sin Park:

  • Who or what does safe harbour protect, and why? And does this remain valid given the impacts of intermediaries in recent years?

Safe harbors protect all digital intermediaries who provide service at the request of consumers; that is the definition in the E-Commerce Directive of 2000. They include telcos (conduits), web hosts, platforms, etc., who provide virtual space for people to communicate with one another. Both the E-Commerce Directive and the American laws (Communications Decency Act Section 230, Digital Millennium Copyright Act Section 512) protect intermediaries from liability for content they are not aware of. The reason is that if they were held liable for unknown content, they would check all content before or after it is uploaded. Whatever appears online would then be the result of the editorial decisions of the platform operators, which would render moot the civilizational significance of the internet: that each individual can become an agent of mass communication, unlike TV or newspapers, which tend to be influenced by political forces or advertisers.

  • There are arguments for and against safe harbour for intermediaries. Why is there so much misunderstanding around safe harbour?

I talked mostly about the arguments for. The arguments against mainly concern CDA 230, which seems to protect intermediaries even for content they know to be illegal: defamatory content, revenge pornography, and other horrible content that some platforms have refused to erase. However, safe harbors are misunderstood here. There are CDA 230 cases holding that where platforms engage in editorial functions (e.g., promoting certain illegal content), they are deemed publishers. Knowledge can be imputed, and knowing facilitation does attract liability. Of course, CDA 230 is a Good Samaritan law and therefore immunizes platforms from the disadvantages of taking DOWN content, meaning that their past takedowns will not be deemed evidence of their knowledge of, or capacity for, taking down future illegal content. However, editing UP content (i.e., promoting it) has no such immunizing function, and knowing facilitation does impose liability on platforms.

FOSTA, the most recent amendment to CDA 230, adds explicit liability for knowing facilitation of sex trafficking. What is wrong with codifying a judicial interpretation? EFF argues that service providers would be required to proactively take action against sex trafficking activities and would need a “team of lawyers” to evaluate all possible scenarios (which may be financially unfeasible for smaller companies), or the law would result in over-inclusive takedowns.

  • What should an intermediary be responsible for, and what should they not be responsible for? How broad or narrow should safe harbour be? Where – but more importantly, how – do you draw the line?

I believe that they should be held responsible for what they know and should not be held responsible for what they are not aware of. Most importantly, they should not be held liable for content whose existence they know of but whose illegality they do not. This is the line of attack on Germany’s Network Enforcement Act (NetzDG), Australia’s Abhorrent Violent Material Act, and France’s now-struck-down Avia law. All three laws impose liability for failure to take down illegal content, but platforms, often having to make the decision within 24 hours, will end up taking down much lawful content. Korea’s provisional takedown law, which also requires on-demand takedown, has resulted in the takedown of much lawful content. There must be a clear statement that knowledge must cover not just the existence of the content but also its illegality. This is consistent with traditional accomplice liability, whereby a person who did not engage in the illegal act is considered an accomplice only when he or she has knowledge of what he or she is aiding and abetting.

  • Should the level of control exercised by an intermediary over content be a factor? Should an intermediary be truly “neutral” to avail itself of safe harbour, or does that just create more problems?

This comes from people’s recent critique of Facebook’s deplatforming of Trump’s account in relation to the Capitol Hill attack in January 2021. People were wondering whether Facebook should be neutral and not exercise such control over what people see.

Well, I think that people have short memories. In June 2020, Facebook was criticized for not taking down Trump’s post saying ‘when the looting starts, the shooting starts’. What is the difference?

Yes, deplatforming is a broader remedy than taking down specific content, but Facebook should be allowed to decide not to become part of the discourse. Twitter should be allowed to shape its own business model, family-friendly or otherwise. These platforms should be allowed to compete.

Another aspect is that we may invite government censorship. Merkel said we should not allow the tech companies to make these decisions. Eric Barendt also recommended public censorship. But can democratically elected bodies really be neutral? Or will they serve majoritarian desires? Remember the American FCC’s fairness doctrine in broadcasting, which was later dropped because the FCC was abused to serve successive regimes. Administrative censorship also causes the legal problem of the “collateral bar”, which is really a double liability problem: platforms that do not remove content will be liable not just for contributing to the dissemination of unlawful content but also for not following government orders. This will cause over-inclusive censorship on platforms. Korea had exactly that experience: the takedown by the KCSC (Korea Communications Standards Commission) of posts by newspaper boycotters who had organized themselves over the internet.

The internet has so much content that public review is not SCALABLE; what happens instead is a self-selection of the most rabidly censorial people, who find salvation for their professed mission of sanitizing the communication ecosystem. Nine people, given public positions to pass judgment on content, will also tend to say the most conservative thing about it. For instance, the KCSC took down images of explicit male body parts that were completely legal under the law.

Both the call for neutral platforms and the call for public control would replace Big Tech with Big Brother, which is the reason why France’s Avia law was struck down as unconstitutional. What is more, compliance costs will actually strengthen the oligopoly, keeping the incumbent Big Techs on top.

Q: What are your thoughts on the increasing tendency to dilute safe harbour for intermediaries via regulatory interventions, and what policy approaches are needed to ensure a continuous flow of innovation and investment?

Kyung-Sin Park

Interventions come from two angles. One is liability-imposing rules like those of Germany, France, and Australia; I talked about the over-inclusion problem of this angle. The other is the creation of a government content moderation body; again, I talked about the neutrality problem and the double liability problem in the presentation.

Q:

Professor Park, WHO should check the content? Somebody has to. Even if just to alert the platforms, yes? So, WHO should be doing the checking?

Kyung-Sin Park

I think that, in the end, courts are the best deliberators. Many point out the scalability problem, but look at how Google responded to just one seminal case, Costeja, on the right to be forgotten: it works!

Q:

The follow-up question is of course HOW does a third party – say, a regulator – check the phenomenal amount of content? (If you don’t want the platforms responsible for the checking.)

Kyung-Sin Park

Exactly. That is what I wanted to talk about but couldn’t because of time. There is a scalability problem. When I was a commissioner of the KCSC, we deliberated upon 2,000 URLs in 15 minutes. Also, there is so much content online that monitoring just doesn’t work, so regulators have to rely on complaints; the most censorial part of the population submits those complaints, and the regulator ends up providing a battlefield for their moral crusade.
