RightsCon: Building an alliance between CSOs and Big Tech in defense of democracy


Open Net co-hosted a RightsCon session on June 8, 2023, with Google APAC on the need for, and methodology of, an alliance between Big Tech platforms and civil society organizations to counter misinformation and disinformation designed to hamper democracy and shrink civic spaces.

  1. Background: NetzDG, passed in 2017, first established that failure to take down content on demand translates directly into liability: it is not the harm of the content that drives liability but the failure to take down on demand. Intermediary liability is ordinarily accessory liability, so failure to take down on demand did not by itself create liability; it required further analysis of the harm of the content and the intermediary's knowledge of that harm. Accessory liability, unlike a main actor's liability, requires knowledge of the harmfulness of the content: if you run over someone with a car, the person who supplied the car is liable only if they knew of that possibility, while you, the driver, are liable even without such knowledge. NetzDG changed all of that: a platform operator can be liable immediately for failure to take down on demand, regardless of its knowledge of the content's harm. This generates many false positives, since platforms would rather err on the side of deleting. We should remember that notice-and-takedown was originally invented as a safe harbor, a liability exemption, not a mandatory rule imposing liability.
  2. Butterfly effects in Southeast Asia: A few years later, Indonesia, Singapore, Thailand, Myanmar, and Viet Nam passed, or are passing, laws (mal)adapting NetzDG to create non-judicial online censorship, whereby ICT ministries are repurposed to issue takedown orders to platforms, but with more invasive elements: the time windows are very short (within 4 hours in Indonesia, within 3 hours in Viet Nam), and the takedown standards are either very vague or clearly overbroad: "prohibited content," "against the State," etc. The effect of vagueness can be extrapolated from the experience of Korea, where websites such as womenonweb.org are blocked by the religious-right-dominated censorship regime under the standard "necessary for nurturing sound communication ethics." These maladaptations intensify platforms' tendency to err on the side of deleting rather than retaining postings, causing more false positives. In Indonesia, information on election candidates is disappearing, and many platforms receive 24-hour takedown notices.
  3. Need for a tech-CSO alliance: We, the civil society organizations in Southeast Asia, will continue to challenge Big Tech platforms not to take down such content. There is room for fighting: although knowledge of harm is no longer required, we can still read the statutory text. At least on the text, liability applies only to content that meets the statutory standard, however vague, and we can argue that particular content does not meet that standard. However, there is a scalability issue: too much content is taken down without civil society's awareness. Tech platforms and CSOs need to work together to agree on scalable community guidelines nuanced to the region, which would guide the many human moderators operating there. In Indonesia, an election will be held in February 2024. Because the campaign period is limited to 75 days, many Internet users rely on the Internet to search for information about the candidates and the political parties. One platform widely used by Indonesian users as a source of information on candidates is Wikipedia, which recently received a takedown request from the government to remove some content, as well as another request from a political figure who wanted information removed from his page. This is alarming: state censorship is no longer only about censorship itself but is becoming control of information, and a sign that people can no longer freely access information. Under the government's rules, any electronic service provider, or simply put, a digital platform, must obey a takedown order within 24 hours, or within four hours in urgent situations. Wikipedia will not be the only platform receiving such requests. Although CSOs lost their challenge to the law, they also challenged the implementing regulation in Indonesia's administrative court system and have at the moment reached a settlement with other partners. Indonesian CSOs need more tech platforms to join this coalition to protect the integrity of the next election in Indonesia. In the end, civil society and the tech platforms together can defend people's digital rights.
  4. UNESCO guideline: Many CSOs in the region united to oppose UNESCO's guideline for platform regulation, because it condones the aforesaid trend of administrative censorship. The guideline specifically chose state regulation as the solution to the documented harms taking place on the internet. Why did UNESCO do that? The UNESCO representative yesterday said, "because their CEO said so at a certain event." That was all the explanation given. All the governments of the world of course want more regulation; they are held back only by the people they serve. CSOs representing the people need to act more strongly to oppose what will be a NetzDG 2.0 in its effect of justifying online administrative censorship in the region. UNESCO says, "your authoritarian governments will be doing it anyway." That is not a reason to condone and encourage them by presenting a guideline that can justify what they are doing. According to Nighat Dad, UNESCO is leaving behind a part of the world that will be adversely impacted by the guideline.
  5. Direct ramifications for democracy: The third and more important maladaptation of NetzDG is that the government monopolizes the notices that trigger liability for failure to take down. Private individuals' notices do not trigger such liability, and therefore private individuals cannot fight back against state-backed disinformation or hate speech campaigns. This unequal playing field is fertile soil for digital authoritarianism. Indeed, after ultra-right and authoritarian governments learned how to use the internet, post-Trump to be exact, there is content that needs to be taken down, especially state actors' or the ruling majority's content spreading disinformation and hate-ridden speech. This includes state-backed fake hashtags that trigger de-amplification or takedown of democratic content.
  6. Need for moderation as the second reason for a tech-CSO alliance: CSOs need to work with Big Tech platforms to strengthen their ability to moderate anti-democratic content. But there are hurdles. A lot of disinformation and harmful information is still protected by international human rights law. International human rights law does require the prohibition of hate speech, but much harmful speech does not rise to the level of hate speech: even a list of names, depending on who says it to whom, can cause harms that are not visible within the four corners of the message. Platforms are squeezed on one side by global organizations demanding human rights compliance and on the other by local CSOs demanding moderation of state or majority actors' content. The idea that platforms must be impartial by some standard is paralyzing their ability to moderate state- or majority-sponsored content harmful to democracy and human rights. No, platforms should be able to innovate in choosing their political leanings, e.g., pro-human-rights or pro-democracy; it is not impartiality that earned them the intermediary liability safe harbor. CSOs need to challenge platforms to take a proactive stance. Authoritarian governments may respond by attacking the platforms for moderating state-sponsored content, as with Poland's Freedom of Speech Council. This means that we need a stronger tech-CSO alliance.
  7. Best practices of tech-CSO alliance: In Pakistan, under the 2016 cybercrime law allowing the government to make social media rules (the "citizen protection from online harms regulation," still not in force because barred in court), including takedown orders within 24/48 hours or 7 days, tech platforms and CSOs joined forces in pushing back on content moderation demands. In Indonesia, three national coalitions were formed, on elections, digital democracy, and content moderation, in order to engage platforms. Ten organizations were identified as trusted partners, but that number is not enough to cover the huge territory of Indonesia. Civic tech is another option for obtaining connectivity. Platforms should share new bills with civil society for analysis and action. Also, the Southeast Asian Collaborative Policy Network was formed to engage more sharply with tech platforms. KS discussed the now-defunct Twitter Trust and Safety Council, marked by the following features:
     - Selection of consultants (about 40 organizations and 70 people, casting a wide net)
     - Ongoing periodic updates on changes in community guidelines
     - Cycle: changes – (experiment) – T&S consultation – public comment – (results analysis) – feedback – (start again?)
     - Crisis response hotline (3/27/20) and experiment (5/5/20) shared only with T&S members
     - Members listed publicly and updated regularly
     - In-person annual conference where more private "products" can be shared

*Audience: about 40 in person plus 40 remote; of the 40 in person, about half were female.
