Right to be forgotten workshop at IGF 2016, Guadalajara

Dec 20, 2016 | Free Speech, Open Blog, Privacy

Day 3 (Dec 8th) – WS28: The ‘Right to Be Forgotten’ and Privatized Adjudication

Session Title: The ‘Right to Be Forgotten’ and Privatized Adjudication
Date: December 8th
Time: 3pm to 4:30pm
Session Organizers: Daphne Keller (Center for Internet and Society at Stanford) and Jeremy Malcolm (Electronic Frontier Foundation)
Chair/Moderator: Daphne Keller
Rapporteur/Notetaker: Luiz Fernando Marrey Moncau
List of Speakers and their institutional affiliations:
Luiz Fernando Marrey Moncau, Intermediary Liability Fellow, The Center for Internet and Society at Stanford Law School

Daphne Keller, Director of Intermediary Liability, Stanford Law School Center for Internet and Society

Lina Ornelas, Head of Public Policy and Government Affairs for Mexico, Central America and The Caribbean, Google

KS Park, Co-founder of Open Net Korea

Christian Borggreen, Director of International Policy, CCIA

Jeremy Malcolm, EFF

Cedric Laurent, Executive Director, SonTusDatos

Key Issues raised (1 sentence per issue):
The Right to be Forgotten is administrative censorship, not privatized adjudication.
There is a great need to better define what we call the “Right to be Forgotten”, as many different things are being treated under that label.
If there is a right to be forgotten or delisted, it is possible to build a more balanced approach for intermediaries inspired by intermediary liability laws and principles (such as the Manila Principles).
If there were presentations during the session, please provide a 1-paragraph summary for each presentation:

Daphne Keller provided background on the European concept of the right to be forgotten, or the right to be ‘de-listed’, and framed the discussion as focusing on the procedures and measures platforms should follow for content removal, rather than on the debate between privacy/data protection and freedom of expression.
Kyung-Sin Park affirmed that the idea of a Right to be Forgotten is problematic in Asia, considering the historical background of many countries that faced authoritarian regimes. Park pointed out that the Right to be Forgotten can be an instrument of administrative censorship, not merely private censorship.
Lina Ornelas mentioned that much about the right to be forgotten remains undefined, making it hard for companies to decide what should be removed. She also emphasized the problems caused by the lack of notification to users or media outlets whose results are de-indexed from Google, and called attention to the jurisdictional issues raised by global removal orders (such as the CNIL case) and the emergence of businesses like the Spanish company Eliminalia.
Christian Borggreen affirmed that the “Right to Be Forgotten” is very challenging for companies, and that different countries may take different approaches and are better positioned to define how to balance freedom of expression and other rights when deciding what should be removed from the Internet. He mentioned the new European regulation (the GDPR) and noted that in Brussels the discussion has shifted toward a “Right to Erasure”. The GDPR, moreover, reaffirms the need to balance this Right to Erasure against the rights to freedom of expression and to obtain information.
Cedric Laurent mentioned that Mexico has a data protection law and that some of its rights may be interpreted as a right to cancel or oppose the processing of personal data. He analyzed two cases that took place in Mexico. In the first, a person asked the site ABCtelefonos.com, and then Google Mexico, to delete personal information such as an address. INAI (the Mexican DPA), ruling before the Costeja case, wrongly held that Google Mexico could not be considered a controller. In the second case, INAI asserted jurisdiction over Google Mexico.
Luiz Fernando Marrey Moncau affirmed that there is great confusion in Brazil about what the right to be forgotten is, noting that Brazil does not have a data protection law and that most of the cases cited abroad target traditional media (such as broadcasters) rather than search engines. On the procedural level, Moncau noted great concern about due process when no court is involved. He also emphasized the risks to journalism if the debate turns on the idea of truthful information, as truthfulness is not always possible to assert.
Jeremy Malcolm explained that treating an intermediary as a data controller was unexpected, because a search engine’s control over the data it indexes is minimal. Malcolm discussed the Right to Be Forgotten from the perspective of intermediary liability, pointing out a source of great uncertainty: if the European E-Commerce Directive does not apply to search engines in Right to Be Forgotten cases (because they are treated as data controllers rather than intermediaries), these companies can face huge penalties. Malcolm presented several of the Manila Principles that would apply to Right to Be Forgotten cases, including that: i) intermediaries should not be liable for failing to restrict lawful content; ii) content must not be required to be restricted without an order from a judicial authority; iii) penalties should be proportionate; iv) intermediaries should not decide what is legal and what is not; v) abusive or bad-faith content removal requests should be penalized; vi) the person who posted the content should be heard; and vii) intermediaries must be allowed to be, and must be, transparent about the content being removed and the reasons for removal.
Please describe the Discussions that took place during the workshop session: (3 paragraphs)
The discussion period opened with an observation from audience member Lorena Jaume that there is no right to be forgotten in Europe, but rather a right to de-listing and to restriction of data processing. She thought it was a mischaracterisation to treat this as an intermediary liability issue, because under data protection law the platform is not being treated as an intermediary but as a data controller. She nonetheless found it curious that in the Facebook v Ireland case the ECJ found that Facebook Ireland was not a data controller, because the data processing was taking place in the United States, and she asked the panel to comment on this contradiction.
Professor K S Park responded that he considered the ECJ to have wrongly found in the Costeja case that indexing data, although a transparent technical process, amounted to an act of data control.  Besides search engines, there are many other types of intermediaries that do not conduct any conscious processing of data.  These can be contrasted with organizations like hospitals and schools that control such data for their own internal purposes.  By contrast, when intermediaries such as Google automatically process personal information, the information itself is not meaningful to them.
The next discussant observed that the right to be forgotten gives people a second opportunity to correct the mistakes of the past. But it also gives platforms the power to decide what information is relevant and what is not, and this is not an appropriate judgment for a private company to make; the decision should be made by an official authority. Professor Park questioned why a government would be better placed to make this decision. Daphne Keller agreed that the word “relevant” is ambiguous: from a technical perspective it means “what the user is looking for”, but in the Google Spain case it has come to mean something very different.
Paula Vargas asked about transparency as a requirement of due process when making decisions on content removal. She explained that she has a project, with partners such as Derechos Digitales in Chile and elsewhere, that aims to map and define obstacles to greater platform transparency. Luiz Moncau responded by suggesting that transparency requirements could be defined in law, for example by allowing transparency reporting and by requiring administrative agencies to publish periodic reports. Even if platforms are not classed as intermediaries under data protection law, we can draw on intermediary liability processes to build transparency into right to be forgotten regimes.

Please describe any Participant suggestions regarding the way forward / potential next steps / key takeaways: (3 paragraphs)
Several important considerations concerning the way forward and potential next steps emerged from the panel discussion and the subsequent dialogue with the larger group in the room.
One consideration concerns the importance of rooting any “Right to Be Forgotten” or “Right to Be De-listed” laws in the existing human rights framework. Relevant aspects of that framework include not only rights to privacy/data protection and free expression/access to information, but also rights to due process. By rooting analysis and legal advocacy in existing human rights law, practice, and norms, we can draw in particular on work developed in the context of Intermediary Liability, including the Manila Principles and case law such as the Argentine Belen Rodriguez case, to identify parameters for public or private adjudication of “Right to Be Forgotten” requests.
A second take-away from the panel concerns the importance of very precise legal analysis and articulation of the underlying laws and potential remedies. In particular, the EU’s so-called “Right to Be Forgotten” concerns only limited de-listing of results in search engines, not the erasure of published content on other websites. Nonetheless, in case law, legislative proposals, and public discussion, the term has often been expanded to cover much more far-reaching erasure demands, including demands for online media sources to suppress reporting. Similarly, as we consider the proper public and accountable authority to engage on issues concerning privacy and online content erasure or de-listing, it is critical to consider the relative roles of courts as compared to administrative agencies. In regions where Intermediary Liability law is tied to detailed legislated frameworks, as in the EU, it is important to identify when considerations of human rights may shape the obligations of governments and intermediaries, regardless of the specific details of legislation or disputes about its significance. Finally, it is equally important for intermediaries themselves to clearly identify when requests properly fall under a framework other than “Right to Be Forgotten” law, in order to react appropriately to communications that may state other valid claims. Overall, the term “Right to Be Forgotten” can disguise meaningful legal distinctions, so it is critical for stakeholders to maintain precision in analyzing individual issues as they emerge.
A third take-away, reinforced through discussion with the larger group in the room, concerns the importance of transparency. The need for transparency from search engines and other intermediaries concerning removal operations is widely recognized and discussed. However, other aspects of transparency present real opportunities for advocacy and progress by government, academic, and civil society actors. Specifically, government actors themselves, including Data Protection Authorities processing “Right to Be Forgotten” requests, can and should provide public transparency about the volume of requests, the standards applied, and the trends observed. In addition, in order to support transparency by intermediaries concerning all varieties of content removal, legal clarity is imperative. When intermediaries are uncertain whether sharing specific information about such requests is permissible under national law, the disincentives to attempt transparency are strong. Other actors can help remedy this situation by working to clarify the applicable law, not only about the “Right to Be Forgotten” but also about other areas such as copyright.
