‘One Owns Data about Oneself’, Myth? Metaphor? Rule?: Implications for RTBF

Sep 16, 2015 | Free Speech, Open Blog

Kyung Sin (“K.S.”) Park

The Korea Communications Commission is considering adopting a right-to-be-forgotten law in the wake of the ECJ’s Google Spain decision.  We should be careful here.  The tenet that “one owns data about oneself (and therefore should have control over that data)” sounds good but is not always sustainable or compatible with respect for others’ freedom of thought and expression.  That “K.S. Park is a professor” is data about me that is already known to many.  For me, K.S. Park, to control the circulation of such data would be impossible in a free society, especially after I have introduced myself as a professor to countless people.  When, and on what grounds, can I control perfectly lawful data that resides in another’s head and that is neither defamatory nor privacy-infringing?

The slogan that ‘one owns data about oneself’ originates from the concept of “data surveillance”, a term coined by Alan Westin in his 1967 Privacy and Freedom.  The idea is that, when a person discloses data about himself or herself to governments and companies, processing that data for purposes the data subject did not contemplate, or disclosing it to agencies not thus contemplated, can constitute surveillance in the sense that the data processor learns more about the data subject than he or she intended to reveal.  For instance, the data processor can make predictions about the data subject’s future behavior.  Of course, the term ‘surveillance’ usually means the acquisition of data about another against his or her will, as in wiretapping or search and seizure.  But even voluntary disclosures of data can, if the conditions of disclosure are not adhered to, end up revealing something about a person against his or her will; hence the term “data surveillance.”

Westin, in an effort to protect people from data surveillance, proposed giving all data subjects some sort of property right in the data about them, because promises about how the data will be used are not sufficient: such promises are hard to enforce and, more importantly, powerless individuals will have a hard time extracting them from governments and companies.  With a property right, as opposed to a contractual one, the data processor has affirmative duties to obtain the data subject’s consent, whenever it takes the data, as to the purpose and scope of its use and disclosure, just as someone borrowing a car has an affirmative duty to obtain the owner’s consent about how it will be used.  Westin’s proposal has persuaded an increasing number of countries and people around the world and has manifested itself in the form of data protection laws, and against the background of this success the property metaphor has hardened into ‘owning data about oneself.’  Indeed, data protection law is a very effective tool for protecting the rights of a powerless individual who, in disclosing data to a mega data processor, does not have the acumen to bargain for or enforce the conditions of that disclosure.

However, since the concept of data ownership was devised to compensate for the data subject’s lack of bargaining power at the point of disclosure, and thereby to prevent unwanted subsequent use and disclosure of the data, it is very important that it not be applied mechanically to all data but only to data that has not been made publicly available.  Publicly available data has no point of disclosure at which the concept of data ownership needs to intervene to strengthen the data subject’s bargaining power.  The paradigmatic situation for the concept works like this: when a data subject has kept certain personal data within a zone of privacy and later transfers it out of that zone to governments and companies, the concept of data ownership kicks in to ensure that subsequent use or disclosure does not depart from the data subject’s original will, with a force that contract law alone will not provide.

This means that the concept should not be applied to personal data that has already been published to the public voluntarily and without condition.  That ‘K.S. Park is a professor’ is exactly an example of such data.  In the same vein, data lawfully compelled into disclosure (for instance, the publicly noticed data of a company owned by a data subject) belongs in the same category.  Such a definition is consistent with the common-sense view that it is no surveillance to acquire data that everyone already knows.

A closer look at the world’s data protection laws already reveals a thread of this philosophy: Australia[1], Canada[2], Singapore[3], India[4], and Belgium[5] explicitly leave ‘publicly available data’ outside the purview of their data protection laws.  The 2004 APEC Privacy Framework[6] also states that a data subject’s rights can be limited with respect to ‘publicly available data’.  In 2000, the EU and the US entered into a safe harbor agreement on the application of the 1995 EU Data Protection Directive to U.S. data processors, which likewise left out publicly available data.  Further upstream, the 1980 OECD Guidelines exclude from their application data that poses no risk of infringing privacy.

The ECJ’s Google Spain decision is problematic exactly for this reason.  The decision empowers Costeja González to remove, from the search results maintained by Google, the data processor, the location of a notice of the auction of his house.  That his auction notices are located on the website of the local newspaper La Vanguardia was publicly available data.  There was no disclosure by Costeja González whose conditions a property right had to enforce for the protection of his privacy, because the information was already publicly available.  The corollary of this position is that including publicly available webpages in search results should not be subject to data protection law.

If we as data subjects can control even publicly available data about ourselves, then data protection law, which aims to prevent surveillance, will instead bring about not only censorship of our fellow citizens but also surveillance of them, because we will need to watch what data they are acquiring so that we can intervene whenever we want.  We wanted to protect privacy through data protection law.  We should stop talking about data ownership and start talking about privacy.

[1] http://www.alrc.gov.au/publications/2.%20Privacy%20Regulation%20in%20Australia/state-and-territory-regulation-privacy

[2] Personal Information Protection and Electronic Documents Act (S.C. 2000, c. 5), Section 7, http://laws.justice.gc.ca/eng/acts/P-8.6/page-3.html#h-6; regulation defining “publicly available information”: http://laws.justice.gc.ca/eng/regulations/SOR-2001-7/page-1.html

[3] Graham Greenleaf, “Private sector uses of ‘public domain’ personal data in Asia: What’s public may still be private” (2014) 127 Privacy Laws & Business International Report, 13-15

[4] Greenleaf, Id.

[5] Loi du 8 décembre 1992, art. 3, § 2

[6] http://www.apec.org/Groups/Committee-on-Trade-and-Investment/~/media/Files/Groups/ECSG/05_ecsg_privacyframewk.ashx
