Datafication of Body considered – Berlin IGF 2019

by | Feb 12, 2020 | Open Blog, Privacy

*This is a summary of Kyung Sin Park's thoughts and speech at the meeting of the Dynamic Coalition on Gender and Internet Governance.

  Dr. Anja Kovacs from the Internet Democracy Project presented on how the meaning of the body is expanding and being reconfigured with respect to data, and on the fluidity between the two in the current digital age.  Data has become an extension of the body.  To summarize a few points among many other great ones:  The amount of data about you extracted from your portable devices has increased.  Bodies have been turned into data points.  Data has become more important than bodies themselves.  For instance, what your smartphone tells you about how you feel in the morning has become more important than how you actually feel.  Rapes take place not on bodies but through digital images (although the speaker talked only about revenge porn, I guess deep fakes are another way of committing rape in the imagination – KS).  Also, facial recognition threatens one's identity when the government refuses to recognize the real face in front of it just because the computer fails to recognize it.  We need to push back against the datafication of bodies, and we need to rethink the consent-based data protection scheme because, if your data is your body, then no amount of consent allows others to take it away from you (just as regulations prohibit the sale of body parts – KS)!

  I think that this is too apocalyptic a picture. For instance, we have done facial recognition in the past.  What has changed is the number of faces one person or company can recognize.

  What is more dangerous is that the concept of data as an extension of the body risks bringing back the idea of data ownership, the idea that people own data about themselves, when data is actually a relation between things and the minds perceiving them. The mantra that one can own data about oneself is only a metaphor, and data protection rules should not be enforced to the letter.

  The bigger issue is whether or not to include more people in databases in order to make data-based AI fairer.  Do we want to make Amazon's hiring software fairer by collecting more data on women's careers?  If police facial-recognition AI is failing to recognize racial minorities on the street, do we want to collect more data on them?

This has a lot to do with CCTVs.  After the gang rape in Delhi, CCTVs have proliferated across the country, raising concerns.  However, CCTVs can be used for good purposes, and the data collected through them can be used to train AI better.

  The right to be forgotten and overzealous intermediary liability (as in the case of the WOMAD investigation in Korea) will further erode the integrity of the data that AI can rely on in making decisions.

  Also, the consent regime suffers from a double risk: it lets people give away too much personal data, and critiques of it tend to strengthen the libertarian concept of data ownership. It is impossible to own data about yourself the way you own your body.  Social relations (and socialism, and other communitarian efforts) require a more commons-based regulation and enabling of data.

