New Technologies and the UN: How Not to Reinvent the Wheel

Jun 14, 2021 | Free Speech, Open Blog, Privacy

KS Park spoke on May 27, 2021, at the 13th Asia Human Rights Forum as follows:

 

Human Rights Council resolution 41/11 on new and emerging digital technologies and human rights mandated the Advisory Committee to prepare a report on how the UN can address human rights concerns.

 

Throughout the report, the term “new technologies” is used to refer to technological innovations that transform the boundaries between virtual, physical and biological spaces. They include new technologies and techniques for datafication (the process of transforming subjects, objects and practices into digital data), data distribution and automated decision-making, such as artificial intelligence, the Internet of things, blockchain technology and cloud computing, among others (para 3).

 

A common feature of new technologies is that they enable and accelerate the synchronization of offline and online spaces. A technical term for this process is the “physical-digital-physical loop”, which refers to the flow of data from the real world to the Internet and then back again into the real world (para 6).

New technologies have great potential to support the exercise of individual rights and freedoms. First, their augmented communicative power significantly expands users’ capacity to communicate and share ideas globally, contributing to the realization of human rights and fundamental freedoms. Second, new technologies can empower individuals by directly augmenting their capabilities in the real world (para 10).

While new technologies have great potential to contribute to the protection and promotion of human rights, they also pose significant challenges to human rights as follows:

Datafication resulting in a loss of privacy and the need to protect personal data 

Cybersecurity and integrity

Quality and authenticity of information

Radicalization, segregation and discrimination 

Disempowerment and inequality

Mass surveillance and overreaching Internet regulation

Cyberviolence

Various efforts are under way to address these concerns, but the following operational gaps are hindering them.

The first type of gap is conceptual. Simply put, new technologies are creating a fundamentally different world that does not line up exactly with our traditional paradigms. Thus, it is essential to ask how human rights treaties, documents and practices could be better adapted to the digital age (para. 49).

Another conceptual gap is that researchers and policymakers disproportionately prioritize certain technological systems or focus narrowly on certain harms. The European Union noted this gap, arguing that some issues garner attention while others are relatively neglected. For example, the impact of new technologies on freedom of expression, online hate speech, disinformation and privacy is well addressed, whereas problems such as disempowerment and inequality remain under-researched (para. 51).

Another operational gap is the growing disparity between the complexity of the human rights issues created by new technologies and the resources available to the human rights mechanisms, which are being asked to do more with less. Human rights defenders, too, need to keep pace with technological change in their advocacy efforts (para. 53).

The private sector’s growing prominence and its role in protecting human rights constitute another crucial gap. Many new technologies, including artificial intelligence, virtual reality and blockchain technology, do not affect people’s lives in isolation but do so by functioning as integral parts of business models (para. 56).

A final issue that potentially complicates the protection of human rights is that new technologies are economically and strategically necessary. For example, artificial intelligence has clear military potential in pattern recognition and weapon targeting and is already used in various security applications. Likewise, the private sector estimates that artificial intelligence could generate between $77 billion and $3.9 trillion in revenue by 2023. This means that attempts to integrate the human rights approach into technological development may face pushback when security or profitability is threatened. Competitive pressures may also discourage businesses from subjecting their business models to human rights scrutiny (para. 57).

As an effort to overcome these gaps, the Advisory Committee's report recommends a holistic approach centered on a newly established, UN-led human rights impact assessment.

I am supposed to comment on the methodologies of human rights impact assessment addressing these concerns.

I first want to remind everyone that technology assessment has been with us for more than half a century.

 

Technology assessment came into prominence with the US Office of Technology Assessment (OTA). Congressional hearings beginning in 1969 led to Public Law 92-484 in 1972, which established the OTA; its first assessment was produced in 1974. The OTA's funding was cut off in 1995 by a Republican-led Congress. See Coates et al. (2001) for a succinct history of the evolution of the Future-oriented Technology Analyses (FTA) community, including notes about the OTA. See http://www.princeton.edu/~ota/ for an electronic archive of the hundreds of technology assessments generated by the OTA, plus good historical notes. Since the early 1990s, TA has become firmly established in Europe at various levels, including at the EU level (http://www.europarl.europa.eu/stoa/default_en.htm) and in most member nations; for a listing, see http://www.eptanetwork.org/EPTA/.

Joe Coates (1971) offered an insightful illustration of how difficult it is to pin down indirect effects. He nominally tracks the first- through sixth-order effects of introducing television into communities: the first-order effect of a riveting source of entertainment in the home leads to less mingling in local clubs; that leads to less neighborly interaction; isolation follows; then overdependence on spouses to meet one's psychological needs; and finally, divorce rates escalate.

Such technology assessment is conducted in Korea as well, and I have participated in assessments that include evaluation from human rights angles.

 

Therefore, in order to create a new institution of technology assessment without reinventing the wheel, we need to justify why new and emerging technologies require a different set of technology assessments.

 

When we look at new and emerging technologies, they differ from other technologies precisely because they are more aligned with the promotion of human rights, as we saw in the role of the internet in the Jasmine Revolution. New technologies may be so successful because they follow the grain of expanding human rights. Any human rights impact assessment of new technologies must take this into account and find ways to amplify their liberalizing and equalizing potential.

 

Also, the UN has already launched the annual Internet Governance Forum as a way to give various stakeholders (technical communities, human rights defenders, businesses and governments) a voice in addressing these concerns. We should think about ways to incorporate the conversations there into the new human rights impact assessments.

 
