The Right to Privacy in the Digital Age

Session: OF 32

14 Nov 2018 - 09:00 to 10:00

#IGF2018, #OF32

Report


This session, moderated by Mr Tim Engelhardt, Human Rights Officer, OHCHR, featured discussions on the recent report of the UN High Commissioner for Human Rights on the right to privacy in the digital age. The report was presented to the UN Human Rights Council in September 2018, following consultations with stakeholders (including an expert workshop and numerous written submissions). It develops fundamental standards regarding the duties of states to respect and promote the right to privacy and to protect it against interference by third-party actors, as well as the responsibilities of businesses.

In the new report on the right to privacy in the digital age, the UN High Commissioner for Human Rights focused on recent trends of concern, such as the use of biometric data by governments and the private sector. Worldwide, many developments have demonstrated the risks generated by biometric data misuse and the need for better safeguards and mechanisms to secure and minimise the collection and processing of sensitive data.

Ms Emilie Serruga-Cau, Head of Public Affairs Department, Directorate for Compliance, Commission Nationale de l'Informatique et des Libertés (CNIL), presented the perspective of the French data protection authority on biometrics, in the wake of the entry into force of the EU’s General Data Protection Regulation (GDPR). For Serruga-Cau, while the processing of biometric data can generate benefits for citizens, making identification procedures safe and convenient, it can also introduce new privacy risks if adequate safeguards are not implemented. In the EU, the GDPR specifically addresses biometric data as a special category of data, and lays down a general prohibition on processing this category of data, subject to very strict exemptions.

Ms Ololade Shyllon, Centre for Human Rights, University of Pretoria, presented a recent case study from South Africa, illustrating how the use of biometrics, though initially well-intended, could result in the violation of human rights. Referring to the scandal around the company CPS, which had shared biometric information with its subsidiaries to increase its profits, Shyllon emphasised the need for stronger data protection frameworks in Africa, in particular via the incorporation of the right to data protection and privacy in the African Charter.

Mr Graham Webster, Fellow and Coordinating Editor, DigiChina, New America, presented the emerging personal information protection regime in China, in particular with regard to the use of biometrics. Chinese law contains a fairly significant, but untested, regime of standards pertaining to privacy and personal data protection. China's cybersecurity law, which went into effect in 2017, has significant provisions on data protection, providing guidelines for personal information processing by private and public actors. Cyberspace authorities have also issued a personal information security specification which, though non-binding, treats biometric data as sensitive data subject to specific protection requirements, while providing significant exemptions for public safety authorities. Webster also mentioned that, as widely reported by both journalists and human rights organisations, police organisations are currently developing a national DNA database for security purposes.

Ms Smitha Krishna Prasad, Centre for Communication Governance, National Law University, Delhi, gave a brief overview of India's biometric ID system and the challenges it raises. The Aadhaar program was intended in part to prevent corruption and ensure that only those entitled to receive subsidies could obtain them, using biometric identification. The project started in 2009, but an implementing law was passed only in 2016, without a comprehensive data protection framework to ensure the protection of sensitive data. As a result of this program, the use of biometric information increased greatly in India, for both private and public services (bank accounts, healthcare, phone numbers, education), despite the fact that it was initially meant to be voluntary and limited in scope. In September 2018, the Supreme Court upheld the program, but placed new limits and restrictions on its use.

Ms Wafa Ben-Hassine, Middle East and North Africa Policy Lead, Access Now, first referred to a recent case of the use of biometric data during elections in Iraq. A project to integrate biometric card technology was launched in preparation for these elections, but no information has been provided on how the sensitive data collected is protected and stored. Ben-Hassine also offered a number of recommendations for better governance of sensitive data: authorities should define a restricted scope for the collection of biometrics, and such programs should be voluntary and consent-based, not a default security measure. Public authorities should also design mechanisms for seeking redress for abuse and misuse, keep logs of access to sensitive data, and include all interested parties when designing new legislation.

 

By Clément Perarnaud
