Personal sovereignty: Digital trust in the algorithmic age
10 Nov 2020 19:30h - 20:30h
There is an assumed tension between privacy protection and public health in the context of the COVID-19 pandemic. The panel first examined whether addressing and combating the pandemic requires people to share location, biometric, medical, and other very sensitive data, which may result in human rights violations. The first question posed to panellists by moderator Ms Moira Patterson (Global Market Affairs & Community Engagement Director, IEEE) was how to balance citizens’ right to privacy with the protection of public health.
Patterson noted that ensuring that the conditions of online access enable and preserve our personal agency and dignity in a way that empowers people is crucial. The levels of agency and trust which are assumed in the physical world can disappear in the online world, where data can be combined and analysed in new and potentially invasive ways.
Balancing commercialisation of data and data ownership
‘There is conflict between how much data should be given to address the pandemic and how much data should be protected because we are moving into a very dangerous area of surveillance’, stated Ms Salma Abbasi (Chairperson and CEO, eWorldwide Group). She noted that two questions are crucial: how much surveillance is acceptable, and how much privacy should be given away for public safety and health concerns.
Technology and surveillance systems are constructed by private companies, and users do not understand how these companies use their data. It is therefore important to build a technical task force that really understands how data is being collected and whether the collected data is being used in an actionable and transparent manner. However, users also need to rethink how they give data to private companies. Users need to take ownership of their data: ‘We need to take back the responsibility rather than blindly saying, click, yes, take it. There needs to be ownership and recognition that our data is ours — we own our data,’ Abbasi stated. Companies, in turn, need to realise that citizens are the owners of their data.
Abbasi highlighted that in the long term, governments will increasingly need to collect more data to keep their citizens safe. Governments need to set policies to regain citizens’ trust, under which individuals are able to create their own Terms of Reference (ToR) deciding: who stores their data, who is allowed to share it, under what circumstances it will not be shared with third parties, and when it will be permanently deleted. She highlighted Estonia’s approach following the 2007 cyber-attacks as a best practice.
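The personal ToR described above could, in principle, be expressed as a machine-readable policy that software checks before any data transfer. The following is a minimal illustrative sketch only: the field names, categories, and checking logic are assumptions for the sake of example, not part of any standard or proposal discussed on the panel.

```python
from dataclasses import dataclass, field

# Illustrative sketch: a citizen's own terms for how their data may be
# handled, covering the four decisions mentioned above (storage, sharing,
# third-party exclusions, deletion). All names here are hypothetical.
@dataclass
class PersonalDataToR:
    storage_custodian: str                                  # who may store the data
    approved_sharers: set = field(default_factory=set)      # who may share it
    no_share_categories: set = field(default_factory=set)   # never given to third parties
    delete_after_days: int = 180                            # when data is permanently deleted

    def may_share(self, recipient: str, category: str) -> bool:
        """Check a proposed transfer against the owner's terms."""
        return (recipient in self.approved_sharers
                and category not in self.no_share_categories)

# Example: medical data may go to the national health agency only;
# location data is never shared with any third party.
tor = PersonalDataToR(
    storage_custodian="national-health-agency",
    approved_sharers={"national-health-agency"},
    no_share_categories={"location"},
)
print(tor.may_share("national-health-agency", "medical"))   # True
print(tor.may_share("ad-network", "medical"))               # False
print(tor.may_share("national-health-agency", "location"))  # False
```

In such a design, the policy travels with the individual rather than being buried in each company’s terms of service, which is the shift in ownership the panel describes.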
Building personal algorithmic ToRs is possible, Mr John Havens (Executive Director, The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems [A/IS]) reiterated. He defined data ownership as owning the narrative and understanding particular technologies, tools, and policies. Being able to understand the algorithmic world and have tools to communicate with it is a great opportunity that one should embrace, rather than getting angry about things one cannot change, Havens noted.
The critical role of standards in scaling solutions
Building on the relationship between humans and technology, Patterson noted that human dignity needs to be the core standard. Technology should serve people and people’s needs at a basic level. She agreed with Havens’ observation and pointed out that it is important to develop standards to help support this fundamental understanding.
Patterson also referred to the significant role that standards play in empowering people, such as digital literacy standards. To this end, the Institute of Electrical and Electronics Engineers (IEEE) recently approved a standard to help measure and create digital literacy frameworks, which will empower people with the necessary skills.
Ensuring privacy and trust in the age of algorithms
Personal information is processed and analysed in a matter of nanoseconds. According to Abbasi, through data mining and deep machine learning, our personal information is being exploited at new levels. While individuals need to rethink how they give their data to businesses, the private sector has to realise and accept that personal data belongs to the individual.
To this end, Abbasi noted that governments need to increase their collaboration with think tanks and other organisations and agencies to strengthen the negotiating power of individuals, building a more balanced and ethical understanding of how their data is used, profited from, and manipulated. She referred to EU policies as good examples.
According to Havens, companies should build processes that give data owners (their customers) data-sovereign channels for communication and problem solving. Users need the opportunity to trust, by being given tools to act with agency and dignity.
Internet Governance Forum (IGF) 2020
9 Nov 2020 09:00h - 17 Nov 2020 19:00h