[Read more session reports and updates from the 14th Internet Governance Forum]
This session was dedicated to the Dynamic Coalition on Gender and Internet Governance (DC GIG), which works on integrating gender issues into the internet governance space and bringing in the diverse perspectives of women, queer, and trans persons. The lead presenter, Ms Anja Kovacs (Internet Democracy Project), shared research findings on bodies and data. Other discussants were invited to contribute their own perspectives as members of different DCs.
Surveillance is one of the most important issues from a gender perspective, particularly its structural harms and the related debate on whether privacy is the right remedy for those harms. Data governance needs to be debated from all angles to find the right protections for human rights in the digital age.
Many participants pointed to the recent revelations of how menstruation apps collect, share, and use data. The way data is connected to bodies today ‘is leading to a fundamental reconceptualisation of what bodies are’. Data carries enormous power and is often treated as ‘the truth’, because it is ‘a layer of reflection of reality’, Kovacs said.
Yet individuals have little control over this data. Some groups are surveilled more heavily than others, revealing the disproportionate targeting of certain demographics, as in the case of Roma people.
Gender is not binary; it is a spectrum. Problems arise when data decides who people are. For instance, a person may see themselves as a woman, but airport security machines and personnel may classify that same person as a man, leading to actions such as assigning them a male officer for the security check. ‘Data is defining our bodies and who we are,’ said Kovacs.
Sexual orientation is being inferred from faces. Mr KS Park (Open Net Korea) noted that people usually cover every part of their body except their faces. If people owned their data, it could not be used by anyone else without their consent. Mr Baldeep Grewal (Universität Würzburg, Germany) added that, apart from faces, hands are also exposed, which matters when thinking about facial recognition and fingerprints. ‘Data has its own body too’, said Grewal. The right to be forgotten is important in this respect as well.
Ms Chenai Chair (Research ICT Africa) wondered whether developers see the body as a distraction from what they want to achieve for the good of society. It is important to work with developers and highlight potential biases, especially in facial recognition.
Privacy is not always clearly defined. What matters is that individuals retain control over their own boundaries. Gender identity, Kovacs said, is something we must be able to develop and express freely, without the influence of outside voices.
The audience agreed that the narratives of security, innovation, and digital human rights are all integral to this topic, and that policies and legal frameworks balancing them are needed.
By Aida Mahmutović