[Read more session reports and live updates from the 12th Internet Governance Forum]
In her opening statement, the moderator, Ms Jac sm Kee, Women's Rights Programme lead at the Association for Progressive Communications, noted that 'we are living in the age of data', where our bodies are increasingly perceived and used as valuable political data. This gives rise to three main issues: invisibility, hyper-visibility, and bias. Datasets and algorithms are traditionally expected to be objective in analysis, but that is not the case. Moreover, most body data is privatised, which raises issues of opaque algorithms and trade secrets. She also noted that the discussion on the body as data should focus on the impact that the treatment of data has on human rights, particularly those of marginalised groups.
Ms Valentina Hvale Pellizzer, of the Association for Progressive Communications, introduced the EROTICS: Sex, rights and the internet study to show how the project used numbers and percentages in researching data, the body, and the Internet. The study mapped sexual rights, how activists use the Internet, types of risks, content regulation, and censorship, and produced several findings. First, the majority of respondents were from the Global South and aged between 18 and 39. A total of 81% reported suffering some kind of online discrimination based on social class, caste, or sexual orientation. Second, respondents use the Internet as a space for dialogue, advocacy, and affirmation of their rights, making this public space a private space as well. Third, 98% of respondents use social networks, which enables researchers to better understand their surroundings and peer interactions. Fourth, respondents feel that the power over their sexual expression lies in the hands of governments, Internet service providers, and their peers. Last, once threatened, respondents tend to avoid conflict and remove themselves from the online space. Pellizzer concluded that, given this information, the question now is how to move in the direction of hyper-visibility and expose biases, in order to produce spaces where rights are asserted.
Also referring to biases produced by data, Ms Bishakha Datta, Executive Director of Point of View, spoke about a project focusing on India's Information Technology Act of 2000, specifically Section 67, which deals with obscenity. She pointed out that the National Crime Records Bureau maintains the biggest dataset in India related to criminal records; its chapter on cybercrime (Chapter 18) is used by journalists as well. According to Datta, the problem arises when no one questions the validity and objectivity of the records provided. 'Data is political, no matter how it is generated', she noted. Unlike records of other crimes, cybercrime records use newly created descriptions of offenders, such as 'sexual freak', 'pervert', 'disgruntled employee', 'cracker', and 'hacker'. She stressed that these subjective assignments of motive make the datasets less credible, and that the new categories are applied only to cybercrimes and Internet-related cases. Datta invited more discussion on how data is theorised and whose point of view is represented in collecting and working with this data.
Ms Vidushi Marda, of Article 19 and the Centre for Internet and Society, reflected on three phases of surveillance in relation to gender. Surveillance begins with the architectural phase, marked by mechanisms of control, she noted. 'Being seen, and being able to see the one that sees you', Marda said, encourages the technology of discipline. Next is the infrastructure phase, when surveillance moves from distinct institutions to networks; as an example, she mentioned video cameras that track movement and can then be added to a network. The third phase is conceptualisation, in which people are invited to spy on others via social media and peer-to-peer interaction. This is where gender comes in: in India, many women lack funds and may hand over their mobile data in exchange for credit, while men are offered other ways to obtain funds. Marda also referred to algorithms that often reinforce discrimination and prejudice.
In conclusion, Marda reminded participants that the norms we desire as a society should be built into the technology we use and that we have to be aware of cultural approaches to surveillance. All speakers agreed that not all voices are represented online and that issues of bodies and data are issues of human rights as well.
By Jana Mišić