Children’s rights and participation in data governance
11 Nov 2020 17:50h - 18:50h
Increased digitalisation of children’s everyday lives, including home, play, and school, amplifies risks and issues regarding online privacy and digital rights. However, overprotective and limiting measures mean fewer children participate in the digital world. Children’s human rights nonetheless need to be protected and integrated into the digital world.
Ms Velislava Hillman (Founder, Kratos Education) spoke about the pressing sense of urgency surrounding the involvement of digital technologies in almost everything that has to do with education. This urgency results in the collection of large amounts of unnecessary data about children. It is important to slow down and think about what educational technology brings and what purposes it serves. It is also important to take young people into account collectively, since they are, at the end of the day, the ones for whom educational technology has consequences.
Mr Nick Couldry (Professor, London School of Economics and Political Science) wondered whether young people are really given a chance to contribute to discussions about digital technologies in education and their related rights. He believes this is not the case. Children’s freedom needs to be at the core of deciding how and which technologies are used for educational purposes. Education platforms also have many implications for children’s freedom, because these platforms store data about children from a young age. What does that mean for their freedom in the future?
Recital 38 of the General Data Protection Regulation (GDPR) specifically mentions children. Children require special protection regarding their personal data, as they might not be aware of the risks, consequences, safeguards, and their rights concerning the processing of personal data. Ms Doaa Abu Elyounes (doctoral student, Harvard Law School; affiliate, Berkman Klein Center for Internet and Society, Harvard University) said that specific protection should apply, in particular, to the use of children’s personal data for marketing or for creating personality or user profiles, and to the collection of personal data when services are offered directly to them by, for example, educational institutions. Children might not understand that sharing their e-mail address or hobbies can lead to targeted marketing. The GDPR clearly states that children must be at least 16 years old (in some countries 13, but no younger) to give consent themselves; otherwise, consent is given by parents. Beyond that, the GDPR treats children’s rights as identical to those of adults, for example, the right to object, the right to receive a copy of their data on request, the right to be forgotten (data deletion), the right to data portability (having data transferred to another platform), and the right not to be subject to automated decision-making.
Ms Gretchen Greene (Founder, Greene Strategy and Analytics; AI policy advisor; affiliate researcher, MIT Media Lab and Harvard Berkman Klein Center) spoke about the opportunities and possibilities of artificial intelligence (AI) in education. AI can be used to identify whether someone has permission to be in a hall or on campus. It can determine whether someone is likely to drop out of school, which creates opportunities for timely intervention, and it is also used for adaptive and personalised learning. She said that some of her colleagues are using AI to track how early people engage with websites and in what ways. AI also helps with understanding emotions and affect, analysing children’s facial expressions to gauge their engagement in class. In addition, since many instructors do not like grading, AI offers possibilities for both automatic grading and feedback. Automatic language translation is already in wide use throughout the world. Scaling education is the biggest hope for AI in education, said Greene.
Ms Beatriz Botero Arcila (doctoral candidate, Harvard Law School; affiliate, Berkman Klein Center for Internet and Society at Harvard University) called for expanding the existing protections for children to cover companies that provide educational services or are increasingly embedded in the educational system. This should also include web service providers such as Google. Depending on the educational platform or application, several layers of users, such as children, teachers, and institutions, are always present. ‘If we are using a grading app, for example, then the direct user is a teacher, but students’ data is running through it, which is also important for an educational institution.’ Because risks exist at each of these layers, impact assessments and transparency mechanisms are important to have.
Ms Varunram Ganesh (Research Engineer and Entrepreneur, MIT Digital Currency Initiative (DCI)) said that while a general tendency to trust big companies exists, these companies are also vulnerable to hacks. It is therefore important to think about children’s privacy and protection while they are online, including for educational purposes, and to bridge the gap between awareness and action. If a data privacy breach occurs, parents do not know how to proceed or to whom to report it (the director, the principal, the police?). Many tools were created in good faith amid the COVID-19 pandemic, but most were not designed with privacy in mind. It is therefore always important to balance utility and privacy. ‘Classrooms have to be a safe space without Big Brother.’
Towards the end of the session, participants were invited to offer comments or questions.
Mr Joao Pedro Martins (Youth IGF) said that Youth IGF members have produced specific messages on issues important to youth. From a youth perspective, young people often do not have the opportunity to opt in or out of platforms used for educational purposes; national institutions decide on this, and young people are not consulted.
Mr Emmanuel Chinomso (Senior Administrative Officer, Babcock University, Nigeria) said that legitimate fears arise about how AI learns about students and their academic performance, raising questions about what is possible and what is actually occurring. Different platforms in different jurisdictions have different privacy policies. It is often unclear how platforms protect the interests of users and whether they have users’ consent.
Ms Ioanna Noula (Head of Research and Development, Internet Commission) said that alienation is the main reason for criticism of current education. In the new circumstances of the pandemic, a new form of alienation is emerging, tied to the new educational technology (edtech).
Ms Jutta Croll (Chairwoman of the Board and Project Manager, Stiftung Digitale Chancen) noted that the best protection we can provide to children is through more research. It is especially important to benefit from existing databases during the COVID-19 pandemic, when everything, and children’s education in particular, has moved to a digital format.
Internet Governance Forum (IGF) 2020
9 Nov 2020 09:00h - 17 Nov 2020 19:00h