Data Protection for Next Generation: Putting Children First | IGF 2023 WS #62

11 Oct 2023 08:30h - 10:00h UTC

Event report

Speakers and Moderators

Speakers:
  • Njemile Davis-Michael, Government, Western European and Others Group (WEOG)
  • Edmon Chung, Technical Community, Asia-Pacific Group
  • Sonia Livingstone, Civil Society, Western European and Others Group (WEOG)
  • Theodora Skeadas, Civil Society, Western European and Others Group (WEOG)
  • Emma Day, Civil Society, Western European and Others Group (WEOG)
Moderators:
  • Ananya Singh, Government, Asia-Pacific Group

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Audience

The discussion focuses on the necessity of age verification and data minimization in relation to children’s rights in the digital environment. It is argued that companies should not collect additional data solely for age verification purposes, and trust in companies to delete data after verification is considered crucial to protect children’s privacy.

Another important point raised in the discussion is the need for the early incorporation of children’s rights into legislation. The inclusion of children in decision-making processes and the consideration of their rights from the beginning stages of legislation are emphasized. This is contrasted with the last-minute incorporation of children’s rights seen in the GDPR.

The discussion also advocates for the active participation of children in shaping policies that affect their digital lives. Examples of child-led initiatives, such as Project Omna, are mentioned to illustrate the importance of including children’s perspectives in data governance. The argument is made that involving children in policy-making processes allows for better addressing their unique insights and needs.

The role of tech companies is also explored, with an argument that they should take child rights into consideration during their product design process. Collaborating with tech companies to develop age verification tools is suggested as a means of ensuring the protection of children’s rights.

Additionally, it is noted that children, often referred to as “Internet natives,” may have a better understanding of privacy protection due to growing up in the digital age. This challenges the assumption that children are unaware or unconcerned about their digital privacy.

The discussion concludes by highlighting the advocacy for education and the inclusion of children in legislative processes. Theodora Skeadas’s experience in advocacy is mentioned as an example. The aim is to educate lawmakers and involve children in decision-making processes to create legislation that better safeguards children’s rights in the digital environment.

Overall, this discussion underscores the importance of age verification, data minimization, the incorporation of children’s rights in legislation, the active participation of children in policy-making processes, and the consideration of child rights in tech product design. These measures are seen as vital for protecting and promoting children’s rights in the digital age.

Edmon Chung

The discussion revolves around various important topics related to internet development, youth engagement, and online safety. DotAsia, which operates the .Asia top-level domain, plays a crucial role in these areas. In addition to managing this domain, DotAsia uses the earnings generated from it to support internet development in Asia. Moreover, DotAsia runs the NetMission program, which aims to engage young people in internet governance. These initiatives are viewed positively as they promote internet development and youth engagement in Asia.

Another significant development is the launch of the .Kids top-level domain in 2022. This domain is specifically designed to involve and protect children, based on the principles outlined in the Convention on the Rights of the Child. By prioritizing children’s rights and safety, the .Kids initiative aligns with the principles of the convention. This positive step highlights the importance of involving children in policy-making processes that affect them.

Cooperation among stakeholders is emphasized for ensuring online safety. Various forms of online abuses and domain name system (DNS) abuses exist, requiring collaborative measures to create a safer online environment. The .Kids top-level domain is seen as a valuable platform to support online safety initiatives. By creating a dedicated space for children, it can contribute to the development and implementation of effective online safety measures.

The discussion also focuses on privacy, particularly in relation to data collection and age verification. Privacy is not just about keeping data secure and confidential but also about questioning the need for collecting and storing data in the first place. The argument is made that data should be discarded after the age verification process to strike a balance between protecting children and safeguarding their privacy.

The use of pseudonymous credentials and pseudonymized data is suggested as an appropriate approach for age verification. These methods allow platforms to verify age without accessing or storing specific personal information, addressing privacy concerns while still ensuring compliance with age restrictions.

Additionally, it is highlighted that trusted anchors should delete raw data after verification, and regulation and audits are necessary for companies that hold data. The importance of building the capacity for child participation in internet governance is also emphasized. These factors contribute to creating a safer, more inclusive, and child-centric online environment.
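
Although the panel did not reference any specific implementation, the pattern described above can be illustrated in code. The sketch below (Python, standard library only) is a hypothetical minimal version: a trusted verifier checks a birthdate once, issues a credential carrying only a random pseudonym and an over-the-threshold claim, and retains nothing, while the platform validates the credential without ever seeing the birthdate. All function names are illustrative, and the shared-key HMAC stands in for the asymmetric signatures a real deployment would use.

```python
import hmac
import hashlib
import json
import secrets
from datetime import date

# Signing key held only by the trusted verifier (the "trusted anchor").
VERIFIER_KEY = secrets.token_bytes(32)

def issue_age_credential(birthdate: date, min_age: int) -> dict:
    """Verify age once, then discard the raw birthdate.

    The credential carries only a random pseudonym and a boolean claim
    ("meets the age threshold"), never the birthdate itself.
    """
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    claim = {
        "pseudonym": secrets.token_hex(16),  # not linkable to real identity
        "meets_min_age": age >= min_age,
        "min_age": min_age,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    # The raw birthdate goes out of scope here; nothing is stored.
    return {"claim": claim, "signature": signature}

def platform_accepts(credential: dict, min_age: int) -> bool:
    """Platform-side check: validate the signature and read the claim only."""
    payload = json.dumps(credential["claim"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["signature"]):
        return False  # forged or tampered credential
    claim = credential["claim"]
    return claim["min_age"] == min_age and claim["meets_min_age"]

cred = issue_age_credential(date(2012, 5, 1), min_age=13)
print(platform_accepts(cred, min_age=13))  # False: under the threshold
```

The design point the speakers make survives even in this toy version: the platform can enforce an age restriction while the only personal data in circulation is a short-lived, pseudonymous claim.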

In summary, the discussion focuses on various important aspects of internet development, youth engagement, and online safety. Dot Asia’s initiatives and the introduction of the .Kids top-level domain reflect positive steps toward promoting internet development and protecting children’s rights. The importance of stakeholder cooperation, privacy considerations, and child involvement in policy-making processes are also highlighted. By addressing these aspects, stakeholders can work together to create a safer and more inclusive online space for all.

Sonia Livingstone

The discussions revolved around the significance of safeguarding children’s right to privacy in the digital realm and its interlinkage with other child rights. It was emphasised that children’s privacy is essential as it directly influences their safety, dignity, and access to information. Sonia Livingstone, an expert in the field, played an instrumental role in the drafting group for general comment number 25, which specifies how the Convention on the Rights of the Child applies to digital matters.

Furthermore, it was noted that children themselves possess an understanding of and are actively involved in negotiating their digital identity and privacy. To understand their perspective, Livingstone conducted a workshop to gauge how children perceive their privacy and the conditions under which they would be willing to share information. It was found that, globally, children recognise the importance of privacy and view it as a matter that directly affects them.

The introduction of age-appropriate design codes, tailored to cater to a child’s age, was highlighted as an effective regulatory strategy to protect children’s privacy. These codes have been implemented in various international and national settings, ensuring privacy in accordance with the child’s developmental stage. Livingstone, alongside the Five Rights Foundation, spearheaded the Digital Futures Commission, which sought children’s views to propose a Child Rights by Design approach.

Addressing the identification of internet users who are children for the purpose of upholding their rights online was identified as another crucial aspect. Historically, attempts to respect children’s rights on the internet have failed because the age of the user was unknown. It was emphasised that a mechanism is needed to determine the age of each user in order to effectively establish who is a child.

Regarding the implementation of age verification, it was suggested that a new approach is needed, involving third-party intermediaries for age checks. These intermediaries should operate with transparency and accountability, ensuring accuracy and privacy. However, it was acknowledged that not all sites and content necessitate age checks, and a risk assessment should be conducted to determine the appropriateness of such checks. Only sites with age-inappropriate content for children should require age verification.

The role of big tech companies in relation to age assessment was also discussed. It was posited that these companies likely already possess the capability to accurately determine the age of their users, highlighting the potential for collaboration in ensuring child rights protection online.

Furthermore, the importance of companies adopting child rights impact assessments was stressed. Many companies already understand the importance of impact assessments in various contexts, and embedding youth participation in the assessment process is seen as crucial. Consideration should be given to the full range of children’s rights.

There were differing perspectives on child rights impact assessments, with some suggesting that they should be made mandatory for companies. It was argued that such assessments can bring about significant improvements in child rights protection when integrated into company processes.

The active involvement of children and young people in the development of data protection policies was also highlighted as a key recommendation. Their articulate and valid perspectives should be taken into account to ensure effective policy formulation.

Finally, the importance of adults advocating for the active participation of young people in meetings, events, and decision-making processes was emphasised. Adults should actively address the lack of youth representation and ensure that young people have a voice and influence in relevant discussions.

In conclusion, the discussions centred on the necessity of protecting children’s privacy in the digital environment and its alignment with other child rights. Various strategies, including age-appropriate design codes and third-party intermediaries for age verification, were proposed. The involvement of children, youth, and adults in policy development and decision-making processes was considered pivotal for effective protection of children’s rights online.

Emma Day

Civil society organizations play a crucial role in advocating for child-centred data protection. They can engage in advocacy related to law and policy, as well as litigation and regulatory requests. For example, Professor Sonia Livingstone’s work on the use of educational technology in schools and the launch of the UK’s Digital Futures Commission highlight the importance of civil society organizations advocating for proper governance of educational technology in relation to children’s data protection.

Litigation and requests to regulators offer another important avenue for civil society organizations to advance child-centred data protection. This is evident in cases such as Fairplay's complaint about YouTube's violation of the Children's Online Privacy Protection Act, which resulted in Google and YouTube paying a significant fine. These actions demonstrate the impact civil society organizations can have in holding tech companies accountable for their data protection practices.

Community-based human rights impact assessments are crucial for ensuring child-centred data protection. This involves consulting with companies, working with technical and legal experts, and including meaningful consultation with children. By involving children in the process, civil society organizations can better understand the implications of data processing and ensure that their rights and interests are taken into account.

Civil society organizations should also involve children in data governance. Involving children in activities such as data subject access requests can help them understand the implications of data processing and empower them to participate in decision-making processes. Additionally, auditing community-based processes involving artificial intelligence could involve older children, allowing them to contribute to ensuring ethical and responsible data practices.

Education about data processing and its impacts is crucial for meaningful child involvement. It is important for people, including children, to understand the implications of data governance for their rights. Practical activities, like writing to a company to request their data, can be incorporated into education to provide a hands-on understanding of the subject.
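
One hypothetical way to make that exercise concrete is sketched below: a short Python helper that drafts a subject access request citing GDPR Article 15 and the one-month response deadline in Article 12(3). The template wording and names are illustrative only, and the legal basis would need adapting to the local law; COPPA in the US, for example, gives the relevant rights to parents rather than to children.

```python
from datetime import date

def draft_access_request(company: str, full_name: str, account_id: str) -> str:
    """Draft an illustrative data subject access request (GDPR Article 15)."""
    return f"""\
To the Data Protection Officer, {company}
Date: {date.today().isoformat()}

Dear Sir or Madam,

Under Article 15 of the General Data Protection Regulation, I request
access to all personal data you hold about me, together with the purposes
of processing, the recipients of the data, and the envisaged retention
period.

Name: {full_name}
Account identifier: {account_id}

Please respond within one month, as required by Article 12(3) GDPR.

Yours faithfully,
{full_name}
"""

# Example classroom use with made-up details:
print(draft_access_request("ExampleVideoApp Ltd", "A. Student", "user-12345"))
```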

Civil society organizations need to collaborate with experts for effective child involvement. In complex assessments, a wide range of expertise is required, including academics, technologists, and legal experts. By collaborating with experts, civil society organizations can ensure that their efforts are based on sound knowledge and expertise.

Age verification should not be the default solution for protecting minors’ data. Other non-technical alternatives should be investigated and considered. Different jurisdictions have differing views on the compliance of age verification products with privacy laws, highlighting the need for careful consideration and evaluation of such solutions.

In efforts to protect children’s data, it is essential to centre the most vulnerable and marginalised children. Children are not a homogeneous group, and it is important to address the varying levels of vulnerability and inclusion across different geographies and demographics.

Designing products for the edge cases and risky scenarios is crucial for digital safety. Afsaneh Rigot's work on inclusive design advocates for designing from the margins, as this benefits everyone. By considering the most difficult and risky scenarios, civil society organizations can ensure that digital products and platforms are safe and accessible for all.

In conclusion, civil society organizations have a vital role to play in championing child-centred data protection. Through advocacy, litigation, regulatory requests, human rights impact assessments, involvement in data governance, education, collaboration with experts, exploring non-technical alternatives to age verification, considering the needs of the most vulnerable children, and designing for edge cases, these organizations can contribute to a safer and more inclusive digital landscape for children.

Theodora Skeadas

The discussion revolves around several key issues related to children’s data protection and legislation. One focal point is the importance of understanding international children’s rights principles, standards, and conventions. The UN Convention on the Rights of the Child features prominently as a widely ratified international human rights treaty that enshrines the fundamental rights of all children under the age of 18, serving as a foundational document in safeguarding children’s rights.

Another significant aspect highlighted is the need for appropriate data collection, processing, storage, security, access, and erasure. It is emphasized that organizations should only collect data for legitimate purposes and with the consent of parents and guardians. Moreover, these organizations should use children’s data in a way that is consistent with their best interest. Implementing adequate security measures to protect children’s data is also underscored as crucial.

Consent, transparency, data minimization, data security, and profiling are identified as major issues surrounding personal data collection, processing, and profiling. It is mentioned that children may not fully understand what it means to consent to the collection and use of their personal data. Additionally, organizations may not be transparent about how they collect, use, and share children’s personal data, making it difficult for parents to make informed decisions. The over-collection of personal data by organizations is also highlighted as a concern.

The need for strengthening legal protection, improving transparency and accountability, as well as designing privacy-enhancing technologies, is emphasized as ways to address the issues related to children’s data. Governments can play a role in strengthening legal protections for children, such as requiring parental consent and prohibiting organizations from profiling children through targeted advertising. It is also mentioned that educating parents and children about the risks and benefits of sharing personal data online is crucial. Technologists are encouraged to design products and services that collect and use less personal data from children.

There is a global focus on legislative discussions that will impact child safety. Measures such as the Digital Services Act and Digital Markets Act in the European Union, as well as the UK Online Safety Bill, are mentioned as examples of legislation that will have an impact on child safety.

In the context of the United States, there is a gap in legislation related to assistive education technology (ed tech) in schools. Existing bills mostly focus on access, age verification, policies, and education, rather than addressing the usage of assistive technology.

There is also concern about the challenges faced in passing comprehensive legislation related to children’s data, particularly due to competing interests and a divided political landscape. It is acknowledged that despite the proliferation of data and data-related issues concerning children, passing effective legislation proves difficult.

The discussion also reveals the need to educate legislators about the rights and principles of children. Often, legislators may not be adequately informed about the rights of children and the specific meaning of rights like privacy and freedom of expression in the context of children.

The importance of including children in decision-making processes is emphasized as it makes legislation child-centric and serves the intended purpose well. Inclusion of children in the legislative process ensures that their voices and perspectives are heard and considered.

The analysis also highlights the necessity of considering the needs of children from diverse backgrounds. It is crucial to acknowledge and address the unique challenges and requirements of children from different social, cultural, and economic circumstances.

Furthermore, the inclusion of children as active participants in conversations about their well-being is stressed. This can be done through their participation in surveys, focus groups, workshops, and empowering them to advocate for themselves in the legislative process.

There is a suggestion for children to be represented on company advisory boards, emphasizing the importance of their inclusion and representation in corporate governance.

In conclusion, the discussion delves into various aspects of children’s data protection and legislation, shedding light on key issues and suggestions for addressing them. It emphasizes the significance of understanding international children’s rights principles, implementing appropriate data collection and processing practices, ensuring transparency, accountability, and consent, and designing privacy-enhancing technologies. Additionally, it highlights the importance of including children in decision-making processes, considering their diverse needs, and strengthening legal protection. However, there is recognition of the challenges posed by political division and the difficulties in passing comprehensive legislation.

Njemile Davis-Michael

During the discussion, various topics relating to data governance and the impact of digital technology on protecting children’s rights and promoting their well-being were covered. One significant highlight was the influence of the United States Agency for International Development (USAID) in technological innovation, as well as its efforts in humanitarian relief and international development. With 9,000 colleagues spanning 100 countries, USAID plays a significant role in funding initiatives to improve digital literacy, promote data literacy, enhance cybersecurity, bridge the gender digital divide, and protect children from digital harm.

Digital tools were identified as increasingly important for adults working to protect children. These tools, such as birth registration systems and case management support, help facilitate the protection and integration of children into broader social and cultural norms. However, it was acknowledged that increased digital access can also lead to increased risks, including cyberbullying, harassment, gender-based violence, hate speech, sexual abuse and exploitation, recruitment into trafficking, and radicalization to violence. The negative consequences of these risks were highlighted, such as limited exposure to new ideas, restricted perspectives, and impaired critical thinking skills due to data algorithms.

To address these risks, it was argued that better awareness, advocacy, and training for data privacy protection are crucial. The lack of informed decision-making about data privacy was identified as an issue that transfers power from the data subject to the data collector, with potentially long-lasting and harmful consequences. Recognizing the need for safer digital environments, data governance frameworks were presented as a solution to mitigate the risks of the digital world. These frameworks can create a safer, more inclusive, and more exciting future.

The importance of responsible and ethical computer science education for university students was emphasized. Collaboration between USAID and the Mozilla Foundation aims to provide such education in India and Kenya, with the goal of creating technology with more ethical social impacts. The integration of children’s rights in national data privacy laws was also advocated, highlighting the need for a legal framework that safeguards their privacy and well-being.

Empowering youth advocates for data governance and digital rights was seen as a positive step forward, with projects like Project Omna, founded by Omar, a youth advocate for children’s digital rights, gaining support and recognition. The suggestion to utilize youth networks and platforms to inspire solutions further highlighted the importance of involving young voices in shaping data governance and digital rights agendas.

The tension between the decision-making authority of adults and the understanding of children’s best interests was acknowledged. It was argued that amplifying children’s voices in the digital society and discussing digital and data rights in formal education institutions is necessary to bridge this gap and ensure the protection of children’s rights.

Notably, the need for a children’s Internet Governance Forum (IGF) was highlighted, recognizing children as stakeholders in internet governance. It was agreed that raising awareness and capacity building are essential in bringing about positive changes for children within this sphere.

In conclusion, the discussion shed light on the crucial role of data governance and digital technology in safeguarding children’s rights. It emphasized the importance of responsible technological innovation, data privacy protection, and the inclusion of children’s voices in decision-making processes. By addressing these issues, society can create a safer and more inclusive digital world for children, where their rights are protected, and their well-being is prioritized.

Moderator

The discussion on children’s privacy rights in the digital environment emphasised the importance of protecting children from data exploitation by companies. One argument raised was the need for regulatory and educational strategies to safeguard children’s privacy. Age-appropriate design codes were highlighted as a valuable mechanism for respecting and protecting children’s privacy, considering their age and understanding the link between privacy and other rights. Professor Sonia Livingstone, who was part of the drafting group for general comment number 25, stressed the need for a comprehensive approach that ensures children’s privacy rights are incorporated into the design of digital products and services.

The .Kids initiative was discussed as an example of efforts to promote child safety online. This initiative, which focuses on children’s rights and welfare, enforces specific guidelines based on the Convention on the Rights of the Child. It also provides a platform for reporting abuse and restricted content. Edmon Chung, in his presentation on the .Kids initiative, highlighted the importance of protecting children’s safety online and addressed the issue of companies exploiting children’s data.

USAID’s involvement in digital innovation and international development was also mentioned. The organisation works with colleagues in various countries and supports initiatives related to digital innovation. Their first digital strategy, launched in 2020, aims to promote technological innovation and the development of inclusive and secure digital ecosystems. USAID is committed to protecting children’s data through initiatives such as promoting awareness, aligning teaching methods with EdTech tools, and working on data governance interventions in the public education sector.

The discussion also brought attention to the risks children face in the digital environment, including online violence, exploitation, and lack of informed decision-making regarding data privacy. It was emphasised that digital tools play a significant role in protecting children and aiding in areas such as birth registration, family tracing, case management, and data analysis. However, the risks associated with digital tools must also be addressed.

Civil society organisations were recognised for their crucial role in advocating for child-centered data protection. They engage in advocacy related to law and policy, and their efforts have resulted in updated guidance on children’s privacy in educational settings and the investigation of violations of children’s privacy laws. The importance of involving children in data governance and policy development was highlighted, along with the need for meaningful consultation and education.

The discussion underscored the need for age verification mechanisms and risk assessments to ensure the protection of children online. The development of age verification products that comply with privacy laws was seen as a vital step. Concerns were raised regarding the lack of transparency and oversight in current age assessment methods. It was suggested that products should be designed for difficult and risky scenarios to benefit all users.

Overall, the insights from the discussion highlighted the importance of protecting children’s privacy in the digital environment and called for action to create a safer and more inclusive online space for children.

Session transcript

Moderator:
Finally, scan the Mentimeter QR code, which will be available on the screen shortly, or use the link in the chat box to express your expectations from the session. As a reminder, I would like to request all the speakers and the audience who may ask questions during the Q&A round to please speak clearly and at a reasonable pace. I would also like to request everyone participating to maintain a respectful and inclusive environment in the room and in the chat. For those who wish to ask questions during the Q&A round, please raise your hand. Once I call upon you, you may use the standing microphones available in the room. And while you do that, please state your name and the country you are from before asking the question. Additionally, please make sure that you mute all other devices when you are speaking so as to avoid any audio disruptions. If you are participating online and have any questions or comments and would like the moderator to read out your question or comment, please type it in the Zoom chat box. When posting, please start and end your sentence with a question mark to indicate that it is a question, or use a full stop to clearly indicate that it is a comment. Thank you. Let us now begin the session. Ladies and gentlemen, thank you very much again for joining today's session. I am Ananya. I am the youth advisor to the USAID Digital Youth Council, and I will be the on-site moderator for today's session. Mariam from Gambia will be the online moderator, and Nelly from Georgia will be the rapporteur for this session. Today, we embark on a journey that transcends the boundaries of traditional discourse and delves into the intricate realm of safeguarding children's digital lives. In this age of boundless technological advancements, we find ourselves standing at a pivotal juncture where the collection and utilization of children's data have reached unprecedented heights. From the moment their existence becomes evident, their digital footprints begin to form, shaping their online identities even before they can comprehend the implications. Ultrasound images, baby cameras, social media accounts, search engine inquiries: the vast web of interconnected platforms weaves a tapestry of data, silently capturing every heartbeat, every interaction. But amidst this digital tapestry lies a profound challenge: the protection of children's data and their right to privacy. Children, due to their tender age and limited understanding, may not fully grasp the potential risks, consequences, and safeguards associated with the processing of their personal information. They are often left vulnerable, caught in the crossfire between their innocent exploration of the online world and the complex web of data-collecting institutions. Hence today, we are gathered here to delve deeper into the discourse on children's online safety, moving beyond the usual topics of cyberbullying and internet addiction. Our focus will be on answering the following questions. How do we ensure that children in different age groups understand, value, and negotiate their digital self and privacy online? What capabilities or vulnerabilities affect their understanding of their digital data and digital rights? What is a good age verification mechanism, such that the mechanism does not in itself end up collecting even more personal data?
And finally, how can we involve children as active partners in the development of data governance policies and integrate their evolving capabilities, real-life experiences, and perceptions of the digital world to ensure greater intergenerational justice in laws, policies, strategies, and programs? We hope that this workshop will help the attendees unlearn the current trend of universal and often adult treatment of all users, which fails to respect children's evolving capacity, often lumping them into overly broad categories. Attendees will be introduced to the ongoing debates on the digital age of consent. Panelists will also elaborate on children's perception of their data self and the many types of children's privacy online. Participants will also be given a flavor of the varying national and international conventions concerning the rights of children regarding their data. As our speakers come from a range of stakeholder groups, they will provide the attendees with a detailed idea of how a multi-stakeholder, intergenerational, child-centered, child-rights-based approach to data governance-related policies and regulations can be created. I invite you all to actively engage in the session, to listen to our esteemed panelists, to ask questions, to contribute your insights, and to share perspectives. I would now like to introduce our speakers for today. To begin with, we have Professor Sonia Livingstone, who is a professor at the Department of Media and Communications at the London School of Economics. She has published about 20 books and advised the UN Committee on the Rights of the Child, OECD, ITU, and UNICEF on children's safety, privacy, and rights in the digital environment. Next, we have Edmon Chung, who serves as the CEO of DotAsia, on the boards of ICANN, Make a Difference, and Engage Media, the Exco of ISOC Hong Kong, and the Secretariat of APrIGF. He has co-founded the Hong Kong Kids International Film Festival and participates extensively in internet governance matters. Next, we have Njemile Davis-Michael, who is a Senior Program Analyst in the Technology Division of USAID, where she helps to drive the agency's development efforts related to internet affordability, data governance, and protecting children and youth from digital harms. Next, we have Emma Day, who is a human rights lawyer specializing in human rights and technology, and the co-founder of TechLegality. She has been working on human rights issues for more than 20 years, and has lived for five years in Africa and six years in Asia. And last but not least, we have Theodora Skeadas, who is a technology policy expert. She consults with civil society organizations, including but not limited to the Carnegie Endowment for International Peace, the National Democratic Institute, the Committee to Protect Journalists, and the Partnership on AI. I would now like to move to the next segment. I now invite our speakers to take the floor and convey their opening remarks to our audience. I now invite Professor Sonia Livingstone to please take the floor.

Sonia Livingstone:
Thank you very much for that introduction, and it's wonderful to be part of this panel. So I want to talk about children's right to privacy in the digital environment. And as with other colleagues here, I'll take a child rights focus, recognizing holistically the full range of children's rights in the Convention on the Rights of the Child, and then homing in on Article 16 on the importance of the protection of privacy. So I was privileged to be part of the drafting group for general comment number 25, which is how the Committee on the Rights of the Child specifies how the convention applies in relation to all things digital. And I do urge people to read the whole document. I've just here highlighted a few paragraphs about the importance of privacy and the importance of understanding and implementing children's privacy, often through data protection and through privacy by design, as part of a recognition of the wider array of children's rights. So respect for privacy must be proportionate, part of the best interests of the child, not undermine children's other rights, but ensure their protection. And I really put these paragraphs up to show that we are addressing something complex in the offline world, and even more complex, I fear, in the digital world, where data protection mechanisms are often our main, but not only, tool to protect children's privacy in digital contexts. I'm an academic researcher, a social psychologist, and in my own work I spend a lot of time with children seeking to understand exactly how they understand their rights and their privacy. We did an exercise as part of some research a couple of years ago that I wanted to introduce here: the types of privacy and the ways in which children, as well as we, can think about privacy. So, as you can see on the screen, we did a workshop where we asked children their thoughts on sharing different kinds of information with different kinds of sources, with different organisations. What would they share, and under what conditions, with their school, with trusted institutions like the doctor or a future employer? What would they share with their online peers and contacts? What would they share with companies, and what do they want to keep to themselves? And we used this as an exercise to show that children know quite a lot, they want to know even more, and they don't think of their privacy only as a matter of their personal, their interpersonal, privacy; it is very important to them that the institutions and the companies also respect their privacy. And if I can summarise what they said in one sentence, about the idea that companies would take their data and exploit their privacy, the children's cry was: it's none of their business. And the irony that we are dealing with here today is that it is precisely those companies' business. We can see some similar kinds of statements from children now around the world in the consultation that was conducted to inform the UN Committee on the Rights of the Child's general comment 25. And as you can see here, children around the world have plenty to say about their privacy and understand it exactly, both as a fundamental right in itself and also as important for all their other rights. Privacy mediates safety, privacy mediates dignity, privacy mediates the right to information, and so forth, many more. I think we're now in the terrain of looking for regulatory strategies as well as educational ones.
And I was asked to mention, and I think this panel will discuss, the idea of age-appropriate design codes, particularly as one really proven, invaluable mechanism, and we will talk further about this, I know. But the idea that children's privacy should be respected and protected in a way that is appropriate to their age and that understands the link between privacy and children's other rights, I think this is really important. And we see this regulatory move now happening in a number of different international and national contexts. I've spent the last few years working with the Five Rights Foundation as part of running the Digital Futures Commission. And I just wanted to come back to that holistic point here. In the Digital Futures Commission, we asked children to comment on and discuss all of their rights in digital contexts, not just as a research project, but as a consultation activity to really understand what children think and want to happen, and to be heard on a matter that affects them; and privacy online is absolutely a matter that affects them. And we used this to come up with a proposal for Child Rights by Design, which builds on initiatives for privacy by design, safety by design, and security by design, but goes beyond them to recognize the holistic nature of children's rights. And so here we really pulled out 11 principles based on all the articles of the UN Convention on the Rights of the Child. And so you can see that privacy is a right to be protected in the design of digital products and services, as part of attention to children's rights and an age-appropriate service, building on consultation, supporting children's best interests, promoting their safety, well-being, development and agency. And I will stop there, and I look forward to the discussion. Thank you.

Moderator:
Thank you very much, Professor Livingstone, that was very, very insightful. We will now move to Edmon. Would you like to take the floor?

Edmon Chung:
Hello. Thank you, thank you for having me. Edmon from DotAsia here. Building on what Sonia just mentioned, I'll be sharing a little bit about our work at .Kids, which is actually also trying to operationalize the Convention on the Rights of the Child. But first of all, I just want to give a quick background on why DotAsia is involved in this. DotAsia operates the .Asia top-level domain, so you can have domains such as whatever.asia; that provides the income source for us, and so every .Asia domain actually contributes to internet development in Asia. Some of the things that we do include youth engagement, and we are actually very proud that the NetMission program is the longest-standing youth internet governance engagement program. And that sort of built our interest, our awareness, in supporting children and children's rights online. Back in 2016, we actually launched a little program that looked at the impact of the Sustainable Development Goals and the internet. And we recently launched an eco-internet initiative, but I'm not going to talk about that. What I want to highlight is that engaging children on platforms, including top-level domains, is something that I think is important, and one of the things that I would like to share. So on the specific topic of .Kids: the .Kids initiative actually started more than a decade ago, in 2012, when the application for .Kids was put in through ICANN for the .Kids top-level domain. Right at that point, there was engagement with the children's rights and children's welfare community about the process itself, but I won't go into details. What I did want to highlight is that part of the vision of .Kids is actually to engage children to be part of the process in developing policies that affect them, and to involve children's participation and so on. And in fact, in 2013, during the ICANN process, we actually helped support the first children's forum focused on the ICANN process itself, and that was held in April of 2013. Fast forward 10 years: we were finally able to put .Kids into place last year; the .Kids top-level domain actually entered the internet on April 4th of 2022 and was launched on November 29th of 2022. So it is less than a year old, really not even a toddler, for .Kids. But let's focus on the difference between .Kids and, for example, .Asia or .com. One of the interesting things is that at the ICANN level, there is no difference. For ICANN, operating .Kids would be exactly the same as operating .com. We disagreed, and that's why we engaged in the decade-long campaign to operate .Kids, believing that there are policies that are required above and beyond just a regular registry, just a regular .com or wherever, because there is not only a set of expectations; it is important, and here is why we say it's the kids' best interest domain. That is the idea behind .Kids, so let's look at part of the registry policies. For .Kids ourselves, if you think about it, of course we don't keep children's data or data about kids, but does that mean we don't have to have policies around the registry or for .Kids domains themselves? Well, we think no. And building off what Professor Livingstone was saying:
In fact, we have a set of guiding principles that was developed with support from the children's rights and welfare community and based on the Convention on the Rights of the Child. And of course, there are additional kids-friendly guidelines, there's a kids' anti-abuse policy, and also kids' personal data protection policies. And I want to highlight that the entire set of guiding principles is actually based on the Convention on the Rights of the Child: probably not all the articles, but certainly the articles that outline protection and prohibited materials. One way to think about it is that for the .Kids domain, we do enforce restrictions to ensure that restricted content is kept out; the best way to think about it is really that, if you think of a movie, restricted content or rated-R movies would obviously not be acceptable on .Kids domains. But on top of that, we also have specific privacy provisions built on Article 16, as Sonia mentioned earlier, and some other aspects of the Convention on the Rights of the Child. So we think there is something important being built into it, and we're definitely the first registry that builds policies around the Convention on the Rights of the Child, but we are also one of the very few domain registries that would actually actively engage in suspension of domains, or processes to deal with restricted content. Beyond that, there's a portal and a platform to report abuse and to alert us to issues. And in fact, I can report that we have already taken action on abusive content, restricted content, and so on. But I would like to end with a few items. There are certainly a lot of abuses on the internet, but the abuses that are appropriate for top-level domain registries to act on are a subset of that. There are many other abuses that happen on the internet, and there are different types of DNS abuses and different types of cyber abuses that may or may not be effective for the registry to take care of. And that's, I guess, part of what we discuss; that's why we bring it to the IGF and these types of forums, because there are other stakeholders that need to help support a safer environment online for children. So with that: there are a number of acts that have been put in place in recent years, and I think .Kids is a good platform to support the kids online safety bill in the US and the Online Safety Bill in the UK. We do believe that collaboration is required in terms of security and privacy. And one of the visions, as I mentioned, for .Kids is to engage children in the process, and we hope that we will get there soon. But it's still in its toddler phase, so it doesn't generate enough income for us to bring everyone here. But the vision itself is to put the policies and protections in place and also, in the future, to be able to support children's participation in this internet governance discussion that we have.

Moderator:
Okay. Thank you so much, Edmon. That was very, very inspiring. Let's now go to Njemile.

Njemile Davis-Michael:
Thank you, Ananya. Wonderful to be here. Thank you so much for joining the session and giving me the opportunity to speak about USAID's work in this area. So USAID is an independent agency of the United States government, where I work with 9,000 colleagues in a hundred countries around the world to provide humanitarian relief and to fund international development. In the technology division where I sit, there are a number of initiatives that we support related to digital innovation, from securing last-mile internet connectivity to catalyzing national models of citizen-facing digital government. And we work in close collaboration with our U.S. government colleagues in Washington to inform and provide technical assistance, to support locally-led partnerships, and to create the project ideas and infrastructure needed to sustain the responsible use of digital tools. Although we rely consistently on researching, developing, and sharing best practices, our activity design can be as varied as the specific country and community contexts in which we are called to action. Indeed, the many interconnected challenges that come with supporting the development of digital societies have challenged our own evolution as an agency. So in early 2020, we launched USAID's first digital strategy to articulate our internal commitment to technological innovation, as well as to the support of open, inclusive, and secure digital ecosystems in the countries we serve through the responsible use of digital technology. The strategy is a five-year plan that is implemented through a number of initiatives, and there are some that are particularly relevant to our work with young people. Specifically, we have made commitments to improve digital literacy; to promote data literacy through better awareness, advocacy, and training for data privacy protection and national strategies for data governance; to improve cybersecurity; to close the gender digital divide and address the disproportionate harm women and girls face online; and to protect children and youth from digital harm. Each of these initiatives is supported by a team of dedicated professionals that allow us to think about how we work at the intersection of children and technology. Digital tools play an increasingly important role for adults working to protect children, for example, by facilitating birth registration, providing rapid family tracing, supporting case management, and by using better, faster analysis of the data collected to inform the effectiveness of these services. And they can also play a role in the development and integration of children themselves into larger social and cultural norms by providing a place to learn, play, share, explore, and test new ideas. Indeed, many children are learning how to use a digital device before they even learn how to walk. However, we also know that increased digital access also means increased risk. And so in the context of protecting children and youth from digital harm, USAID defines digital harm as any activity or behavior that takes place in the digital ecosystem and causes pain, trauma, damage, exploitation, or abuse, directly or indirectly, in either the digital or physical world, whether financial, physical, emotional, psychological, or sexual.
For the estimated one in three Internet users who are children, these include risks that have migrated onto or off of digital platforms that enable bullying, harassment, technology-facilitated gender-based violence, hate speech, sexual abuse and exploitation, recruitment into trafficking, and radicalization to violence. Because digital platforms also generate and share copious amounts of data, our colleagues who've done an incredible amount of highly commendable work at UNICEF, for example, around children's data, as well as my colleagues on today's panel, will likely agree that there are other, perhaps less obvious, risks. For example, we've observed in recent years that children seem to have given in, I should say, to uniform consent to their data collection, probably due to their naivete and trust of the platforms on which they're engaging. But a lack of informed decision-making about data privacy and protection effectively transfers power from the data subject to the data collector, and the consequences of this can be long-lasting. The numbers of social media likes, views, and shares are based on highly interactive levels of data sharing, affecting children's emotional and mental health. Data algorithms can be leveraged to profile and manipulate children's behavior, narrowing exposure to new ideas, limiting perspective, and even stunting critical thinking skills. Data leaks and privacy breaches, which are not just harmful on their own but can be orchestrated to cause intentional damage, are another risk. And we can counteract these and other challenges by helping practitioners understand the risks to children's data and by ensuring accountability for bad actors. The theoretical physicist Albert Einstein is famously quoted as saying that if he had one hour to solve a problem, he would spend 55 minutes thinking about the problem and only five minutes on the solution. And the sheer amount of data that we generate and have access to means that our vision of solving the global challenges we face with data is still very much possible, especially as we are realizing unprecedented speeds of data processing that are fueling innovations in generative AI, that will enable the use of 5G, and that we will see in quantum computing. So as we celebrate the 50th birthday of the Internet at this year's IGF, it's amazing to think about how much all of us here have been changed by the technological innovations paved by the Internet. And in that same spirit of innovation, we're optimistic at USAID that data governance frameworks can help mitigate the risks we see today and be leveraged to create a safer, more inclusive, and even more exciting world of tomorrow, which is the Internet our children want.

Moderator:
Thank you very much, Njemile. Emma, would you like to take the floor next? Emma, are you here with us?

Emma Day:
Thank you. Yes. Can you see my screen?

Moderator:
Yes. Please go ahead.

Emma Day:
Great. Thank you. Okay, so I’ve been asked to answer how civil society organizations can tackle the topic of child-centred data protection. I think this is a multi-stakeholder issue, and there are many things civil society organizations can do. As a lawyer, I’m going to focus on the more law- and policy-focused ideas. So there are three main approaches that I have identified. The first is that civil society organizations can engage in advocacy related to law and policy. Second, they can engage themselves in litigation and requests to regulators, I should say. And third, they can carry out community-based human rights impact assessments themselves. So the first example is advocacy related to law and policy; here the target is policymakers and regulators. As an example of this, I was involved in a project that was led by Professor Sonia Livingstone, who’s also on this panel. And this was part of the UK Digital Futures Commission. And it was a project which involved a team of social scientists and lawyers. And we looked in detail at how the use of ed tech in schools is governed in the UK. And we found it’s not very clear whether the use of ed tech in schools was covered by the UK age appropriate design code, or children’s code. So the situation of data protection for children in the education context was very uncertain. We had a couple of meetings with the ICO, and the Digital Futures Commission also had a group of high-level commissioners it had brought together from government, civil society, the education sector and the private sector. And they held two public meetings about the use of ed tech in UK schools. Subsequently, in May 2023, the ICO published updated guidance on how the children’s code applies to the use of ed tech in UK schools. I won’t go into the details of that guidance now, but suffice to say this was much-needed clarification. And it seemed to be a result of our advocacy, although this was not specifically stated. The second example is civil society organisations engaging themselves in litigation and requests to regulators. Some civil society organisations have lawyers as part of their staff, or they can work with lawyers and other experts. An example of this is an organisation in the US called Fairplay. In 2018, they led a
We don’t know the outcome of this yet, that complaint was only put in in August this year. And then the third solution, which I think is a really good one for civil society organizations, which I haven’t really seen done completely in practice yet, is to carry out community-based human rights impact assessments. So often companies themselves carry out human rights impact assessments, but it’s also absolutely something that can be done at a community level. And this involves considering not just data protection, but also children’s broader human rights as well. It’s a multidisciplinary effort, so it involves consulting with the company about the impact of their of their products and services on children’s rights, perhaps working with technical experts to test what’s actually happening with children’s data through apps and platforms, and working with legal experts to assess whether this complies with laws and regulations. And crucially, this should also involve meaningful consultation with children, and I think we’re gonna talk a little bit later about what meaningful consultation with children really looks like. I’m going to leave it there because I think I’m probably at the end of my time, looking forward to discussing further, thank you.

Moderator:
Thank you very much, Emma. And finally, Theodora, would you like to give us your opening remarks?

Theodora Skeadas:
Yes, thank you so much. Hi, everybody. It’s great to be here with you. Let me just pull up my slides. Great. So, it’s great to be here with all of you, and I’ll be spending a few minutes talking about key international children’s rights principles, standards, and conventions, as well as major issue areas around personal data collection, processing, and profiling, and then some regulation and legislation to keep an eye out for. So, I’ll start with standards and conventions and then turn to some principles. Some of the major relevant standards and conventions worth discussing I’ve listed here. They include the UN Convention on the Rights of the Child, a widely ratified international human rights treaty which enshrines the fundamental rights of all children under age 18. It includes a number of provisions that are relevant to children’s data protection, such as the right to privacy, the right to the best interests of the child, and the right to freedom of expression. Also, the 2021 UN guidelines on the rights of the child in relation to the digital environment. These guidelines provide guidance on how to apply the UNCRC to children’s rights in the digital environment, and they include a number of provisions that are relevant to children’s data protection, like the right to privacy and confidentiality, the right to be informed about the collection and use of data, and the right to have data erased. Then there’s the GDPR, or General Data Protection Regulation. This is a comprehensive data protection law that applies to all organizations that process the data of those in Europe, although its reach is sometimes extended beyond the European area for companies or employers that are international. It includes a number of special provisions for children as well. Then COPPA, the Children’s Online Privacy Protection Act in the US, is a federal law that protects the privacy of children under age 13 and requires websites and online services to obtain parental consent before collecting or using children’s personal information. Some of the principles that are important to discuss here include data collection, data use, data storage and security, data access and erasure, and transparency and accountability. On data collection, organizations should only collect data for legitimate purposes and with the consent of parents and guardians. On data use, organizations should use children’s data in a way that is consistent with their best interests. On data storage and security, organizations should implement appropriate security measures to protect children’s data. On data access and erasure, organizations should give children and their parents or guardians access to children’s data and the right to have it erased. On transparency and accountability, organizations should be transparent about what they’re doing to make sure that they’re protecting children. Additionally, there’s age-appropriate design, privacy by default, data minimization, and parental control. Products and services should be designed with the best interests of children in mind, and be appropriate for their age and developmental stage. On privacy by default, products and services should be developed with privacy in mind. On data minimization, products and services should only collect and use the minimum amount of data required.
On parental controls, products and services should provide parents with meaningful control over their children’s online activities. Major issues around personal data collection, processing, and profiling that are in discussion today include consent: children may not fully understand what it means to consent to the collection and use of their personal data. That’s also true for adults, but it’s especially true for children. Transparency: organizations may not be transparent about how they collect, use, and share children’s personal data, which can make it difficult for parents to make informed decisions about their children. Data minimization: organizations often collect more personal data than is necessary for the specific purpose, and this excess data can be used for other purposes like targeted advertising and profiling. Data security: organizations may not be implementing adequate security measures to protect children’s personal data from unauthorized access, disclosure, modification, and destruction, which can put children at risk. Profiling: organizations may use children’s personal data to create profiles, which can be used to target children with advertising and content that might not be in their best interests. Additionally, strengthening legal protection: there’s an ongoing conversation around how governments can strengthen legal protections for children, such as requiring parental consent and prohibiting organizations from profiling children through targeted advertising. Also raising awareness: there is a huge conversation ongoing now about how parents and children should be educated about the risks and benefits of sharing personal data online, to make sure they’re making informed decisions about what to share and what not to share. Also improving transparency and accountability: organizations should be transparent about how they collect, use, and share children’s personal data, and they should be accountable for that data. And last is designing privacy-enhancing technologies: technologists can design products and services that collect and use less personal data from children, and that help children and parents manage their privacy online. So next, we’ll look at regulation and legislation. We’ve been seeing a huge amount of regulation and legislation in this space. In the U.S. context, we’ve seen some U.S. federal bills, but because those haven’t passed, we’ve been seeing a transition to state-level bills. So this is a piece that I wanted to share that talks about bills in this area in the U.S. It is a compilation of 147 bills across the U.S. states. Not all states are represented, but a lot of them are, and, interestingly, states from across the political divide. You can see here that the legislation in discussion includes themes like age verification, instruction, parental consent, data privacy, technologies, access issues, more age verification, so that’s clearly a recurring theme, recommendations on the basis of data, et cetera. And you can see here there are some categories: law enforcement, parental consent, age verification, privacy, school-based, hardware, filtering, perpetrator, so that looks at safety, algorithmic regulation, and more. And then we can see the methods. These include third parties, state digital IDs, commercial providers, government IDs, self-attestation, and then you can see what ages these are targeting.
So mostly they’re targeting age 18, but there are a few that look at 13, and sometimes other ages as well. And then the final categories of analysis look at access limited content or services, content types, and status. And I think that is it. Thank you so much.

Moderator:
Thank you very much, Theodora. I have received a request from the audience: if you could kindly share the link to the website that you were just showing us, that would be great. It was a very, very good remark. Thank you very much. Okay, so we will now be moving on to the next segment, where I will be directing questions to each of our speakers. We will begin with Professor Sonia Livingstone. While I had a set of questions prepared for you, Professor Livingstone, I think you answered most of those, so let’s pick something from what you focused on in your opening remarks. You mentioned the age-appropriate design code, so I wanted to know your views on age-appropriate design codes for different countries, since what is appropriate for what age differs across cultural, national, international, and local contexts. What would you like to say about that, and how can an age-appropriate design code be the answer in such a context?

Sonia Livingstone:
I think that’s a great question, and I think others will want to pitch in. My starting point is to say that if we’re going to respect the rights of children online, we have to know which user is a child. The history of the internet so far is a failed attempt to respect children’s rights without knowing which user is a child. At the moment, we either have no idea who a user is, or product producers somehow assume that the user is an adult, often in the global north, often male, and rather competent to deal with what they find. So we need a mechanism, and the extent to which age assurance is being taken up in the global north and the global south shows the genuine need to identify who is a child. There are two problems, and one you didn’t highlight: it does mean that we need to, in some way, identify the age of every user in order to know which ones are children. So there’s a question of mechanism, which others have alluded to, and then, as you rightly say, what is appropriate for children of different ages varies in different cultures. I would answer that by returning to the UN Convention on the Rights of the Child, which addresses children’s rights at the level of the universal. But there are also many provisions in the Convention, and also in General Comment 25, about how this can be and should be adjusted and tailored to particular circumstances, not to qualify or undermine children’s rights, but to use mechanisms that are appropriate to different cultures. And I think this will always be contested and probably should be. But at heart, if you read the age-appropriate design codes, they focus on the ways in which data itself is used by companies in order to support children’s rights, rather than setting a norm for what children’s lives should look like.

Moderator:
Thank you very much, Professor Livingstone. That was a very detailed and very nuanced answer. Next, Edmon, since we are on the subject of age: what do you think is a good age verification mechanism, one which does not in itself lead to the collection of more personal data?

Edmon Chung:
Of course, that is a very difficult question, but I guess a few principles to start with. First of all, privacy is not just about keeping data secure and confidential. The first question for privacy is whether the data should be collected and kept in the first place. In terms of privacy, if it is just an age verification, and whoever verifies it discards or erases or deletes the data after the verification, there should be no privacy concern. But of course, platforms and providers don’t usually do that, and that’s one of the problems. The principle should be just like when you show your ID: the person takes a look at it, you go in, and that’s it. They don’t take a picture of it and keep a record of it. So that’s privacy to start with. The other thing, then, is that we need to think about whether the age verification is to keep children out or to let children in. That makes a big difference in how you would then deal with it, especially on whether data should be kept or discarded. Now, on the actual verification mechanism, there are in fact well-developed systems now to do what are called pseudonymous credentials. Basically, the platform or the provider doesn’t have to know the exact data, but can establish digital credentials, using digital certificates and cryptographic techniques, such that parents can vouch for the age and complete the verification without disclosing the child’s personal data. I think these are the mechanisms that are appropriate. And more importantly, I go back to the main thing: if it is just for age verification, whatever data was used should be discarded the moment the verification is done. Thank you very much.
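
To make the pseudonymous-credential idea concrete, here is a minimal, hypothetical sketch in Python of the flow Edmon describes, using the third-party `cryptography` package. Everything specific in it (the anchor key handling, the token format, the 13-year threshold) is an illustrative assumption rather than a description of any deployed system: a trusted anchor checks the real birthdate, signs only an over/under-threshold claim bound to a random pseudonym, and the platform verifies the signature without ever seeing the birthdate.

```python
# Minimal sketch of a pseudonymous age credential (illustrative only; not a
# real standard). A trusted anchor (a parent, school, or other verifier)
# checks the child's actual birthdate, then issues a signed claim binding a
# random pseudonym to a single boolean. The platform verifies the anchor's
# signature and learns only that boolean, never the birthdate itself.
import json
import secrets
from datetime import date

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

anchor_key = Ed25519PrivateKey.generate()   # held only by the trusted anchor
anchor_public = anchor_key.public_key()     # shared with platforms in advance

def issue_credential(birthdate: date, threshold: int = 13) -> bytes:
    """Anchor side: check the age, discard the birthdate, sign the claim."""
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    claim = json.dumps({
        "pseudonym": secrets.token_hex(16),   # random ID, not the child's name
        "over_threshold": age >= threshold,   # the only fact disclosed
    }).encode()
    # Nothing derived from the raw birthdate is stored; only the claim and
    # its 64-byte Ed25519 signature leave this function.
    return claim + anchor_key.sign(claim)

def platform_check(token: bytes) -> bool:
    """Platform side: trust the anchor's signature, learn only the boolean."""
    claim, signature = token[:-64], token[-64:]   # Ed25519 sigs are 64 bytes
    try:
        anchor_public.verify(signature, claim)
    except InvalidSignature:
        return False
    return json.loads(claim)["over_threshold"]

token = issue_credential(date(2014, 5, 1))
print(platform_check(token))   # False for an under-threshold user; no birthdate shared
```

In a real deployment the anchor’s public key would be distributed through some trust framework and tokens would carry an expiry, but the privacy property Edmon describes is already visible here: the platform only ever handles a pseudonym and a boolean.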

Moderator:

That was very comprehensive. Next, Njemile: how is USAID thinking about data governance, especially in relation to children’s data?

Njemile Davis-Michael:

Yeah. We spend a lot of time thinking about data governance, and that’s because data really fuels the technology that we use: a technology either generates data in some way or uses data for its purpose. And technologies have a tendency to reinforce existing conditions, so we want to be really intentional about how data is used to that end. Data governance is important for a few basic reasons. One is because data by itself is not intelligent, so it’s not going to govern itself. And because data multiplies when you divide it, there is so much of it, right? We know that the sheer amount of data we’re generating needs to be wrangled in some manner if we’re going to have some control over the tools that we’re using. So a data governance framework helps us to think about what needs to be achieved with the data, who will make decisions about how the data is treated, and how governance of the data will be implemented. Writ large, we look at five levels of data governance implementation, everything from transnational data governance down to individual protections and empowerment. And that’s really the sweet spot for us in thinking about children: it’s about developing awareness and agency, about participation in data ecosystems. In the middle is sectoral data governance, where we find that there are highly developed norms around data privacy, data for decision-making, and data standardization that help structure data-driven tools like digital portals for sharing data. We are currently working with the Mozilla Foundation on a project similar to the one we heard Emma talking about, where we are working in India’s public education sector to think about data governance interventions there. India has one of the largest, if not the largest, school systems for children in the world: 150 million children are enrolled in about 1.5 million schools across the country. India also had one of the longest periods of school shutdown during COVID-19, and EdTech stepped into that gap very altruistically, right, to try to close gaps in student education. However, as Emma has pointed out and as we have found in our own research, there were some nuances in the ways these EdTech institutions were thinking about student learning compared to the way schools were. Private industry is incentivized by number of users and not necessarily learning outcomes. There needed to be some clarity around the types of standards that EdTech companies are to meet. There’s a reliance on EdTech replacing teachers’ interaction with students, and data subjects generally lack awareness about how their data is used by EdTech and schools to measure student progress and learning. So we’re currently working with a number of working groups in India to really understand how to bridge this gap and to synchronize the collection of data and data analysis in a way that harmonizes analog tools with digital tools. So, teachers who are taking attendance: how does that correlate to scores on EdTech platforms? We’re focused right now on the education sector, but we imagine that this is going to have implications for other sectors as well.
We’re also working in partnership with Mozilla to look at responsible and ethical computer science for university students, also in India and in Kenya. Here, we’re hoping to educate the next generation of software developers to think more ethically about the social impacts of the technology they create, including generative AI. And then, going back to the work on protecting children and youth from digital harm, we are extremely proud to be working alongside and supporting youth advocates through our Digital Youth Council. We have Ananya, who participated in Cohort 1, and Mariam, who I believe was in the room a little bit earlier helping to moderate the chat for today’s session, who are extraordinary examples of the type of talent that we’ve been able to attract and to learn from. In year 2 of the cohort, we received almost 2,700 applications worldwide, and from that number we selected 12 council members. We’re anticipating just as fabulous results from them. So that’s generally how we are thinking about children’s data through our data governance frameworks. Riffing off of what I’ve heard today, we can also advocate through data governance for the inclusion and enforcement of the rights of children in national data privacy laws. Especially as we know from the IGF that lots of countries are thinking about how to develop those privacy laws, we should be advocating for the rights of children to be included. And in civil society, there’s opportunity to explore alternative approaches to data governance. Data cooperatives, which are community-driven, can help groups think about how to leverage their data for their own benefit. Civil society perhaps has room to explore the concept of data intermediaries, where a trusted third party works on behalf of vulnerable groups like children to negotiate when their data is accessed and to enforce sanctions when data is not used in the way that was intended.

Moderator:
Okay, thank you so much. And since Njemile has already touched on bringing in civil society, why don’t we move to Emma Day and ask her the next question. So Emma, how do you think civil society organizations could work with children to promote better data protection for children?

Emma Day:
Thanks so much for the question. And yeah, I think Njemile came up with some really good starting points for this conversation already. I think to involve children, it has to really be meaningful, and one of the difficulties, not just with children but with consulting communities in general on these topics of data governance, is that it’s very complex, and it’s hard for people to immediately understand the implications of data processing for their range of rights, particularly projecting into the future and what those future impacts might be. So to make that consultation meaningful, you have to do a certain amount of education first. One of the great ways to do this is to involve children in things like data subject access requests, where they can be involved in the process of writing to a company and requesting the data that company is keeping on them, so they can see in practice what’s happening with their data and form a view on what they think about that. Children could also be involved in community data auditing processes. There has been some community-based auditing of AI going on, which I don’t think has involved children so far, but obviously older children could get involved in these kinds of initiatives. And I think involving children in conceptualizing how data intermediaries can work best for children of different ages is really important. This is something we talked about a couple of years ago now; I was one of the authors of the UNICEF manifesto on data governance for children, and we had a few ideas in there about what civil society organizations can do to involve children. I haven’t seen a lot of this happen in practice. Another one of the key things that I would like to see is for civil society organizations to involve children in holding companies accountable by auditing their products, by doing these kinds of community-based human rights impact assessments. And I think we need to think about not just the platforms and the apps, but also things like age verification tools, edtech products, health tech products, and tools that are used in the criminal justice system and the social welfare system. Technology products impact almost all areas of children’s lives, and we have to remember that all of these are private sector companies, even where they’re providing solutions that are essentially there to promote children’s rights. We need to ensure that children are involved in auditing those products and making sure that they really do benefit children’s rights. But to do that, civil society organizations need to ensure that they involve academics, technologists, and legal experts, to make sure that they really get it right, because these are complex assessments to make.

Moderator:
Thank you very much. Let’s move to Theodora. You mentioned a lot of the existing international standards, conventions and laws regarding children’s rights and their data. What about the regulations and legislation which are underway to address some of these concerns? Are there any particular areas where these regulations could do better, or any other suggestions that you might have for any such future conventions?

Theodora Skeadas:
Hi everyone. That’s a really great question. Thanks, Ananya. So I’m going to screen share again just so folks can see the database that I was referencing earlier. I think to me it’s not so much that there are specific technical gaps in what we’re seeing. And of course this is a US-focused conversation, and it’s important to mention that there is legislation being discussed globally outside of the US as well, and that legislation happening elsewhere is inclusive of children’s safety issues. For example, in the European Union, transparency-related measures like the Digital Services Act and Digital Markets Act will have impacts on child safety, and the new UK Online Safety Bill, which is underway, will also impact child safety, and legislative discussions are happening elsewhere as well. But within the US, where this data set was collected and where my knowledge is strongest, I think it is pretty comprehensive, although it’s interesting to note that one of the questions I saw in the chat touched on a theme that isn’t discussed in this legislation. Specifically, the question was whether there was legislation related to assistive ed tech in schools. I observed here that there are four school-based policies and two hardware-based policies, but none of them are focused on assistive ed tech. The ones that are focused on schools look more at access, age verification, policies and education, and the hardware ones are focused more on filtering and technical access. You can see those here, like requiring tablet and smartphone manufacturers to have filters that are enabled at activation and only bypassed or switched off with a password. So you can see that there is quite a range. I think to me, the bigger concern is whether this legislation will pass. We see a really divided political landscape, and even though we’re seeing a proliferation of data and data-related issues around children in legislative spaces, the concern is that there isn’t going to be a legislative majority for this legislation to pass. So it’s not per se that I see specific gaps; it’s more that I have broader concerns about the viability and the quality of the legislation, because not all of it is equally high quality. The increasingly fraught political landscape that we find ourselves in is making it harder to pass good legislation, and there are competing interests at play as well. Thank you.

Moderator:
Thank you very much. I would now like to thank all our speakers for sharing their insights with our attendees. At the very same time, I would like to thank our attendees who I see are having a very lively chat in Zoom. Hence, since you have so many questions, why don’t we open the floor for questions from the audience? We would be taking questions from both onsite and online audience. If you’re onsite and if you have a question, you have two stand mics right there. You could kindly go to the microphones and please ask your question by stating your name and the country you’re from, and post that we will be taking questions from the chat.

Audience:
My name is Jutta Kroll. I’m from the German Digital Opportunities Foundation, where we’re heading a project on children’s rights in the digital environment. First of all, let me state that I couldn’t agree more with what Sonia said in her last statement: if we don’t know the age of all users, age verification wouldn’t make sense. We need to know whether people are over a certain age, belong to a certain age group, or are under a certain age. My question would be: given that we need to adhere to the principle of data minimization, have any of you already thought about how we can achieve that without creating a huge amount of additional data? Even the Digital Services Act doesn’t allow collecting additional data just to verify the age of a user. So it’s quite a difficult task, and Edmon has already said that companies should delete the data after they do the age verification, but I’m not sure whether we can trust them to do so. So that would be my question. And the second point would go to the last speaker, Theodora: you gave us a good overview of the legislation, so the question would be how we could ensure that legislation that is underway takes into account the rights of children from the beginning, not like it was done in the GDPR, where a reference to children’s rights was put into the legislation at the very last minute. Thank you for listening.

Moderator:
Thank you very much. Why don’t we deal with the first half of the question? Would any of the speakers like to take that? We will then direct the second question to Theodora. Yes, please go ahead.

Edmon Chung:
I’m happy to add to what I already said. In those cases, then, it’s pseudonymized data, right? Instead of collecting the actual data, it is very possible for platforms to implement pseudonymized credential systems. And those vouching for a participant’s age could be distributed: it could be schools, it could be parents, it could be your workplace or whatever. As long as it is a trusted data store that does the verification and then keeps a pseudonymized credential, the platform should trust that pseudonymized credential. So I think that is the right way to go about it. The other part: as much as I still think it is right to ask for the data to be deleted, can we trust companies? Probably not, but of course we can have regulation and audits and those kinds of things. But the trusted anchors themselves, whether it’s a school or whatever trusted anchor the person actually gives the age verification to, should also delete the raw data and just keep the verification record: verified or not verified. And that’s the right way to do privacy in my mind. Thank you.
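
As a companion to the earlier sketch, and again purely illustrative with invented names, the retention rule Edmon describes for trust anchors can be expressed as a record type that simply has nowhere to put the raw data: once the check is done, only the pseudonym and the verified-or-not outcome survive.

```python
# Illustrative retention rule for a trust anchor (hypothetical field names).
# After the check, only the outcome is kept; the record type deliberately has
# no fields that could hold a name, birthdate, or ID-document scan.
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class VerificationRecord:
    pseudonym: str   # the credential ID later presented to platforms
    verified: bool   # "verified or not verified" is the only retained fact

def verify_and_discard(raw_id_scan: bytes, pseudonym: str,
                       check: Callable[[bytes], bool]) -> VerificationRecord:
    """Run the age check on the raw document, then keep only the outcome."""
    outcome = check(raw_id_scan)   # e.g. a human review or OCR of the ID scan
    # The raw scan is never written anywhere: once this function returns, the
    # anchor's durable state is just (pseudonym, verified). Erasing any
    # upstream copies remains a storage-policy matter, not a code trick.
    return VerificationRecord(pseudonym=pseudonym, verified=outcome)

record = verify_and_discard(b"<scanned ID bytes>", "a1b2c3", lambda scan: True)
print(record)   # VerificationRecord(pseudonym='a1b2c3', verified=True)
```

The design point is that minimization is enforced by what the anchor’s storage schema can hold, not only by a promise to delete.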

Moderator:
I think Professor Livingstone wants to add something. Please go ahead.

Sonia Livingstone:
If I may, yes. Actually, Edmon just said much of what I wanted to say, so I completely agree. And I’ve been part of a European effort, euCONSENT, which is also seeking to find a trusted third-party intermediary that would do the age check and hold the token, so that it is not held by the companies. So I think ways are being found. Clearly, the context of transparency and accountability, and the kind of third-party oversight that scrutinizes those solutions, will really need to be strong, and that also must be trusted. I’d add that I think we should start this process with a risk assessment, because not all sites need age checks, and not all content is age-inappropriate for children. So I would advocate that we begin with the most risky content and with risk assessment, so we don’t just roll out age verification excessively. And I’ll end by noting that big tech already age-assesses us in various ways. I think the big companies already know the age of their users, to a greater or lesser degree of accuracy, and we have no oversight and transparency of that. So I think the efforts being made are trying to right what is already happening, and happening poorly, from the point of view of public oversight and children’s rights.

Moderator:
Thank you. Emma.

Emma Day:
I think this is still a question that everyone’s grappling with, really, and there are differing views in different jurisdictions around how well age verification products comply with privacy laws. I would really agree with what Sonia said about starting with a risk assessment. I think we need to look first at what problem we’re trying to solve, and then ask whether age verification is the best solution, because if we’re going to process children’s data, it should be necessary and proportionate. So we have to look at what other, non-technical solutions there are that might address the problem, rather than just applying age verification across everything. I think also there’s an issue under EU law: pseudonymization is very difficult to say, but pseudonymized data is still personal data under the GDPR, so it’s not that straightforward within the EU to just use pseudonymized data as an alternative. So I think it’s still very tricky, and at the European Union level this is not something that has been settled yet either.

Moderator:
Okay, and Theodora, any remarks from you?

Theodora Skeadas:
Sure. Yeah, I think this is a really great question. It’s not easy to ensure that legislation takes into account the stated rights of children. I would start with education. Frankly, from my experience interacting with legislators, since I participate in the advocacy process, I’ve found that most legislators are just under-informed. So we need to make sure they understand what these rights and principles and standards actually are: what does it mean for the right to privacy to be manifest in legislation, what are the best interests of a child, what is the right to freedom of expression, what do we think about the right to be informed when it comes to children? I think most legislators just don’t really know what those things mean. And so educating them, and in particular building coalitions of civil society actors and multi-stakeholder actors, can be very effective in educating and influencing legislators around the rights of children. And then, as was also mentioned in the chat, I believe Omar just put it in a few minutes ago, including young people in decision-making processes is not just essential, it’s empowering. I think that’s an important part of the process too. Bringing together legislators, the people who are actually writing legislation, and the children themselves is really important, so that the legislative process can be child-centric and really center the voices and experiences of the children we’re trying to serve. And last, I think it’s important to recognize that this needs to be done in an inclusive way, in a way that engages children from all different kinds of backgrounds, so that all different experiences are included as legislation is happening. But again, I think education really is at the core here. Legislators want to hear from us and are excited when we raise our hands. Thank you.

Moderator:

Thank you very much. We will now be taking questions from the online audience. May I request the online moderator to kindly read out any questions or comments that we may have received from the online audience?

Audience:

Hi. So we have two questions from the online participants and two comments. Question one is from Omar, who is a 17-year-old. He asks: how can child-led initiatives be integrated into data governance, ensuring that children have a voice in shaping policies that directly impact their digital lives? He is the founder and president of Project Omna, an upcoming AI-powered mobile app focused on children’s mental health and child rights, and he wants to increase his impact in data governance for children. The second question is from Paul Roberts from the UK, who asks: when it comes to tech companies designing products and services, how common is it for them to include child rights design in their process, and at what stage? Proactively, or as an afterthought for risk minimization? Comment one is also from Omar, who says that he is from Bangladesh and is one of the 88 nominees for the International Children’s Peace Prize 2023 for his advocacy work. He is the founder and president of Project Omna, is also the youngest and only child panelist of the Global Digital Compact sessions, representing children globally, and has provided statements on data protection and cybersecurity for children. His suggested answers to the guiding questions that you started the session with are: one, children’s perspectives are dynamic, and he suggests the use of interactive, story-based digital tools to help children grasp the importance of their digital data and rights, adapting these tools to different age levels. Two, collaborate with tech companies to develop age verification methods that employ user-created avatars or characters, safeguarding personal data; children’s feedback will be instrumental in refining this approach. And three, establish child-led digital councils or advisory groups for direct input into policy decisions; these groups should meet regularly, ensuring real-time feedback from children and aligning policies with their evolving needs and digital experiences. The final comment is from Ying Chu, who says that maybe the younger generations know more about privacy protection and how to protect their data than educators or us. After all, these children were born in the internet age; they are internet kids, while many of us are internet immigrants.

Njemile Davis-Michael:
Okay, I’m going to go ahead and start with the first one, and I would love to see your application, Omar. One of the things that we try to do there is to raise the voice of youth advocates, not just to the level of international development organizations like USAID, but to also empower them to activate other youth networks. So we have a platform that we use to encourage youth advocates to do that, and we try to do it in a way that is inclusive and awareness-raising, and that helps to inspire and incentivize solutions we have not thought of yet. There’s this constant tension between adults, who have the authority to make decisions, and children, who understand what’s best for them but perhaps don’t have the agency to act on it.

Moderator:
Okay, are there any other comments from the panellists? And since we are running short on time, I would otherwise like to move to the next segment. Okay, we see Professor Livingstone has some comments. I would request you to kindly keep it short.

Sonia Livingstone:
Yeah, it’s funny, I’m more familiar with 80 for 30, but I probably have an irritation about social altruism, rightly so. I think the challenge is for those who haven’t yet thought of it or haven’t yet embraced its values. And so my answer to Omar, and also to Paul Roberts, would be to give more emphasis to child rights impact assessments. I think many companies understand the importance of impact assessments of all kinds, and a child rights impact assessment requires and embeds youth participation as part of its process, along with gathering evidence and considering the full range of children’s rights. It is perhaps a mechanism more in the language of companies, and so, if child rights impact assessments were embedded in their process, perhaps by requirement, I think that would make many improvements.

Moderator:
Thank you, Professor Livingstone. As we enter the final eight minutes of this very active and enlightening session, I’m very happy to invite our esteemed speakers to kindly share their invaluable recommendations in less than a minute, if possible. The question for all the panelists is: how can we involve children as active partners in the development of data protection policies to ensure greater… Before I give the floor to our speakers, I would also like to strongly encourage the audience to seize this opportunity and share their recommendations by scanning the QR code, which is right now displayed on the screen, or by accessing the link shared in the chat box. I would now like to welcome Professor Livingstone to kindly share her recommendation once again in less than a minute. Thank you.

Sonia Livingstone:
Well, I’ve mentioned child rights impact assessment, and perhaps that is my really key recommendation. I think what we see over and again in child and youth participation is that children’s and young people’s views are articulate, significant, and absolutely valid. The challenge really is also for us… who are adults. Every time we are in a room or a meeting or a process where we see no young people involved, we must point it out. We must call on those who are organising the events, and that includes ourselves sometimes, to point out the obvious omission and to be ready to do the work to say: these are the optimal mechanisms and here is a way to start. People find it hard, but youth participation is absolutely critical in this domain and is of course young people’s right.

Moderator:

Thank you. Edmon?

Edmon Chung:

I will be very brief. I think a children’s IGF is called for, and that’s the beginning of this wider awareness. And I think it’s about building the capacity as well. I mean, you can’t just throw children into a focus group for two hours and expect them to come up with a brilliant policy decision, right? So it’s a long-term thing. It starts with the internet governance community, and all these processes, actually having children as part of a stakeholder group, and that, I think, is probably a good way to go about it.

Njemile Davis-Michael:
Thank you. I agree with everything that I’ve heard, and I would add that we need to do a better job discussing digital and data rights in formal education institutions. I think we can do a much better job of that globally, so that there’s a welcoming, encouraging environment to hear from children how they would like to advance their digital identities in a digital society, where they have awareness, tools, and opportunities to do so in safe ways, with mentorship and guidance.

Moderator:

Thank you. Emma?

Emma Day:

Thank you, some great suggestions so far. I would like to just emphasise that children are not a homogenous group, and I think it’s really important to centre the most vulnerable and marginalised children, whether within a country or geographically, particularly considering the global reach that a lot of apps and platforms have these days. There’s a particularly great scholar whose work I would recommend reading up on: Afsaneh Rigot’s work on design from the margins, where she talks about how, if products are designed for the edge cases, for the most difficult, most risky scenarios, in the end it’s going to benefit everyone much more. I’m going to share a link to that in the chat, thanks.

Moderator:
Thank you, and finally, Theodora.

Theodora Skeadas:
Yeah, I think this has been reiterated a few times, but it’s worth mentioning again: we really need to be centring the voices of children as active participants in conversations about their well-being. This can be done by including them in surveys, focus groups, workshops, and various methods that are child-friendly. Like I said, in the legislative process children should be empowered to advocate for themselves, especially older children, but children from all different backgrounds, because this is their well-being at stake. I also think that when it comes to companies, I would personally like to see children represented on their advisory boards. That hasn’t traditionally happened, and I put a few of the advisory boards in the chat, because these are ways to elevate the voices of children directly in conversation with the people making policies for the platforms.

Moderator:
Thank you. Thank you very much, ladies and gentlemen. As we come to the end of this enlightening session, I would like to express my heartfelt gratitude to our distinguished speakers for their unwavering commitment to sharing their knowledge and expertise, and for also making our lives easier as moderators, because I see you have been responding to the comments and questions in the chat box. I would also like to extend my deepest appreciation to the very active audience for their extremely energetic engagement and thoughtful participation; without your presence this session would not have been as meaningful. And while we are on the subject of people who have been instrumental in making this session a success, I would like to thank my teammates, the very talented co-organizers of all four workshops that we have hosted during the UN IGF 2023: Keo from Botswana and Nelly from Georgia. I cannot thank you both enough for your exemplary commitment, relentless hard work, awe-inspiring creativity and tireless efforts, in the absence of which we would not have been able to create the impact we have. I want everyone here in attendance to be aware and appreciative of the countless hours, late nights and personal sacrifices this team has made to keep this ship afloat. It was my good fortune indeed to have had the honor of leading this exceptional team, so thank you once again for making this happen. As we conclude this session, I urge all of us to kindly reflect on the insights we have gained and the recommendations put forth. Let us not let this be just another event or seminar, but rather a catalyst for action. It is up to each of us to take the lessons learned today and apply them in our respective fields, organizations and communities. Together we can create a better world for ourselves and future generations, and we are right on time. Arigato gozaimasu, sayonara. Thank you.

Speech statistics

Audience: 155 words per minute; 710 words; 275 secs
Edmon Chung: 142 words per minute; 1963 words; 828 secs
Emma Day: 171 words per minute; 1779 words; 623 secs
Moderator: 164 words per minute; 2378 words; 868 secs
Njemile Davis-Michael: 154 words per minute; 2257 words; 880 secs
Sonia Livingstone: 164 words per minute; 2124 words; 776 secs
Theodora Skeadas: 162 words per minute; 2268 words; 841 secs