WS #134 Data governance for children: EdTech, NeuroTech and FinTech

16 Dec 2024 13:30h - 15:00h

Session at a Glance

Summary

This discussion focused on data governance for children in the context of emerging technologies, specifically EdTech, FinTech, and NeuroTech. Experts explored the risks and benefits associated with processing children’s data in these domains, as well as governance models and regulatory frameworks.

The panel highlighted potential benefits of these technologies, such as personalized learning in EdTech and enhanced financial literacy through FinTech. However, they also emphasized risks like privacy concerns and potential exploitation of children’s data. The importance of multi-stakeholder governance models was stressed, with examples including regulatory sandboxes and public-private partnerships.

Participants discussed the challenges in implementing existing regulations and the need for better guidance for schools and teachers in choosing EdTech products. The conversation touched on the convergence of technologies and the difficulty in predicting future developments, particularly in NeuroTech.

The panel explored the global divide in both technology access and regulatory frameworks, emphasizing the need for a level playing field. They discussed the potential future implications of these technologies, including the possibility of cognitive enhancement and the integration of financial services into various digital platforms.

The discussion concluded by emphasizing the importance of maintaining a balance between innovation and protection in future regulatory approaches. Participants stressed the need for a holistic child rights approach when considering the future of technology and data governance for children.

Key points

Major discussion points:

– Risks and benefits of data processing in emerging technologies like edtech, fintech, and neurotech for children

– Multi-stakeholder governance models and regulatory approaches for children’s data protection

– Implementation challenges and gaps in existing legal/regulatory frameworks

– Future trends and concerns regarding these technologies and their impact on children

Overall purpose:

The goal of the discussion was to explore data governance issues related to emerging technologies that impact children, identify challenges and promising practices, and consider future implications and regulatory needs.

Tone:

The tone was primarily analytical and forward-looking, with speakers offering expert insights on complex issues. There was a sense of cautious optimism about potential benefits balanced with concern about risks. The tone became more speculative and urgent when discussing future trends and the need for proactive governance approaches.

Speakers

– Jasmina Byrne: Chief of Foresight and Policy at UNICEF

– Sabine Witting: Assistant professor for law and digital technologies at Leiden University, co-founder of TechLegality

– Emma Day: Co-founder of TechLegality

– Melvin Breton: From UNICEF

– Aki Enkenberg: From Government of Finland

– Steven Vosloo: From UNICEF

Additional speakers:

– Jutta Croll: From the Digital Opportunities Foundation in Germany

Full session report

Data Governance for Children in Emerging Technologies: A Comprehensive Overview

This discussion brought together experts from various fields to explore the complex landscape of data governance for children in the context of emerging technologies, specifically focusing on EdTech, FinTech, and NeuroTech. The panel, which included representatives from UNICEF, academia, and government, aimed to identify key challenges, opportunities, and future implications of these technologies for children’s rights and well-being.

Benefits and Risks of Emerging Technologies

The discussion began by acknowledging the potential benefits of these technologies for children. Emma Day highlighted the personalised learning opportunities offered by EdTech, such as adaptive learning platforms. Melvin Breton emphasised the role of FinTech in enhancing financial literacy from a young age, including through gamified savings apps. Aki Enkenberg noted the potential benefits of neurotechnology in health and education sectors, such as early detection of learning difficulties.

However, these opportunities were balanced against significant risks. Jasmina Byrne raised concerns about privacy and security risks associated with data collection, particularly the potential for data breaches in educational settings. Melvin Breton warned of the potential for manipulation and exploitation in FinTech, particularly given children’s vulnerability to persuasive design techniques. Aki Enkenberg cautioned about the risk of unconscious influencing through neurotech, especially as it moves from medical to consumer spaces.

Governance Models and Implementation Challenges

A key theme that emerged was the need for multi-stakeholder governance approaches to address the complex challenges posed by these technologies. Sabine Witting and Emma Day both emphasised this point, highlighting the importance of involving diverse stakeholders in shaping governance frameworks.

Emma Day and Melvin Breton discussed the value of regulatory sandboxes as a means of fostering innovation while ensuring compliance with regulations. Day explained that these sandboxes allow companies to test new products or services in a controlled environment, under the supervision of regulators, helping to identify potential risks and regulatory issues before full market deployment.

The discussion highlighted significant implementation challenges, particularly in EdTech. Emma Day noted that the main issue was not necessarily gaps in the regulatory framework, but rather difficulties in implementing existing regulations, particularly at the school level. This emphasised the importance of capacity building and support for educators and administrators.

The cross-border nature of many of these technologies was identified as a particular challenge by Emma Day, highlighting the need for international cooperation in governance approaches. Additionally, the panel discussed the digital divide and its implications for data governance in different parts of the world, recognising that approaches may need to be tailored to different contexts.

Regulatory Frameworks and Gaps

While Emma Day emphasised implementation challenges, Steven Vosloo suggested that existing laws may not fully cover new technologies, particularly in the realm of neurotechnology. This highlighted a tension in approaches to regulation, with some speakers focusing on better implementation of existing frameworks and others calling for new regulatory approaches.

Steven Vosloo recommended that countries conduct policy mapping exercises to identify regulatory gaps, particularly for neurotechnology. This proactive approach was seen as crucial given the rapid pace of technological development and the move of neurotechnology from medical to consumer spaces.

Aki Enkenberg highlighted the challenge of regulating converging technologies that cross traditional regulatory boundaries. He also provided insights into Finland’s approach to data governance for children, which includes strong protections for children’s data and efforts to promote digital literacy.

Jasmina Byrne raised the issue of global fragmentation in regulation, emphasising the need for more uniform safety standards across different jurisdictions. Emma Day noted different approaches to enforcement, with some regulators taking a more collaborative approach while others favoured punitive measures.

Future Developments and Challenges

Looking to the future, the panel identified several key trends and challenges. Aki Enkenberg and Melvin Breton both highlighted the ongoing convergence of different technology domains, with FinTech expanding into new areas such as gaming, the metaverse, and NFTs.

Steven Vosloo raised the possibility of a future divide between “treated, enhanced and natural humans” as a result of neurotechnology, highlighting potential equity issues that may arise from cognitive enhancement technologies.

Emma Day noted the geopolitical influences on EdTech development, highlighting the dominance of American and Chinese companies and European efforts to develop alternatives. This geopolitical dimension was seen as a crucial factor shaping the future landscape of educational technologies.

Throughout the discussion, Jasmina Byrne emphasised the need to shape technology development with child rights in mind, calling for a holistic child rights approach when considering the future of technology and data governance for children.

Conclusions and Future Directions

The discussion concluded by emphasising the importance of maintaining a balance between innovation and protection in future regulatory approaches. The panel stressed the need for adaptive governance models that can respond to rapidly evolving technologies while ensuring robust protections for children’s rights.

Key takeaways included the need for multi-stakeholder governance models, the importance of addressing implementation gaps in existing regulations, and the value of proactive approaches such as regulatory sandboxes and policy mapping exercises.

The panel identified several unresolved issues, including how to effectively regulate converging technologies, address global fragmentation in regulation, and incorporate child rights principles into technology development.

Emma Day mentioned UNICEF’s ongoing work on case studies about innovations in data governance for children, demonstrating continued efforts to address these complex challenges.

In conclusion, the discussion highlighted the critical importance of developing comprehensive, rights-based approaches to data governance for children in the context of emerging technologies. As these technologies continue to evolve and converge, ongoing dialogue and collaboration between diverse stakeholders will be crucial to ensuring that children can benefit from technological innovations while being protected from potential harms.

Session Transcript

Sabine Witting: EdTech, FinTech and Neurotech. My name is Sabine Witting. I’m an assistant professor for law and digital technologies at Leiden University and the co-founder of TechLegality together with my colleague here, Emma Day. And we are joined today by a variety of speakers both online and offline. And I will ask the speakers to introduce themselves when I hand over to them. And I would really like to encourage participation both online and in the room here. Be critical, ask questions. We have brilliant people here who possibly have all the answers; we’ll see about that. But otherwise they will ask you more questions. So I think it will be an interesting session. So let’s get started and let me hand over straight away to Jasmina. She is online for introductory remarks and setting the scene. Jasmina, over to you.

Jasmina Byrne: Hello everyone and good afternoon. I hope you had a productive day of sessions today. Sabine, shall I just… Yeah, I’m Jasmina Byrne, Chief of Foresight and Policy at UNICEF.

Sabine Witting: We can’t hear the online speaker. Oh. Jasmina, just hold on a second. Okay, we can hear you now, please proceed.

Jasmina Byrne: Oh, good afternoon, everyone. I’m Jasmina Byrne. I’m Chief of Foresight and Policy at UNICEF. Shall I hand over to colleagues or proceed with my…

Sabine Witting: No, please go ahead. You’re welcome to set the scene.

Jasmina Byrne: Oh, okay, all right. Thank you so much, Sabine. Well, I hope you all had a productive day at IGF and I’m really sorry I’m not there in person. This is one of my favorite conferences, but you are in really good hands with Emma, Sabine, and Steve, my colleagues. And online, we have Melvin Breton, also from UNICEF, and Aki Enkenberg from the Government of Finland, who is actually our key partner in the implementation of this initiative. And this session today is about rights-based data governance for children across three emerging domains: education technologies, neurotechnology, and financial technology. So we have been working with about 40 experts around the world to understand better how these frontier technologies impact children, and particularly how data used through these technologies can benefit children, but also if it can cause any risks and harm to children. We all know that globally EdTech has been at the forefront of innovation in education. It can help with personalized learning. We see that data sharing through education technologies can improve outcomes in education, facilitate teachers’ sessions, plans, administration, and so many other things. Other innovative technologies like Neurotech are currently being tried in diverse settings, and they offer great opportunities for improving children’s health and optimizing education. Financial technologies as well allow children to take part in the digital economy through digital financial services. But all of these innovative technologies have also created data-related risks, particularly in relation to privacy, security, freedom of information, and freedom of expression. At the same time, as we see a rapid introduction of these technologies into children’s lives, the policy debate is lagging a little bit behind.
So this is why we hope that this initiative and the partnership with the Government of Finland will not only help us identify what the benefits and risks are for children through use of these technologies and data sharing through these technologies, but also help us formulate policy recommendations for responsible stakeholders. And in this case, these are ministries of education, finance, consumer protection authorities, data protection authorities and others. So I’ll hand over to Sabine now to moderate the session and I hope we are going to have a productive discussion. Thank you all.

Sabine Witting: Thanks so much, Jasmina, also for laying out the kind of three blocks that we will be discussing in the session today. So we will first look at the risks and benefits associated with processing and collection of children’s data in these three domains. Then we will look at the governance models and lastly at the regulatory and policy frameworks. So let’s dive right into the first block. And as I’ve mentioned, we want this to be an interactive session. So after each block, we will have a Q&A session. So Emma, maybe I can start with you. As Jasmina was saying, there are lots of risks and benefits associated with data processing in the context of these emerging technologies. And maybe let’s zoom into the first domain into edtech, which I think is the most obvious one when you think about data governance and children. And maybe you can tell us a little bit about the examples that you have where edtech may be used or the data governance may be used for good in the context of children. Thank you.

Emma Day: Yeah, thanks so much, Sabine. So I think you’re probably aware that there’s currently a lot of debate about the benefits that can be derived from edtech in general, including first the pedagogical benefits, so the benefits for teaching and learning. So when we think about data processing, any data that’s collected from children must be both necessary and proportionate for this to be lawful under data protection law. So for edtech to be necessary, it must first serve an educational purpose. And there’s still much debate about to what extent edtech products do serve an educational purpose, and where that purpose has been identified, then it’s also not yet really clear what benefits can be derived from the data that’s processed by edtech. For example, by sharing those data with the school, or with the government to analyse for more evidence-based kind of policymaking. I think there’s still a lack of clarity around exactly what data would be helpful. What are the questions that we’re seeking to answer with these data? There’s much debate about the potential for personalised learning. And this relies on algorithms which learn from individual children’s data and steer their learning to suit their personal learning needs. And data from these kinds of tools can also potentially be shared with teachers. And then perhaps the teachers can identify early which of their students are falling behind. Particularly if they have a very large class of students, they may miss a student, but if they have this, an algorithm can show them which students in their class are falling behind the rest of them. And it may also help them to look at equity, to ensure that girls, children with disabilities, and children in rural areas are receiving the same opportunities as everyone else.
And then finally, on this point, there’s some interesting projects looking at how children can have more agency, so they are actually benefiting themselves and they’re able to share their data for their own benefit in privacy preserving ways. So for example, in the UK, the ICO, which is the Information Commissioner’s Office, has just started a sandbox project with the Department of Education. And this is aiming to enable children to share their education data securely and easily with higher education providers once they reach the age of 16. So I will leave it there. I’m sure there are many other benefits and we’ll let the audience come in with more a little bit later.

Sabine Witting: Thanks so much, Emma, for laying out these benefits in the context of EdTech. And Melvin, if I can hand over to you and maybe you can tell us a little bit about the benefits and risks in relation to the FinTech sector. Melvin, over to you.

Melvin Breton: Thank you so much, Sabine. I think similarly to EdTech, you can really think about all these technologies that are enabling better data processing as sort of double-edged swords. The most obvious way in which FinTech benefits children is in enhancing financial literacy from a young age, right? With the better data collection that you carry out as some of these technologies are being used by children, you can learn about their money habits, and perhaps can have personalized nudges that alert them that they’re overspending in certain categories, that they need to save, or nudge them towards developing healthy saving habits and healthy spending patterns as well, right? So the better the processing using emerging technologies, the better this kind of ongoing, real-time feedback becomes, and it helps kids develop good money management skills. And you can also think, at the intersection of FinTech and EdTech, about using this data to develop purpose-built applications for education in financial literacy. So that’s on the positive side. There are many other applications. If you think about the intersection of public policy and FinTech, the Convention on the Rights of the Child establishes the right to social security and social protection, and there are a lot of applications of FinTech in handling social security and social protection benefits and cash transfers in different contexts. And the better the data processing technologies become, the more efficient and agile the social protection applications of financial technology can become. We’re looking, in different parts of the world, at issues with the population, and you’re also looking at the future of labor markets, where people are talking about universal basic income. How about universal child benefits?
Starting there and seeing how emerging technologies can enable us to make universal child benefits truly universal and much more efficient. So that’s on the benefits side, and there are many more. On the risks, there’s always the risk of exploitation, as with any technology: more information means more opportunities for bad actors to target their attacks at children. And on the downside, instead of better spending habits, you can also push children, and young people more broadly, to overuse some of these financial technologies, sometimes to their detriment. And we’ve seen some alarming cases with, for example, stock trading apps causing mental health issues and harms to children, or to young people, rather. And there’s also the potential for manipulation, for making children buy things that they don’t necessarily need, and making it available for them to buy products and services that are harmful. And then there’s the whole issue of facilitating addictive behaviors through in-app purchases and things like that. So we can get into any of those more, but I’ll just leave it there for the time being. Over.

Sabine Witting: Thanks so much, Melvin, for that. I think that was really interesting to see also how a technology like FinTech that we maybe might not have thought about initially when you think about children’s data also has these risks and benefits. Thanks so much, Melvin, for laying these out. Aki, maybe you can share a few examples and your experience from Finland and this area around the risks and benefits across these frontier technologies. Aki, over to you.

Aki Enkenberg: Yes, absolutely. And I’m very happy to be here. Thanks, UNICEF, for inviting me to be part of the panel. It’s quite a timely issue that does require strong multistakeholder cooperation, and the IGF is a really good platform for taking the debate around these issues further. And we also have to keep in mind, and this is a broader point, that the recently approved Global Digital Compact puts issues around data governance for the first time firmly on the global development agenda. And we should also be mindful of systematically including a child lens in these discussions going forward. But from the Finnish standpoint, looking at what we’ve done nationally, a couple of remarks with a specific focus on the education system. We’ve long recognized that children and youth do need to be considered through specific perspectives in relation to digital technologies, AI and data. This kind of perspective has been part of our national thinking around AI policies and data policies, and we’ve also worked together with UNICEF on these issues, both on AI and data governance, with important benefits for our national policymaking. The tradition has also been that we’ve had strong multistakeholder cooperation in place at the national level to be able to uncover evidence, make informed choices, take informed action, et cetera, in our context. So this realisation that children and youth are at the forefront of the evolving use of new technology is quite crucial, especially in relation to social media. They’re often early adopters of new services, but also potentially less mindful of privacy concerns, less informed about their data rights, perhaps care less about those rights, etc.
And in national policymaking there’s often this tendency to really prioritise the potential and promotion of technology in national AI or data strategies, for example in education or health, but a lot less focus on safeguarding rights or child rights specifically. Children and youth are faced with quite complicated legal frameworks, insufficient understanding of their own rights, social pressures that make it difficult to opt out, etc. And of course when we’re talking about young children specifically, they’re not in a position to make these choices in the first place, so they have to rely on others to make them for them. But in terms of our measures, first I’ll bring up this priority of strengthening the agency of children and youth to kind of regard them as active agents in their own right when it comes to data governance, to support their capacity and competence to act. And this is also something we’ve considered quite important from the point of view of developing democratic citizenship also in Finland. So data and AI literacy as a first step has received special attention in our case. We’ve realised the need to update media literacy education for the data and AI age. There’s a number of research and development projects focusing on developing guidance and approaches for schools and teachers, etc. in this field. And the focus most often is on making sure that child rights are integrated in how schools adapt. and use tech or digital services in their daily operations. There are some flagship projects by several universities, also by Sitra, our national innovation fund, funded by the Academy of Finland, educational authorities, et cetera. For example, there’s a project called Gen-I, funded by the Council of Strategic Research, which focuses on exactly this evolving landscape of data and AI literacy and what it takes to be able to understand the implications on data governance as well. 
But secondly, besides this, there is the realization that this ongoing datafication of schools and educational settings calls for improved standards and certifications for technology. Because when you look at what’s going on in the private sector, there’s an increased focus on measuring the cognitive processes, emotional responses and behavior of children in the different settings where they learn and are being taught. And of course, the key benefit there is that by automating learning analytics, teachers can then focus on student interaction and support individual learning better. But there is this tendency of growing and continuous data gathering, where neurotechnology is also increasingly part of the problem. It provides deeper insight into the processing of information and learning by children, but also raises new questions around how that data is governed. So as a response, our Finnish National Agency for Education is preparing a comprehensive package of guidance at the moment, focusing not only on what children should learn and how they should learn in the digital age, but also on what kinds of tools and services should be used by schools and teachers to ensure the quality and safety of digital content and services, and to engage in regular dialogue with the actors involved in producing these contents and services. And as I mentioned in the beginning, the belief really is that none of this can be done by governments or our authorities alone, but through active cooperation with the research community, edtech companies, schools and parents.

Sabine Witting: Thank you. Thanks so much, Aki, for this intervention, for sharing the experience from Finland. And you provided me with the perfect segue into the kind of second block of the conversation, which is around governance models. You said that none of the stakeholders can do it alone, and I think that holds true for a lot of the topics we’re discussing at IGF, but specifically for these new forms of data governance. And you also mentioned the Global Digital Compact and how the Global Digital Compact is also encouraging this multi-stakeholder governance model. So maybe we can think a little bit about what data governance could look like for these three domains. And of course, when we think about data governance, we first think about the DPAs, the data protection authorities. But of course, this topic is much broader than only focusing on the DPAs. I would like to hear a little bit more about the multi-stakeholder models that can be deployed to govern these frontier technologies. And Melvin, maybe I can start with a question to you in the context of fintech. What are some of the multi-stakeholder governance models that are working in this particular space?

Melvin Breton: Yeah, thank you, Sabine. I think with fintech, it’s particularly complex, right, because financial services are a very established area of regulation. And fintech comes and adds the technological layer on top of that and creates intersections, as I was mentioning before, with edtech, but also with social media and many other environments in which data is being processed. So it needs to be multi-stakeholder if we’re going to have effective governance. There are some examples of public-private partnerships that allow companies to opt in to some sort of more advanced data protection regulations in the context of a regulatory sandbox, to see how that might work. And there are other frameworks, like open banking conglomerates, that allow better sharing of information between financial institutions and the government, and that you can also bring FinTechs into to make sure that all the information is transparent and complies with data governance regulations. So the challenge really is that as you develop these technologies, you’re creating new tools and you’re creating new data that may not be covered by existing financial regulations or data governance regulation. And if you have a very wide-ranging data governance regulation, but the financial sector is operating in a sort of separate environment where data is not flowing from financial systems to the broader government, then you run into a problem where you have, in principle, data regulation, but you don’t know what you don’t know, right? You don’t know what… information is being generated through the use of these fintechs that may be covered in principle by the data governance regulation, but may not be visible to the regulators on the data governance side, and maybe not even to the financial regulator, right?
So with the multi-stakeholder model, since this is such an emerging and rapidly evolving area, we’re seeing the successful use of regulatory sandboxes, as I was mentioning before, where companies can opt in to see how these processes of sharing information and sharing data can balance issues like privacy and governance, but also the efficiency and effectiveness of some of these services. And when it comes to children, right now we are seeing very little in terms of regulatory initiatives in fintech that take children into account specifically. Mostly that’s happening at the level of data governance regulations, and that’s where children are protected. But fintech per se is not yet taking steps to protect data related to children specifically, perhaps because the regulatory landscape is still maturing. So that’s something that we would like to see: open banking conglomerates, public-private partnerships, and regulatory sandboxes for fintech companies to opt in and work closely with the government to see the intersection of data governance regulations, financial regulations, and fintech-generated information and data in the future. So I’ll leave it at that, over.

Sabine Witting: Thanks so much, Melvin. I think we all see this as a very complex issue, and the more we dive into it, the more complex it gets. I think you highlighted the importance of regulatory sandboxes as an innovative data governance model, and also the importance of public-private partnerships in this context. Of course, one player that is very important, especially at a forum like the IGF, is civil society. Traditionally, civil society organisations have upheld the importance of human rights and children’s rights in this context. And Emma, maybe you can tell us a little bit more: what role do you see for civil society in these various multi-stakeholder models for data governance for children?

Emma Day: Yeah, great question. And before I get specifically to that, I just want to loop back to this issue of regulatory sandboxes, because I think these come from the fintech sector, as Melvin is describing, but as part of this project on data governance for children that UNICEF is leading at the moment, we’re producing a series of case studies on innovations in data governance for children. And one of those case studies is going to look specifically at the role of regulatory sandboxes in data governance for children. And I think these are a very promising model of multi-stakeholder governance that could have great potential for the education sector. Now, we see that they’re usually used a little bit more narrowly by regulators, so often data protection authorities will put out a call for applications to the private sector, and private sector companies will then work with the regulator on some of these kinds of frontier technologies like edtech or fintech, or perhaps even neurotech, where it’s not clear yet how the law or the regulation applies in practice, because this is such a new technology. And then there is a set period of time and there’s an exit report, which is publicized usually so that other people in the sector can learn, other companies can learn what are the boundaries of regulation, and the regulator can then learn how they maybe should change that regulation. and move as the tech moves also. But I think what’s most promising is what we’ve seen. There is an organization called the Datasphere Initiative, and they’re looking at the role of regulatory sandboxes much more from this multi-stakeholder perspective. So including also civil society is the missing piece in these sandboxes, working together with regulators and with the private sector on these big questions about how to govern these frontier technologies. What is still missing though is involving children. We haven’t seen an example yet of a regulatory sandbox. 
There are some which are about children, but there are not any which actually involve the participation of children. The other innovative aspect, I think, of this multi-stakeholder regulatory sandbox that the Datasphere Initiative is promoting is that they're looking at cross-border sandboxes as well. Many of these tools, edtech tools in particular, are used across many different countries, and often they're multinational companies, so it's really not a question for one regulator. In fact, it's much better for everyone if these kinds of technologies are interoperable and regulators can come together and tackle these questions together as much as possible, and also involve civil society as much as possible from the regions where the edtech will be deployed. To our knowledge this is not yet happening within the education sector, but it seems to be a very promising model for the future.

Sabine Witting: Thanks so much, Emma. Lots of potential, as you can hear, with the different data governance models. Maybe let us pause here for a second, because this was already a lot of content. And if you were listening to Emma and wondering the whole time what a regulatory sandbox actually is, you are not alone. So before we go into the first block of Q&A, Emma, maybe a quick explanation of what a regulatory sandbox is. Thank you.

Emma Day: Yeah, so a regulatory sandbox is an arrangement usually involving a regulator. Often it's actually a data protection authority, because these sandboxes are usually about data processing, and the data protection authority wants to work with the private sector to explore how the regulation should be put into practice. So to take an example from edtech: say there was a new kind of immersive technology that suddenly became available for education, where children could become avatars and could put on a glove and feel things. There would be some risks and some benefits, and maybe the regulator would want to explore those with the company. And there's always this question of trust, right? The company is worried that the regulator is just going to bring an enforcement action against them. So the sandbox is a kind of protective framework where the companies can explain the technology they're exploring, and the regulator can then interact with them and tell them whether the direction they're going in is going to be lawful or whether they're going to end up in a risky area. In most countries, regulators still will not allow the company to experiment with something that is not lawful or that is actually prohibited by regulation. But it's a way for a product, usually still in the development phase, to get guidance from the regulator on how to navigate that space forward. I hope that makes sense.

Sabine Witting: Thanks so much, Emma. Yeah, so essentially, before you unleash a technology on lots of people, let's first see, from a compliance perspective, what we can do to avoid the most severe adverse impacts, and then strengthen compliance once the product is on the market. So let me stop here. You can also ask another question on regulatory sandboxes in case that wasn't clear. Let me give the opportunity to people in the room to ask any questions on these first two blocks: the risks and benefits of these technologies, governance models and multi-stakeholder models. Any questions from the floor at this point in time? Yes, there in the back. Do we have a running mic? Sorry, can I take yours? Thank you so much.

AUDIENCE: Thank you. This is to Emma. Emma, you mentioned regulatory sandboxes. Do you know which countries or which regulators are great examples to follow?

Emma Day: Thank you. From what I've seen of this particular model of multi-stakeholder governance, which includes civil society, the focus has been in Africa, on health tech, and there have been cross-border regulatory sandboxes that the Datasphere Initiative has been coordinating. The Datasphere Initiative is a third party, which maybe also makes it easier: it's not the regulator who is actually leading the sandbox, and they bring all of the different stakeholders together. The regulatory sandboxes we see more within Europe are generally just the regulator with the private sector, without that civil society piece so far. But if anyone has examples they know of that they want to share, we'd also love to hear more about those.

Sabine Witting: Thanks so much, Emma. Jutta, please.

Jutta: Yes. Jutta from the Digital Opportunities Foundation in Germany. My question goes to Melvin, and probably it's also interesting for the person who was talking about edtech. I just think that children's data in the fintech sector are of huge interest, because children will be the customers of the future. And we've been talking about privacy, but what about the security of these data? How do we make sure that these data are not exploited for any purpose that we don't want them to be? Thank you.

Sabine Witting: Thanks so much, Jutta. I think for Melvin. Melvin, maybe you want to start and then Aki, if you want to add anything to that.

Melvin Breton: Sure. That's the million-dollar question, right? If we knew how to prevent these data from being exploited and used for nefarious purposes, we would probably be doing it already. I think there is a real tension between innovation, the development of new technologies and new applications in the fintech sector, and the protection of data related to children. It's also not clear cut, because a lot of the use of financial applications is not necessarily happening in fintech apps; it's happening in social media apps that have payments enabled or where you can purchase certain items. It's happening in games where you have in-app purchases and loot boxes and all these things that you can purchase from within the game, and that don't necessarily require multiple instances of approval from a parent. You set it and forget it, in a way: the credit card data, or whatever payment form you have, is stored, and then you run with it. And then there are a lot of transactions being carried out by children on platforms and apps that hold the parent's information; think about online shopping platforms, where children often have access to their parent's account to purchase this or that item. That's to say that the information that is generated and collected about children, and from children, in financial applications and financial technologies is scattered. I think regulatory sandboxes for fintech applications are a good first step to see how we can develop ways of handling the dedicated information that's being generated in the context of fintech apps and services; we'll see how that develops. Then there are, as I was saying, the other financial applications of technology that are not necessarily fintech apps, where the conversation is part of a broader conversation about the data being generated and used in those other applications. I mentioned games and I mentioned social media.
There's currently the debate about the Kids Online Safety Act in the US, and I don't know that there's a lot of focus on the financial aspect within that legislation. How can we pay more attention to the financial applications and financial transactions that kids are carrying out outside of dedicated fintech apps, at the same time as we use regulatory sandboxes to try to regulate data within the dedicated fintech apps? I think that's going to be a big question. And that's not even to mention crypto, blockchain and decentralized finance, which is perhaps another can of worms. So I'll leave it at that for now.

Sabine Witting: Thanks so much, Melvin. That raises more questions, but I think one point was very important, because some of you might have wondered how often a child actually makes a bank transfer on an app. The aspect you mentioned about how fintech is embedded in the typical digital environments where children are engaged was a very important point, and then, in a second step, we need to think about data processing, and also secondary data processing and all the problems that come with it. I had two hands up on both sides. Let me give it to Steve first and then to Emma. No, you didn't want to? Oh, sorry. Okay. Emma, please.

Emma Day: Thanks. I wanted to come back on this point about cybersecurity, which I think is a really important part of this discussion. We've been interviewing regulators around the world, so data protection authorities, and it's clear that in really every country it's very common that at the school level there is a big security breach and children's data is leaked, and even at the level of ministries of education. So when we're talking about the benefits of sharing all of this data, the problem may not necessarily lie with the edtech company; it may lie with the school or with the government, in terms of the cybersecurity they've put in place. That's a big part of the picture: enabling a safe and trusted environment in which to implement these new technologies.

Sabine Witting: Yeah, and that comes with accountability for all of the stakeholders involved in the deployment of these technologies, and clear roles for who should be held accountable and how. Any other questions on these topics at this point from the floor? Also online? I don't think we have any questions online. Any other questions from the floor? No? All right, wonderful. Then let's move on to the next two blocks. We spoke about the risks and benefits, and we spoke about governance models, and of course we can't say governance without saying law and regulation, so let's look at that next. When we are looking at these kinds of emerging technologies, the classic conflict comes up: how do law and regulation keep up? Technology is changing all the time, and children's vulnerabilities in this context are changing all the time. So how can we address this? Stephen, maybe you can tell us a little bit more about what you see in the legal and regulatory framework, and how it applies to the field of neurotech, which is the third domain that we haven't spoken about yet. But Stephen, before you go into the regulatory context, maybe explain quickly what neurotech is and how it impacts children.

Steven Vosloo: Thanks, Sabine. That's a great lineup, thank you. And good point, because not everyone knows what it is. Very quickly: neurotechnology is any technology that looks at neural or brain signals and the functioning of the brain or the neural system. It could record those functions, it could monitor them, it could modulate, or it could even, I'm a computer scientist so I think of it this way, write to the brain and to brain data and make some neural changes. It could impact children in many ways. I'll talk a little later about, say, neurotechnology in the classroom to help monitor levels of concentration, which is a kind of monitoring of brain activity, and we've seen examples of this in some classrooms around the world. One other thing: the technology itself is either invasive or non-invasive. The invasive side is what you may have seen with people with very severe neural disorders, like quadriplegics, who actually have a chip implanted in the skull, kind of on the brain, and with their thoughts they can move a mouse, or communicate, or interact with computers. It gives an incredible amount of agency and autonomy to people who are otherwise physically paralyzed. The other side is non-invasive, and this is actually where the space is probably going to go more, and where it will impact children more. It is less accurate than the heavy medical, clinical, invasive side, but it's also less invasive: it could be a headband that you wear, so it's much easier to buy this technology, and again it could look at your levels of concentration and so forth. So, you asked about laws and regulations. Neurotechnology is not advancing in a regulatory void or vacuum. We have existing regulations and existing laws, including the Convention on the Rights of the Child. The question is: do they apply to this frontier technology?
We see, for example, in the UK, that the ICO, which is the data protection authority, has done some research looking at existing laws within the UK to see if they provide cover for neurotechnology, and they're in the investigation phase. The same is happening in Australia, where the Australian Human Rights Commission has been investigating whether the existing regulatory framework covers neurotechnology. So then what is the answer? We've been seeing two camps, and I'll give you some examples. In Europe, the European Parliament also did an investigation and basically found that the existing laws and frameworks do provide enough cover. There's the EU Charter of Fundamental Rights and there's the European Convention on Human Rights. And then, in the context of data governance, there's particularly the GDPR, which probably broadly applies to neurodata, because, I should have said earlier, any kind of monitoring of brain functioning translates into data essentially; that's how you record it and that's how you analyze it. There's also the European AI Act, coming into effect soon, which doesn't speak about neurotechnology directly but, for example, prohibits the use of emotion-detection AI in the workplace and in the classroom, and that would often be captured by a neurotechnology. And here, I also should have mentioned, we see a real convergence of technologies, which is what complicates the space more: neurotech is not new, it's been around since the 70s, but it has recently made real advances, in part due to advances in AI and the ability to process the large amounts of data that are being captured. Other countries have said no, the existing laws don't provide enough cover and changes are needed, and these examples especially come from Latin America.
In Chile, for example, there was a constitutional amendment in the last two years that really picked out the sensitivity of brain and neural data. And there was a world-first case fairly recently in Chile involving a commercial neurotech product. Somebody bought the product and said they were not happy with the terms and conditions, where you don't quite know where your data is going, who is processing it, and who the third parties are. That went all the way to the Supreme Court, and the Supreme Court ruled that the neurotech company needed to cease operation until it addressed that issue. Another country is introducing a broad law that will result in 92 new articles and 35 amendments to existing laws, and these are health laws, laws across a range of sectors, because they also didn't think the existing framework provided enough cover for novel issues around neurotech. And then lastly, in the US, two states, California and Colorado, have updated their personal data privacy and data protection regulations to specifically pick out neural data and brain data. And there the FTC, which is a consumer protection body, has also gone after some companies, actually more for misrepresentation: companies say this product can read your brain data and help you to do X, and it can't really, it's still too rudimentary, so it's misrepresentation. I'll close there just to say that some countries feel there's enough cover and others don't, and the issue seems to be landing in different ministries and being looked at through different lenses. Our recommendation is that all countries should do a policy mapping exercise to look at what exists at the national level, look at the opportunities, risks and emerging use cases from neurotech, and assess whether there is sufficient protection and cover in place.

Sabine Witting: Thanks so much for that explanation and the different examples. You spoke about the convergence of technologies, and I think it's also a convergence of regulations that we see, and how they can be applied, and what gaps we have. And at a very practical level, as you said last: there are different ministries involved, so who is going to lead, on law reform but also on implementation of the laws? In the context of neurotech, for example, is it the Ministry of Health? Is it a data protection authority? Is it a communications regulator? Are all three working together? You mentioned the AI Act in the EU, and how does the AI Act apply together with the GDPR in this context? So I think it's exactly that: a mapping exercise first, to really understand how these regulatory mechanisms all interact. Emma, if we recognize there might be some gaps, even after we look at the convergence of different regulatory frameworks and pull everything we have together, we still have gaps. How are we going to fix this?

Emma Day: I think Stephen's recommendation is a very good one, this mapping exercise, first of all, to see where the gaps are. I would say that in terms of edtech it's actually less about gaps and more about implementation. You can have gaps in putting the frameworks in place, but there is maybe an even bigger gap in implementing the regulations we already have. In the context of edtech, the relevant regulation for the edtech being used at the moment is generally still data protection and perhaps AI regulation, of which we now have quite a lot around the world. Maybe, looking to the future, there will be neurotech embedded in edtech, and then all of the issues Stephen raised will arise, but I think where we need to do the work at the moment is on implementation. If you think of edtech: education in many, many countries around the world is a devolved responsibility, and when it comes to choosing edtech products for schools, it's often teachers or the school management who will choose which products are going to be used at the school level. And they need guidance to be able to make these choices.
They have to think about: is this a good tool for education? What about data protection, cybersecurity, AI ethics? Like Aki was talking about, they've been developing this kind of guidance in Finland. Some of the key tools that can be used for this are procurement rules, where governments decide that if schools are going to procure edtech, it needs to meet certain requirements for data protection, cybersecurity, and even educational value. There can also be, as Aki mentioned, certification schemes, where an edtech company has to be audited and is then certified as meeting these minimum standards. Industry can create standards too, and there can be guidance and codes of practice, and we know that some regulators are starting to work on this for schools. But this is really an emerging area, and I think it's a gap everywhere. Maybe there's also room for regulators not to have to start from the beginning every time: there can be common themes, and regulators can learn from each other. For example, the Global Privacy Assembly has been working with UNICEF on this project of data governance for edtech, and different regulators from around the world are coming together through the Global Privacy Assembly to look at what the common challenges are, and maybe what some of the common solutions could be as well.

Sabine Witting: Yeah, thanks so much, and I think that's a very important point. Usually we think, oh, there is a regulatory problem, we need law reform. But more law, and more specific laws, saying, oh, we need a neurotech law specifically, is often not going to solve the issue, because the issue usually lies in the implementation and application of the existing legal frameworks. And also what you said about procurement rules: looking at these different aspects of edtech, for example, one thing would be to require a data protection impact assessment as part of that, for schools to really actively think about the risks associated with edtech, and to point them towards those risks, because they might just not be thinking about that at all. And also, as you mentioned, the kind of joint thinking through bodies like the Global Privacy Assembly, the IGF and others, on how we can move forward in these spaces. Jutta, I see a question. Please come in. Sorry, can we have a microphone? Jutta is on the move. Stephen, come to the rescue. There we go. Thank you.

Jutta: Yes, I just wanted to refer to Section 508 in US law, which was introduced, I think, 20 years ago, making accessibility a precondition for any procurement. If we had that for all the technology that we've been talking about, making child rights assessments or child safety assessments a precondition in procurement, that would be a good recommendation. Thank you.

Sabine Witting: Thanks so much, Jutta. Yeah, because I think it brings the problem much closer to the people who actually deal with it, right? It is no longer just an abstract data protection issue; it becomes a procurement issue. And procurement is what schools deal with: they know procurement and they know the rules around it. So if you bring the abstract issue of data protection down to that level, it's much more likely that people actually think about it. Thanks so much for that point. Any other questions on this particular block around regulation? What's your experience in your country? Do you see regulatory frameworks? Do you see implementation gaps? What might be required? Any points from the floor? Otherwise, any other examples? Yep. He's moving. Very good. Go ahead, please.

AUDIENCE: This isn't actually an example; it's more just to say how challenging the space is. I really like that point, Jutta, about bringing in a condition for procurement. In the US, the government is such a massive buyer of edtech that this really has teeth and can move the needle. This is more of a challenge, on the point about convergence made just after my remarks: the thing that government ministries and regulators do so badly is work outside of their silo. We all do it badly, even within departments within UNICEF, so I'm not pointing fingers. I'm saying it's a real challenge to all of us when you get technologies or issues like data governance that touch on neurotechnology. Is it an education issue? Is it a health issue? Is it a data governance, data protection issue? It's really going to challenge all of us to think outside of the box, or outside of the silo, and work together.

Emma Day: Just another challenge I see is that there are different challenges in different geographies of the world, and some countries are still struggling with access to the internet, so I think equity is a big challenge. In terms of edtech, you talk to some regulators and really they're just trying to make sure that every school has access to education and to the internet. If you're talking about immersive technologies, the reality is that the infrastructure to support this does not exist in most schools in many, many parts of the world. And then many regulators are not financed; they don't have the resources for that kind of oversight, often over foreign companies deploying their products in their country, possibly financed by development aid as well. It becomes quite a complicated picture. So I think that's where we also need to look at this multi-stakeholder governance model and think about who all the actors are that we need to include, recognizing that procurement may or may not happen at the national level in all countries; it may actually come from a donor as well. So there are different actors who need to be brought into these discussions, I think.

Sabine Witting: And I think what we also see, in the global South context, is that there are competing interests, right? From my experience, what I've heard from many schools is that they say: data protection issues, yes, there might be risks, but that's really not something we can prioritize, because a much more tangible issue here is access to education, and that's what we need to deal with first. I think this always loops back to the problem that children's data governance is an abstract issue; it's not something that a lot of people really see or understand, and that's why it's so easily pushed aside, rather than being considered within the CRC as an equally weighty right. Oh yes, Jasmina, sorry, please interrupt me anytime. Go ahead.

Jasmina Byrne: Thank you so much, Sabine. I was just listening to this discussion about regulatory frameworks and various stakeholders, and I wanted to say that sometimes the policies or strategies that come from different divisions and departments in government could also help us advance any potential work on data governance. I'm now thinking about digital public infrastructure, which is an approach being adopted by many countries and which facilitates government services, with a layer of platforms set up on this digital public infrastructure that includes financial payments, data sharing and digital IDs. When different governments, in collaboration with ministries and the private sector, are developing these strategies, this is where we also need to be vigilant and think about how these data-sharing practices can impact children at all levels. There are currently about 54 such strategies in place, and there is a big push for the adoption of digital public infrastructure across the world. So, to answer your question, Sabine, where are the good examples? I think we probably need to look much more closely at how to engage with those stakeholders who are advancing DPI in their countries and regions, to think about data governance as well, across different domains. Thank you.

Sabine Witting: Thanks so much, Jasmina, for that intervention. There’s another question in the back. I think we don’t have a microphone. Thank you.

AUDIENCE: Thank you. Just building on what Jasmina said, and following on from what Emma said as well: where are the best practices? That's important. Another area I want to emphasize is operational activities, like skills and capacity building for educators. How do they know what good looks like? And then, when we look at strategy, that's at a different level altogether that we need to think about. I don't have the answer; it's just an observation. And it differs in different parts of the world. I come from Australia, and Australia has been strong enough to advocate for child rights and to stand strong against Meta, but not all countries can do that. So it's an interesting and challenging area, but one where we all have to collaborate. I think that collaboration piece plays a very strong role, as well as identifying where the best practices are. Thank you.

Sabine Witting: Thank you so much for that intervention. And maybe, Emma, do you briefly want to speak about the case studies that are looking at these kinds of innovations?

Emma Day: Yeah, I think what you're saying is right, and again it comes back to this question of resources, really. In no country can a regulator, like a data protection authority, have oversight over every tech company operating in its country; it's just impossible, really. But that's why we're looking at innovations in data governance, to find examples of how you plug those gaps. So next year we will publish a UNICEF collection of innovations in data governance for children. Some examples: we had the regulatory sandboxes, but also the certification schemes. Certification schemes are generally led by a non-profit, or even by a company themselves, and they're a way of, I suppose, outsourcing some of that oversight. And there is always a tension, because you can get commercialization of the certification schemes, so it has to be done properly. We're trying to look at some examples, and this case study will try to set out some of the considerations. It's quite difficult to find shiny best-practice examples; we often start looking for those and end up looking at promising practices instead, taking a little bit of what seems good from different examples. So in these case studies that's what we'll be doing, looking around the world, and if anyone has ideas along these themes they want to contribute, we'd love to hear from them. The other case study we're looking at at the moment is on children's codes. There is a UK Age Appropriate Design Code, Ireland has produced a similar code, and there are other codes developing in Indonesia and Australia. So this is our way of looking for best practices, or promising practices, and getting those out there and sharing them.

Sabine Witting: Somebody said online that the captioning has stopped working.

Melvin Breton: I think it's back; it went out for a little bit and now it's back. Could I ask a question? On the theme of regulatory authority: we have all these different tech domains, and we have one issue that cuts across all of them, which is data governance and data regulation. I think something that could be explored is empowering the data governance authorities a lot more within government. If I'm thinking about fintech, you have very strong financial regulations in many countries, and financial regulatory bodies, but it's not so clear that they take advice from data governance authorities, and those authorities often have such a wide remit that it's very difficult for them to give direction that's tailor-made for areas like fintech. So encouraging collaboration between the financial regulatory bodies and the data protection authorities, to develop more tailor-made regulations on data governance for fintech, for neurotech, for edtech, whatever the case may be, might be a good first step. And then, once those regulations are well established, making them more binding. Because it's one thing for the financial regulatory body to regulate fintech, but they may not be applying regulations directed at protecting children's data beyond what is now accepted as the norm, which is that data needs to be encrypted and anonymized. Beyond that, it's not at all clear that the data protection regulations are specific to children's needs across all these domains.

Sabine Witting: Thanks so much, Melvin, for that. And I think, yeah, I see lots of nods here, left and right. You want to add something, Emma?

Emma Day: Yeah, maybe. I think the enforcement side of things is interesting, and different regulators have very different approaches to this. Some regulators see themselves as collaborators with the private sector: they’re balancing promoting innovation in their own country, in their own tech ecosystem, with making sure that the tech companies don’t overstep the mark too much. Often, from that perspective, the regulator will meet with the companies and warn them verbally first. In other countries, the regulators take a much more punitive approach, where it’s more about bringing enforcement actions directly, and they’re not very approachable, and there are pros and cons to each. In other countries, particularly like we were discussing before, where it may even be a foreign company that’s the problem in the country, there are few resources, and it’s very difficult to know how technically this would happen: where would the jurisdiction be, how will they hold this company accountable in their own country? So there are definitely issues related to enforcement and accountability as well, which probably deserve a whole other case study just to try and unpack.

Sabine Witting: Thanks so much, Emma. I think this was a very rich discussion, a very interesting block around laws and regulation. What does a gap actually look like? Do we have a gap around convergence of technologies, convergence of regulatory frameworks, implementation problems? And then, Emma, also what you said about best practices, promising practices, and maybe only practices. So we’re changing the bar, I guess, as we go, but it’s a learning space and we all need to think outside the box. After looking at the risks and benefits, governance models, laws and regulations, which was very much looking at the status quo, maybe we can close the session by looking ahead a little bit, at the next 10 or 15 years and these different frontier technologies, edtech, neurotech and fintech, and really think about what might be the upcoming issues in terms of data governance, because of course we already need to think ahead, predict things, and find solutions as we go forward. Maybe, Aki, I can start with you, just some concluding thoughts on that.

Aki Enkenberg: Yes, thank you. I think it’s been a very interesting discussion so far, and already many of the issues related to the future of these fields and how they should or could be governed have come up, so maybe we can build on those in this final segment. I do agree with, I think, Steven, who raised this issue of convergence earlier, which makes it quite difficult to make predictions about where neurotech or edtech or fintech will go in the next five to ten years, because they interact with each other, right? They merge into each other, and out of these combinations different fields will emerge, different problems will emerge, and so on. Definitely, that’s one key point to watch. Secondly, we can think about technology on its own, and often it’s very useful to make these kinds of predictions. But we should also keep in mind that technology doesn’t evolve autonomously; it’s governed and constantly being steered by governments and other stakeholders in the process. So we should also think about whether we want the technology to evolve, how we can be part of the process, and what role governance plays. On neurotech, quite an interesting field, I think we’ll see a lot of unexpected things over even the next five years. In addition to these leaps in measuring brain or neural activity, there will definitely be a growing focus on acting on humans, acting on the brain or stimulating the brain, and new interfaces for doing this. Many of us have heard about Neuralink, but that’s only one example. I think there’ll be a whole explosion of these kinds of interfaces through which humans and their brains will be acted on. So definitely in the clinical field there’s a lot of potential for these technologies, and that’s already proven, but they will also trickle down to consumers eventually in different kinds of contexts.
And one discussion point today, about the convergence of neurotech and edtech, will be quite important to follow: how these technologies eventually come to schools and classrooms to monitor learning or behavior, but also to stimulate learning and certain types of behavior, which is quite interesting but also quite controversial, I’m sure. There are also downsides from the interface of neurotech and AI: this risk of unconscious influencing for political purposes, for commercial purposes, for marketing, advertising, or changing people’s minds, influencing them while their brains are still evolving in the case of children and youth, is extremely important to keep in mind. As Steven mentioned, the EU AI Act already recognized this danger, and when it comes to regulation at this point in time, it seems wise to focus on the risks posed by specific uses of technology. It will be very difficult to govern or prohibit certain technologies, or allow other technologies, per se, but it will be possible to govern how they’re used and applied, and the EU AI Act’s approach is a good example of this. On fintech, finally, definitely in my mind at least, there is this financialization of everything, the embedding of financial services or a financial angle in every other type of digital service we consume, games, entertainment, social media, et cetera. So we are moving from a situation where we regard fintech primarily as a new means for making payments, saving, investing, and in the future more and more lending, to a world where financial services will be part of every other thing we do. And combined with the very likely scenario where everyone will be quite easily identified online through digital identity systems, this KYC, or Know Your Customer, problem will be less important than it is today.
People can be recognized online, their identity is known, and they’re conducting financial transactions and so on everywhere they go and through different means, not only through specific apps or banks. And then finally, we’ll definitely move into a world where not only our visible behavior, choices and actions will be measured and tracked, but also, more and more, our bodily activity and brain activity. And this will become a focus for data governance too. When we think about how AI is developing, we’re trying to create these independently acting AI agents that are currently learning from what exists, the data that is available online, but in the future there will be a need for these systems to learn from humans directly, from their activities, behaviors, thoughts and so on. So our data, our bodily data, our brain data will become commercially crucial for this endeavor. This really highlights the role of personal data and bodily data in data governance in the future. And then finally, I think it was Jasmina, or Emma, who mentioned this issue of the global divide. Whereas in the global north we’re trying to keep pace with technology and develop very advanced regulations to tackle some of the issues we see, we do have to keep in mind this need to develop a level playing field globally, and to address not only the technology divide but also the regulatory divide. So these are my thoughts. Thanks.

Sabine Witting: Thanks so much, Aki. Well, Steven, good luck following that. Maybe just some concluding thoughts after that very comprehensive analysis.

Steven Vosloo: Thank you, Aki, that was excellent. Yeah, I don’t have too much to add. Aki very eloquently highlighted the technological use cases, but also the broader issues. Maybe I’ll just pick out one quick thing. On neurotech anyway, there’s this move of neurotechnology from the medical space, which is highly regulated and has ethical oversight, into the consumer space. And in many countries, consumer electronics devices aren’t subject to that level of oversight, so there’s clearly a gap there. From a data governance and data protection perspective, that’s a huge area to focus on. But in terms of where the space is going on the consumer side, we will definitely see it in the education space; that’s come up a lot. And this isn’t just me speaking, this is through consultations we’ve done with neurotech experts from around the world. So in the classroom, to support learning, with the opportunities and risks that come with that. But in the home space, cognitive enhancement is also an area to really watch. This is not where you have a neural disorder that gets treated through neurotechnology; this is where you’re healthy, but you can perform better. In our consultations, people from certain countries that are highly competitive in terms of getting into universities and so forth said that you already pull all the levers you can to advance your child, whether it’s through tutors or through medication; you literally look at all your options. If neurotechnology promises that, it is something people will look at. And if it works, it comes back to the equity issue Aki raised: how do you compete in the global south against your peer in the global north who’s just performing so much better? So that touches not just on treatment, but also on enhancement. In one of the consultations, one of the folks said something that was really great.
He’s from Zimbabwe, and he said, you may get a future world where you have the treated, who receive neurotechnology for disorders, and you may get the enhanced, who are healthy. And then we added in the group: the naturals. And this could be the future. Anyway, we’ll leave you on a controversial note.

Sabine Witting: Very good. Thank you so much, Steven. Emma, controversial note?

Emma Day: Well, I think mine might be controversial in a slightly different way. I’d like to go back to what Aki said: there’s obviously a trajectory to the development of technology, but we are governing how that continues into the future. I think sometimes there is a kind of inevitability to what we hear about the direction technology will evolve in, that we’re all going to end up with chips in our brains. But these are decisions that we make, and we can decide what’s in the best interests of children for their education. We can put the guardrails in place, and we can maximize some of the benefits that are being promised here. But we can also decide not to end up with chips in our brains if we don’t want to, at the really extreme end point. Focusing on edtech, I think some of it is also to do with the geopolitics of how this develops. We’re seeing at the moment quite a monopoly by American and Chinese tech companies. There are a couple of big tech companies who deploy their edtech infrastructure around the world, and then at a national level you see, in most countries in the world, an ecosystem growing of apps that plug into those big company platforms for things like language and mathematics, and they’re more culturally and linguistically appropriate. Maybe those ecosystems are going to grow more. You also see within Europe the Gaia-X project at EU level, which is being led by the German government, and the aim there is to find European-level solutions based on secure and trustworthy exchange of educational data, so that they don’t have to use the big tech companies for edtech. So it depends how all of that plays out, and we don’t really know what direction it’s going to move in, but it’s likely to have an influence on the kinds of technology we see and the values that underpin those technologies as well, I think.
Sabine Witting: Thanks, Emma, for that very good point. Melvin?

Melvin Breton: Sabine, thank you.

Sabine Witting: Tell us, more problems?

Melvin Breton: More problems? No, I think it’s useful maybe to think about it in terms of the extensive future and the intensive future of fintech. I think Aki already alluded to some of the extensive future, in the sense I’m using it here, where we’re seeing fintech across an increasing range of domains. We started with just a web or app layer on top of financial services, and now we’re seeing it getting into gaming and into social media, where there are obvious financial applications relevant for children that we haven’t yet completely come to grips with in terms of regulation and data protection, beyond maybe encryption and anonymization, which are still not even applied across the board, but at least we know those two things are important. Then we’re getting into other things like the metaverse, which is maybe an extension of games to social life in a parallel world, where there will also inevitably be transactions. We’re already seeing things like NFTs and digital land that you can purchase, and what kind of implications does that have for children and data? You’re seeing also in social media that financial transactions are becoming public, another source of information about the lives of children that is becoming more and more prevalent. So what are we going to do about that? I think those are very much open questions, not to even mention neurotech; I think the prospect of the intersection of neurotech and fintech is scary to think about, but something we need to keep in mind nonetheless. There is some good news: age detection through AI for purposes of age gating is getting a lot better. I think companies now say that they can detect a person’s age to roughly plus or minus one year just through the use of AI. But then that opens the question: what else does it know about you, in terms of your financial life and the transactions 
you’re likely to make, and what potential does that open for manipulation and exploitation of children? On the AI and fintech intersection front, the algorithms are getting a lot better, for example, for deciding who to lend to, with banking services using that to process more applications for loans and things like that. That has consequences for financial inclusion: it enables more financial inclusion of families that previously maybe didn’t have access to financial services. The technologies themselves allow people to be more integrated into the financial system, so that’s a plus for financial inclusion. But those same tools, if you’re thinking about AI or machine learning algorithms used to decide who does and doesn’t get a loan, can also have another edge, which is that they can lead to financial exclusion, because it’s a lot easier to see who has a risk of becoming non-compliant. So the inequality aspect here is important. Also worth mentioning: whatever applications require connectivity will just compound the digital divide that already exists, so something to think about there. On the positive side of applications, I think social protection and cash transfers for social protection are going to benefit immensely from these new technologies, becoming more efficient, with fewer data requirements and more points of entry. And as things like central bank digital currencies, stablecoins and so on become more prevalent, it’s going to make it a lot easier to expand and scale up social protection systems and transfers, again with the caveat that we need to be conscious of the digital divide. And then on the education front, financial education is going to become, yes, just wrapping up.

Sabine Witting: Yes, thank you.

Melvin Breton: Yes, financial education is just going to allow for a longer financial life. Starting earlier in your financial journey and becoming more savvy is going to be beneficial for children. But again, with a pinch of salt: we need to be careful about the risks. Over.

Sabine Witting: I also love how you just kindly brought in the metaverse and stablecoins. Yes, Jasmina, please answer all of our questions now in the last two minutes.

Jasmina Byrne: Thank you so much. It’s been a great pleasure listening to all of you, and so many fantastic contributions and ideas. We talked about integration of technology, regulation, and multi-stakeholder approaches to these issues. And when we talk about the future, obviously we need to think about how some of these technologies are going to evolve. EdTech is already much more mature; the challenge there is going to be the size of the market. How do we capture everyone who is introducing EdTech tools to the market, but also the piloting of new technologies that is happening? How do we work with those companies who are testing and piloting new approaches and new technologies? In the financial sector, we heard from Melvin that this includes integrating blockchain, crypto and so on. And basically AI integration into everything, which we are going to see more and more in the future. I think a big challenge for all of us is going to be the global fragmentation of regulation, which can lead to uneven safety standards, and standards for children in particular. That fragmentation can potentially lead to a lack of trust in these technologies and their adoption and application for good, since, as we said in the beginning, there are so many benefits. So the question for those of us who are working for children and children’s rights in the context of digital technologies is: how do we even shape the future of technology? How do we use this knowledge and this understanding of implications for children to shape its development? And somebody, I think Jutta, was mentioning standards or recommendations for procurement of some of these technologies, and maybe going even further back towards the development of these technologies and the integration of child rights principles into their development. We also need to think about the future of regulation.
So the future of technologies is one thing, but what are going to be the future approaches to regulating technologies, and how do we strike that balance between innovation and protection? We talked a lot about benefits, we talked about risks, but we need to ensure that future regulation strategies and policies actually create and maintain that balance, allowing for innovation while at the same time safeguarding children. And I just want to end on the child rights note. We haven’t mentioned children’s rights so much. Many of you, particularly online, have worked over the past several years on really integrating child rights into any kind of tech policy. And we heard from Aki about the opportunities under the Global Digital Compact to integrate more effort in relation to children’s data governance. So I would just like to remind everyone again that children’s rights are comprehensive, but they also need to be looked at from both the positive and the protection side. And when we think about the future of tech and the future of technology, that holistic child rights approach, I think, is the best way forward. Thank you so much.

Sabine Witting: Thank you so much, Jasmina, for wrapping up, and thanks so much to the audience here in the room and online, and to the speakers, for a fantastic panel. Enjoy the rest of your day, and good evening to the people here in Saudi Arabia, I think. And we will see you all tomorrow here at the IGF. Thank you.

Jasmina Byrne: Thank you.

Emma Day

Personalized learning potential of EdTech

Explanation

EdTech has the potential to provide personalized learning experiences for students. This can be achieved through algorithms that learn from individual children’s data and tailor their learning to suit their personal needs.

Evidence

Data from these tools can be shared with teachers to help identify students falling behind or ensure equity for different groups of students.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Need for multi-stakeholder governance approaches

Explanation

Data governance for emerging technologies requires a multi-stakeholder approach. This involves collaboration between regulators, private sector, and civil society to address complex issues in data governance for children.

Evidence

Example of the Datasphere Initiative looking at regulatory sandboxes from a multi-stakeholder perspective, including civil society.

Major Discussion Point

Data Governance Models and Implementation

Agreed with

Sabine Witting

Agreed on

Need for multi-stakeholder governance approaches

Importance of regulatory sandboxes for innovation

Explanation

Regulatory sandboxes provide a protected framework for companies to explore new technologies under regulatory guidance. This allows for innovation while ensuring compliance with data protection and other relevant regulations.

Evidence

UK ICO’s sandbox project with the Department of Education to enable children to share their education data securely with higher education providers.

Major Discussion Point

Data Governance Models and Implementation

Agreed with

Melvin Breton

Agreed on

Importance of regulatory sandboxes

Implementation gaps in applying existing regulations

Explanation

The main challenge in edtech regulation is not necessarily gaps in the law, but rather implementation of existing regulations. This is particularly challenging at the school level where decisions about edtech are often made.

Evidence

Example of teachers or school management choosing edtech products without sufficient guidance on data protection, cybersecurity, and AI ethics considerations.

Major Discussion Point

Regulatory Frameworks and Gaps

Differed with

Steven Vosloo

Differed on

Approach to regulation of emerging technologies

Geopolitical influences on EdTech development

Explanation

The future development of EdTech is influenced by geopolitical factors. This includes the current monopoly of American and Chinese tech companies and efforts in Europe to develop alternative solutions.

Evidence

Example of the Gaia X project at EU level, led by the German government, aiming to find European-level solutions for secure and trustworthy exchange of educational data.

Major Discussion Point

Future Developments and Challenges

Melvin Breton

Financial literacy enhancement through FinTech

Explanation

FinTech can be used to enhance financial literacy from a young age. Better data collection and processing can provide personalized feedback to help children develop good money management skills.

Evidence

Examples of personalized nudges alerting children about overspending or encouraging healthy saving habits.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Potential for manipulation and exploitation in FinTech

Explanation

FinTech also presents risks of exploitation and manipulation for children. This includes the potential for bad actors to target children or promote overuse of financial technologies.

Evidence

Examples of alarming cases with stock trading apps addressing mental health issues and harms to young people.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Need for collaboration between financial and data regulators

Explanation

There is a need for increased collaboration between financial regulatory bodies and data protection authorities. This collaboration is necessary to develop tailored regulations for data governance in fintech, particularly concerning children’s data.

Major Discussion Point

Data Governance Models and Implementation

Agreed with

Emma Day

Agreed on

Importance of regulatory sandboxes

Expansion of FinTech into new domains like gaming and metaverse

Explanation

FinTech is expanding into new domains such as gaming, social media, and the metaverse. This expansion raises new questions about data protection and regulation, particularly for children.

Evidence

Examples of financial transactions becoming public on social media and the emergence of NFTs and digital land purchases in the metaverse.

Major Discussion Point

Future Developments and Challenges

Agreed with

Aki Enkenberg

Agreed on

Convergence of technologies creating new challenges

Aki Enkenberg

Neurotechnology benefits for health and education

Explanation

Neurotechnology offers potential benefits in health and education sectors. It can be used to monitor learning or behavior in classrooms and stimulate learning.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Risk of unconscious influencing through neurotech

Explanation

Neurotechnology presents risks of unconscious influencing for political or commercial purposes. This is particularly concerning for children whose brains are still evolving.

Evidence

The EU AI Act’s recognition of this danger and its focus on governing specific uses of technology rather than prohibiting technologies per se.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Convergence of different technology domains

Explanation

There is an increasing convergence of different technology domains, such as EdTech, FinTech, and NeuroTech. This convergence makes it difficult to predict future developments and creates new challenges for regulation.

Major Discussion Point

Future Developments and Challenges

Agreed with

Melvin Breton

Agreed on

Convergence of technologies creating new challenges

Jasmina Byrne

Privacy and security risks of data collection

Explanation

The collection and processing of children’s data through emerging technologies pose risks to privacy and security. These risks need to be balanced against the potential benefits of these technologies.

Major Discussion Point

Benefits and Risks of Emerging Technologies for Children

Global fragmentation of regulation as a challenge

Explanation

The global fragmentation of regulation poses a significant challenge for ensuring consistent safety standards for children. This fragmentation can lead to uneven protection and potentially undermine trust in these technologies.

Major Discussion Point

Regulatory Frameworks and Gaps

Need to shape technology development with child rights in mind

Explanation

There is a need to shape the future development of technology with children’s rights in mind. This involves integrating child rights principles into the development of technologies and future regulatory approaches.

Evidence

Mention of the opportunity under the digital compact to integrate more effort in relation to children’s data governance.

Major Discussion Point

Future Developments and Challenges

Steven Vosloo

Existing laws may not fully cover new technologies

Explanation

Current laws and regulations may not provide sufficient coverage for emerging technologies like neurotechnology. Some countries are investigating whether existing frameworks are adequate, while others are introducing new laws.

Evidence

Examples of investigations by the UK ICO and Australian Human Rights Commission, and new laws in Chile and Brazil specifically addressing neurodata.

Major Discussion Point

Regulatory Frameworks and Gaps

Differed with

Emma Day

Differed on

Approach to regulation of emerging technologies

Need for policy mapping to identify regulatory gaps

Explanation

Countries should conduct policy mapping exercises to identify gaps in their regulatory frameworks regarding emerging technologies. This would help determine if there is sufficient protection in place for children’s data.

Major Discussion Point

Regulatory Frameworks and Gaps

Potential divide between treated, enhanced and natural humans

Explanation

The advancement of neurotechnology could lead to a future divide between those treated with neurotechnology for disorders, those enhanced for better performance, and those who remain ‘natural’. This raises significant ethical and societal concerns.

Evidence

Quote from a participant from Zimbabwe during consultations on the future of neurotechnology.

Major Discussion Point

Future Developments and Challenges

Jutta Croll

Role of procurement rules in ensuring standards

Explanation

Procurement rules can play a crucial role in ensuring standards for child safety and rights in technology. Making child rights assessments or child safety assessments a precondition for procurement could be an effective approach.

Evidence

Reference to Section 508 in US law, which made accessibility a precondition for procurement 20 years ago.

Major Discussion Point

Data Governance Models and Implementation

Sabine Witting

Need for multi-stakeholder governance approaches

Explanation

Data governance for emerging technologies requires involvement from multiple stakeholders. This is particularly important for complex issues surrounding children’s data in new technological domains.

Evidence

Reference to the Global Digital Compact encouraging multi-stakeholder governance models.

Major Discussion Point

Data Governance Models and Implementation

Agreed with

Emma Day

Agreed on

Need for multi-stakeholder governance approaches

Agreements

Agreement Points

Need for multi-stakeholder governance approaches

Emma Day

Sabine Witting

Need for multi-stakeholder governance approaches

Need for multi-stakeholder governance approaches

Both speakers emphasized the importance of involving multiple stakeholders in data governance for emerging technologies, particularly for complex issues surrounding children’s data.

Importance of regulatory sandboxes

Emma Day

Melvin Breton

Importance of regulatory sandboxes for innovation

Need for collaboration between financial and data regulators

Both speakers highlighted the value of regulatory sandboxes in fostering innovation while ensuring compliance with regulations, particularly in the context of emerging technologies.

Convergence of technologies creating new challenges

Aki Enkenberg

Melvin Breton

Convergence of different technology domains

Expansion of FinTech into new domains like gaming and metaverse

Both speakers noted that the convergence of different technology domains creates new challenges for regulation and prediction of future developments.

Similar Viewpoints

Both speakers highlighted challenges in implementing and enforcing regulations, with Emma focusing on implementation gaps at the school level and Jasmina emphasizing the global fragmentation of regulation.

Emma Day

Jasmina Byrne

Implementation gaps in applying existing regulations

Global fragmentation of regulation as a challenge

Both speakers emphasized the importance of proactively addressing regulatory challenges, with Steven suggesting policy mapping exercises and Jasmina advocating for integrating child rights principles into technology development.

Steven Vosloo

Jasmina Byrne

Need for policy mapping to identify regulatory gaps

Need to shape technology development with child rights in mind

Unexpected Consensus

Importance of procurement rules in ensuring standards

Jutta Croll

Emma Day

Role of procurement rules in ensuring standards

Implementation gaps in applying existing regulations

While not explicitly stated by Emma, her discussion of implementation challenges aligns with Jutta’s suggestion of using procurement rules to ensure standards. This unexpected consensus highlights a practical approach to addressing implementation gaps.

Overall Assessment

Summary

The speakers generally agreed on the need for multi-stakeholder approaches, the importance of regulatory innovation (such as sandboxes), and the challenges posed by the convergence of technologies. There was also consensus on the need to address implementation gaps and shape future technology development with children’s rights in mind.

Consensus level

Moderate to high consensus on key issues, with speakers often approaching similar concerns from different angles. This level of agreement suggests a shared understanding of the complex challenges in data governance for children in emerging technologies, which could facilitate more coordinated efforts in addressing these issues.

Differences

Different Viewpoints

Approach to regulation of emerging technologies

Emma Day

Steven Vosloo

Implementation gaps in applying existing regulations

Existing laws may not fully cover new technologies

Emma Day argues that the main challenge in edtech regulation is implementation of existing regulations, while Steven Vosloo suggests that current laws may not provide sufficient coverage for emerging technologies like neurotechnology.

Unexpected Differences

Overall Assessment

Summary

The main areas of disagreement revolve around the adequacy of existing regulatory frameworks and the specific approaches to governance for emerging technologies.

Difference level

The level of disagreement among the speakers is relatively low. Most speakers agree on the importance of addressing data governance for children in emerging technologies, but have slightly different perspectives on how to approach regulation and implementation. These differences do not significantly impede the overall discussion on improving data governance for children, but rather highlight the complexity of the issue and the need for comprehensive, multi-faceted solutions.

Partial Agreements

Both speakers agree on the need for collaboration in governance, but Emma Day emphasizes a broader multi-stakeholder approach including civil society, while Melvin Breton focuses specifically on collaboration between financial and data regulators.

Emma Day

Melvin Breton

Need for multi-stakeholder governance approaches

Need for collaboration between financial and data regulators

Takeaways

Key Takeaways

Emerging technologies like EdTech, FinTech and NeuroTech offer both benefits and risks for children’s data governance

Multi-stakeholder governance models are needed to address the complex challenges of regulating these technologies

Existing regulatory frameworks have gaps in fully addressing new and converging technologies

Implementation of existing regulations is a major challenge, especially in resource-constrained settings

Future developments will likely see further convergence of technologies and expansion into new domains, requiring adaptive governance approaches

A holistic child rights approach is important when shaping future technology development and regulation

Resolutions and Action Items

UNICEF to publish a collection of case studies on innovations in data governance for children next year

Recommendation for countries to conduct policy mapping exercises to identify regulatory gaps for neurotechnology

Unresolved Issues

How to effectively regulate converging technologies that cross traditional regulatory boundaries

How to address the global fragmentation of regulation and create more uniform safety standards

How to balance innovation with protection in future regulatory approaches

How to incorporate child rights principles into the development of new technologies

How to address the digital divide and ensure equitable access to benefits of new technologies

Suggested Compromises

Use of regulatory sandboxes to allow innovation while exploring appropriate governance models

Development of certification schemes as a way to outsource some regulatory oversight

Incorporation of child rights assessments into procurement processes for new technologies

Thought Provoking Comments

EdTech, FinTech and Neurotech. My name is Sabine Witting. I’m an assistant professor for law and digital technologies at Leiden University and the co-founder of TechLegality together with my colleague here, Emma Day. And we are joined today by a variety of speakers both online and offline.

speaker

Sabine Witting

reason

This opening comment sets the stage for the entire discussion by introducing the three key technology domains that will be explored: EdTech, FinTech, and Neurotech. It establishes the interdisciplinary nature of the panel and the focus on legal and technological aspects.

impact

This framing shaped the entire flow of the discussion, providing a structure for exploring data governance issues across these three domains throughout the session.

So we have been working with about 40 experts around the world to understand better how these frontier technologies impact children, and particularly how data used through these technologies can benefit children, but also if it can cause any risks and harm to children.

speaker

Jasmina Byrne

reason

This comment highlights the global, collaborative nature of the research being discussed and frames the key tension between benefits and risks of these technologies for children.

impact

It set up the discussion to explore both positive and negative impacts, leading to a more balanced and nuanced conversation throughout.

I think there’s still a lack of clarity around exactly what data would be helpful. What are the questions that we’re seeking to answer with these data?

speaker

Emma Day

reason

This comment cuts to a core issue in data governance – the need to clearly define the purpose and value of data collection, especially for children.

impact

It shifted the conversation from general benefits to more specific considerations about data utility and necessity, encouraging more critical thinking about data practices.

You can think about, in the application of FinTech, the most obvious way in which it benefits children is in enhancing financial literacy from a young age, right?

speaker

Melvin Breton

reason

This comment introduces a concrete benefit of FinTech for children that may not have been immediately obvious, broadening the scope of the discussion.

impact

It opened up exploration of specific use cases and benefits of FinTech for children, leading to a more detailed discussion of both opportunities and risks in this domain.

We’ve long recognized that children and youth do need to be considered through specific perspectives in relation to digital technologies, AI and data.

speaker

Aki Enkenberg

reason

This comment emphasizes the importance of child-specific considerations in technology governance, highlighting Finland’s proactive approach.

impact

It shifted the discussion towards more child-centric policy approaches and the need for tailored governance frameworks.

Neurotechnology is not advancing in a regulatory void or vacuum. We have existing regulations, existing laws, including the Convention on the Rights of the Child. The question is, do they apply to this frontier technology?

speaker

Steven Vosloo

reason

This comment raises a crucial question about the applicability of existing legal frameworks to emerging technologies.

impact

It prompted a deeper exploration of regulatory gaps and the need for adaptive governance approaches for frontier technologies.

I think it’s less about gaps, actually, and more about implementation. And so I think you can have gaps in putting the frameworks in place, but definitely maybe even a bigger gap in terms of implementing the regulations we already have.

speaker

Emma Day

reason

This insight shifts focus from creating new regulations to the challenges of implementing existing ones, especially in the education sector.

impact

It led to a discussion about practical challenges in governance, such as procurement rules and guidance for schools, rather than just focusing on regulatory frameworks.

So I would just like to remind everyone again that children’s rights are comprehensive, but also they need to be looked at both from the positive and protection side. And when we think about the future of tech and future of technology, that holistic child rights approach, I think is the best way forward.

speaker

Jasmina Byrne

reason

This concluding comment brings the discussion full circle, emphasizing the need for a holistic, rights-based approach to technology governance for children.

impact

It provided a unifying framework for the diverse topics discussed and reinforced the importance of balancing innovation with protection in future governance approaches.

Overall Assessment

These key comments shaped the discussion by progressively deepening the analysis of data governance issues for children across EdTech, FinTech, and Neurotech domains. They moved the conversation from general benefits and risks to specific implementation challenges, regulatory gaps, and the need for child-centric, rights-based approaches. The comments highlighted the complexity of governing frontier technologies, emphasizing the importance of multi-stakeholder collaboration, practical implementation strategies, and the need to balance innovation with protection. Throughout the discussion, there was a consistent focus on the unique considerations required for children’s data, which culminated in a call for a holistic, rights-based approach to technology governance for children.

Follow-up Questions

How can regulatory sandboxes be adapted to include civil society and children’s participation?

speaker

Emma Day

explanation

This is important to ensure a more comprehensive multi-stakeholder approach in developing and regulating new technologies affecting children.

What are some examples of successful cross-border regulatory sandboxes?

speaker

Emma Day

explanation

This could provide insights into how to regulate multinational edtech companies more effectively across different jurisdictions.

How can data protection authorities be empowered to provide more tailored regulations for specific tech domains like fintech, edtech, and neurotech?

speaker

Melvin Breton

explanation

This could lead to more effective and specific data governance regulations for children across different technology sectors.

What are the best practices for implementing existing data protection regulations, particularly in the education sector?

speaker

Emma Day

explanation

This is crucial for addressing the gap between existing regulations and their practical implementation in schools.

How can we address the equity issues arising from the potential use of neurotechnology for cognitive enhancement?

speaker

Steven Vosloo

explanation

This is important to prevent widening global inequalities in education and cognitive performance.

How can we integrate child rights principles into the development of new technologies?

speaker

Jasmina Byrne

explanation

This is crucial for shaping future technologies in a way that respects and promotes children’s rights from the outset.

What approaches can be developed to balance innovation and child protection in future regulation strategies?

speaker

Jasmina Byrne

explanation

This is important to ensure that future regulations allow for technological innovation while safeguarding children’s rights and safety.

Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.