Open Forum #72 European Parliament Delegation to the IGF & the Youth IGF
Session at a Glance
Summary
This discussion focused on the protection of minors online and the challenges of balancing regulation with digital rights and innovation. The panel, comprising European Parliament members, European Commission representatives, and youth leaders, explored various legislative efforts and their global impact.
Key topics included the EU’s Digital Services Act (DSA), which mandates online platforms to assess and mitigate risks to minors, and the proposed regulation on combating child sexual abuse online. The discussion highlighted the importance of age verification measures and the need for effective enforcement of existing regulations.
Participants debated the merits of self-regulation versus legislative approaches, with some advocating for mandatory rules and others emphasizing the importance of preserving innovation and freedom of expression. The potential global impact of EU regulations, known as the “Brussels effect,” was noted, along with the need for international cooperation and consultation.
Youth representatives stressed the importance of including young people’s perspectives in policy-making, given their unique experiences as digital natives. They also highlighted the positive aspects of internet access for education and economic opportunities, particularly in the Global South.
The discussion touched on emerging challenges, such as the use of AI in content moderation and the need for human oversight. Participants also explored the potential for new tools like AI blockers to empower users in managing their online experiences.
Overall, the panel emphasized the need for a balanced approach that protects minors while preserving the benefits of the digital economy. The discussion concluded with a call for ongoing dialogue and global cooperation in addressing these complex issues.
Keypoints
Major discussion points:
– Balancing protection of minors online with digital rights and innovation
– Age verification and content moderation on social media platforms
– Addressing issues like cyberbullying, exposure to inappropriate content, and mental health impacts
– The role of AI and technology in content moderation and age verification
– Incorporating youth perspectives in developing internet regulations
The overall purpose of the discussion was to explore approaches to regulating the internet and social media to protect minors, while still preserving digital rights and innovation. It aimed to gather perspectives from EU policymakers, youth representatives, and other stakeholders on existing and proposed regulations.
The tone of the discussion was generally collaborative and solution-oriented. There was an emphasis on hearing diverse viewpoints, especially from youth. The tone became more urgent when discussing the need for global cooperation and youth involvement in policymaking. Overall, participants maintained a constructive approach while acknowledging the complexity of the issues.
Speakers
– Yuliya Morenets: Moderator
– Tsvetelina Penkova: Head of the European Parliament delegation, S&D group
– Ivars Ijabs: Member of European Parliament, ITRE Committee
– Eszter Lakos: Member of European Parliament, ITRE and AFET Committees
– Fulvio Martusciello: Member of European Parliament, ECON and ITRE Committees, Head of Italian delegation in EPP group
– Pearse O’Donohue: Director for Future Networks, European Commission
Additional speakers:
– Brando Benifei: Member of European Parliament (mentioned but not in speakers list)
– Vlad Ivanets: Internet Society Youth Ambassador
– Fatou Sar: Youth IGF Ambassador, MAG member of Africa IGF
– Dana Kramer: Internet Society Youth Ambassador, Coordinator of Youth IGF Canada
– Peter Kinkway: Representative for Youth IGF Liberia
– Chris Junior: Audience member from Zimbabwe
– Wouter: Audience member from Netherlands
Full session report
Expanded Summary of Discussion on Protecting Minors Online
This comprehensive discussion brought together European Parliament members, European Commission representatives, and youth leaders to explore the complex challenges of protecting minors online while balancing digital rights and innovation. The panel delved into various legislative efforts, their global impact, and the crucial role of youth perspectives in shaping internet governance.
Key Regulatory Frameworks and Approaches
The EU’s Digital Services Act (DSA) emerged as a central topic, with Tsvetelina Penkova, head of the delegation from S&D in the European Parliament, and Pearse O’Donohue highlighting its mandates for online platforms to assess and mitigate risks to minors. O’Donohue emphasized that the DSA requires platforms to implement age verification measures, which are seen as critical for protecting youth from inappropriate content. He also noted ongoing investigations into platforms like TikTok, Facebook, and Instagram regarding age verification and potentially addictive features.
Penkova mentioned the proposed EU regulation on combating child sexual abuse online, further demonstrating the EU’s commitment to addressing online harms to youth. Brando Benifei discussed the AI Act and its role in combating cyberbullying, particularly regarding deepfakes and transparency measures.
However, Ivars Ijabs cautioned that regulation should balance protection with digital rights and innovation, reflecting a nuanced approach to legislative solutions. Vlad Ivanets echoed this sentiment, warning about the risks of over-regulation and emphasizing the importance of self-regulation by platforms.
Dana Kramer provided insights into Canada’s Online Harms Act, stressing the importance of intersectionality in understanding and addressing online harms. This perspective highlighted the need for comprehensive approaches that consider diverse experiences and vulnerabilities.
Global Impact and Regional Considerations
The potential global impact of EU regulations, known as the “Brussels effect”, was a significant point of discussion. Speakers noted that EU policies could influence internet governance worldwide, raising questions about the implications for digital economies in other regions.
Peter Kinkway emphasized the need for youth-centric internet regulation in Africa and the broader global context. Chris Junior from Zimbabwe provided a poignant example of the internet’s positive impact, noting that during the COVID-19 pandemic, much of their education relied on platforms like WhatsApp. This highlighted the importance of internet access for education and income generation in the Global South, underscoring the need to consider diverse global contexts when crafting regulations.
Technological Solutions and Challenges
The role of artificial intelligence in content moderation emerged as a contentious topic. While some speakers highlighted the potential of AI to combat issues like cyberbullying, audience members emphasized the continued need for human review in moderating potentially harmful content. This debate underscored the complex balance between leveraging technological solutions and ensuring fair and effective content moderation practices.
Dana Kramer proposed the development of AI blockers similar to ad blockers, allowing users more control over their online experiences. This idea sparked discussion about novel technological approaches that could empower users while addressing online harms.
Youth Perspectives and Inclusion
A recurring theme throughout the discussion was the critical importance of incorporating youth perspectives in policy-making. Youth representatives stressed that as digital natives, they offer unique insights into the realities of online experiences. This led to challenging questions about whether policymakers are sufficiently equipped to represent youth perspectives in the legislative process.
Wouter, an audience member from the Netherlands, directly questioned the panel on their readiness to incorporate youth viewpoints, highlighting a potential gap in the policy development process. This prompted reflection on the need for more direct engagement between young people, policymakers, and tech industry leaders in shaping internet governance.
Balancing Protection and Opportunity
A key challenge articulated by Pearse O’Donohue was the need to protect children from age-inappropriate content without excluding them from the positive opportunities the internet offers. This delicate balance framed much of the discussion, with participants grappling with how to create safe online environments that still allow for learning, creativity, and digital skill development.
Unresolved Issues and Future Directions
While the discussion made significant strides in exploring the complexities of protecting minors online, several issues remained unresolved. These included finding the right balance between protection and freedom of expression, ensuring global coordination on internet governance, addressing the reliance on internet platforms for education and income generation in developing countries, and determining the appropriate role of AI in content moderation.
Tsvetelina Penkova concluded with a warning about the potential shrinking of the digital economy without a common approach to regulation, emphasizing the need for ongoing dialogue and global cooperation in addressing these complex issues. Suggested compromises included developing user-controlled AI blockers, combining voluntary platform actions with mandatory regulations for sensitive areas, and creating more opportunities for direct dialogue between youth, policymakers, and tech company leaders on internet governance issues.
In summary, this discussion highlighted the multifaceted challenges of protecting minors online in a globally connected digital landscape. It emphasized the need for nuanced, inclusive approaches that consider diverse perspectives, leverage technological innovations, and balance protection with the preservation of digital rights and opportunities for youth worldwide.
Session Transcript
Yuliya Morenets: All right, yes, so obviously with the European Union Digital Services Act and the priorities outlined in Executive Vice President Henna Virkkunen’s mission letter dedicated to advancing digital and governance policies, the discussion takes on critical importance. So drawing on insights from recent legislative developments in Australia and ongoing discussion in Canada, our idea today is to discuss if we need a different regulation and if the existing regulation can address issues like cyberbullying, mental health impacts and platform accountability. So once again, I will be moderating this session and we have a number of guests present in the room with whom we will be discussing, members of the European Parliament delegation that we would like to thank for accepting this invitation and for being with us today. We have Tsvetelina Penkova, head of the delegation from S&D. I don’t know if you would like just to, yes. We have Eszter Lakos, I hope so, in the room, Fulvio Martusciello, Silvia Sardone, Dominik Tarczyński, Tobiasz Bocheński, Ivars Ijabs, and we have Brando Benifei, who I hope is present together with us today, this afternoon. We have, of course, with us, Pearse O’Donohue, Director for Future Networks at DG Connect. Together with the European Parliament delegation that we thank for being present today, we have a number of young leaders who will be helping us to understand the three cases we prepared for you today for the discussion. We’re supposed to have Denia Psarou online from the Greece IGF. By the way, we have a solid number of people present online and following this discussion. We have Vlad Ivanets, apologies, from the youth community present on site, Peter Kinkway from the Liberia Youth IGF, Fatou Sar from the youth community, Levi Sianseki helping us from the Youth IGF Zambia, and a number of other young people that we hope will participate in the discussion. 
So as we know, Australia passed the world’s first law banning under 16 years old from being present on social media. We know that at the beginning of this year, Canada introduced the Online Harms Act with enhanced protection on social media services. Probably the very first question we would like to bring to the members of the European Parliament, but also to the youth community, we do know that EU is quite often seen as a kind of Silicon Valley for regulation, right? Will the European Union follow the example of Australia, or will take another route and lead in this area of protecting kids and children online, but at the same time balancing with digital rights? I would like to give the floor to the head of the delegation from the European Parliament. The floor is yours. Perfect, thank you. I hope you can all hear me.
Tsvetelina Penkova: Thank you, thank you, Julia, for the short introduction. We are having a packed room here in Riyadh, and there are a lot of topics that we are going to be happy to discuss, but we also want to hear the young people’s perspective that we have here in the room. Around me, I have my colleagues from the European Parliament, and I’ll give them the floor to introduce themselves briefly, because we’re all coming from different political groups, and we’re coming from different member states, and of course, the national and the political perspective does matter when we’re having those debates, because as you know, a lot of the legislations that we are passing, they are on a consensus basis at the European Parliament, but still, it’s important to hear who we are exactly. And also we have with us the European Commission. I mean, if you say that the Silicon Valley of regulation is the European Union, probably the rules are starting from the European Commission, so that’s why they would also have an active role in this debate. So before we jump into the topic, because you’ve already very specifically presented that, I would just ask my colleagues to present themselves. And also a remark, because I know that you’re referring specifically to Brando, he’s joining us in a bit, he’s just participating in another debate on AI, because AI seems to be one of the main and the key topics of this IGF, but as the main expert in the EU at the moment, he’s going to join us shortly. So don’t be afraid that you’re going to miss the insight from the originator of the legislation. And now I’m going to start from my left with introducing my colleagues from the European Parliament.
Ivars Ijabs: Thank you very much, Tsvetelina. My name is Ivars Ijabs, I’m a second-mandate MEP. I work with energy, industry and technology in the ITRE Committee in the European Parliament. I’m looking forward to having a fruitful debate with you tonight.
Yuliya Morenets: Thank you.
Eszter Lakos: Nice to meet you, my name is Eszter Lakos, I’m Hungarian, a first-term MEP, and like the two of them I’m a member of ITRE, so Industry, Research, Energy, et cetera, and also AFET, which deals with foreign policy.
Fulvio Martusciello: I am Fulvio Martusciello from Italy. I have been a member of the European Parliament since 2014, and I am now a full member of the ECON committee and the ITRE committee. I am also the head of the Italian delegation in the EPP group.
Pearse O’Donohue: Good afternoon. I am not a member of the European Parliament; I am an official of the European Commission. My name is Pearse O’Donohue, Director for Future Networks, and my work brings me into direct involvement in the next generation Internet, including particularly in this case the governance of the Internet, which is why I am here at the IGF. Thank you.
Yuliya Morenets: Perfect, I hope this mic is working. Can you hear me? I see the nodding in the room, perfect, so we can use the other one for the rest of the people here. Would you like to say a few words about yourselves and introduce yourselves, please?
Vlad Ivanets: Sure, hello everyone, good evening, thank you for being here, for joining us on site as well as online. My name is Vlad Ivanets, I am this year’s Internet Society Youth Ambassador, but during this session I hope I will be able to present my personal opinion on the legislation that affects children and the youth population. And I will pass the mic to my colleagues.
Fatou Sar: Good evening everyone, I am happy to be here with all of you. My name is Fatou Sar, I am a current Youth IGF Ambassador, I am an engineer in green hydrogen and energy, and one of the MAG members of the Africa IGF. Thank you.
Dana Kramer: Hello, my name is Dana Kramer. I am an Internet Society Youth Ambassador, and I’m also the coordinator of Youth IGF Canada. In Canada, and in our Youth IGF in particular, we’ve been collaborating a lot on the Online Harms Act with our parliamentarians, and that was actually our keynote speech at our Canada Youth IGF in September. And I’ve been very fortunate to have lots of communication on that specific type of legislation.
Peter Kinkway: Okay, thank you so much. My name is Peter Kinkway. I’m the representative for the Youth IGF in Liberia. Basically, we’ve been involved in the youth IGF space in the Mano River Union, which comprises Liberia, Sierra Leone, Guinea, and Côte d’Ivoire.
Tsvetelina Penkova: Perfect. Thank you so much. It was important to know who we are speaking with. So, on the question that was posed initially by Julia, I will start with some initial remarks, and then I’ll ask my colleagues from the parliament and from the commission to jump in at any point. And of course, after you’ve heard the legislators’ stand on all those matters, we would like to hear your feedback, or how you see those topics developing or evolving, or what you expect more from us. Because I know that you are the people who would have the most impactful and important insights on what is working and what is not. So as you know, in a lot of those digital policies and regulations we are working on, one of the main challenges we are facing at the European Parliament and the European Commission is basically to balance protection with digital rights, because we want to foster innovation, but at the same time, we don’t want to limit it with too many restrictions. As you know, in a lot of the topics, this is probably going to be one of the most challenging aspects: how to do this and how to achieve it. So in her speech in July this year, the President of the European Commission, Madame von der Leyen, underlined as one of the main priorities that we need to work more in order to tackle social media addiction and cyberbullying. So in a lot of the legislation we’ve been doing in the last five years, and in the legislation upcoming for this new mandate of the European Parliament and the European Commission, we’re going to try to take into account this premise. I’m briefly going to give an idea about three specific legislations and examples that are either already finalized or in the process of being finalized, to set the ground for some specific conversations here with the audience and with all of you. So the first one is the EU proposal on combating child sexual abuse and exploitation. 
So this was proposed by the European Commission as a regulation two years ago, already two and a half, in the spring of 2022. So it does provide some very specific proposals and mandates when it comes to the detection of online sexual abuse related to minors and children. At the moment, this regulation is under discussion between the European Parliament and the Council. So we still have a lot to say on that matter. But as I said, some of the subjects are very sensitive, so that’s why they’re taking a while. So this is the first one we put on the floor. Brando has just joined us. Everyone was expecting you in relation to the AI, which is probably going to be another topic we’re going to discuss later on. The next file which I’m going to emphasize is the Digital Services Act. I’m sure that quite a lot of you have heard about it, with Europe being the Silicon Valley of regulations. In that one, we have quite specific obligations on the online platforms to respect some of the users’ fundamental rights. I’m not going to list all of them, but just a few: the right of freedom of expression is there, of course, but we also have the right of protection for children, the right not to face discrimination, the right to protection of personal data, and the best interests of the child principle. So as you see, we are trying, in a regulatory framework which is restrictive to a certain extent, to still protect the rights of all the users, with a special emphasis on minors and children. And last but not least, the regulation on age verification without data disclosure. This is again very much targeted to avoid any violation of the rights of children and minors. So we are trying to touch upon every aspect that could help us prevent harmful behavior against people who might not be that well informed. I will stop here now, because, as I said, if we go into the depth of all those regulations, we might take too much time. And I’ll ask if any of my colleagues, including the European Commission, want to jump in.
Pearse O’Donohue: Wow, thank you. Apart from the protocol, I swore I wasn’t going to speak first, because I know that the members of Parliament have a very strong view on this. And by the way, I don’t intend to speak with regard to the child sexual abuse proposal online, for the reason that it is, as you’ve said, now in the hands of the co-legislators. This is for parliamentarians to discuss with the Council, and the Commission takes a back-seat role at that stage. But thank you for your very quick run-through of some of the key issues. I would like to focus a little bit on the Digital Services Act, because, particularly as we’re here in a global environment, we do like to learn from others, and we particularly want to hear from the youth community today, but hopefully we can also share some experiences and examples that will help other regions to address this specific issue of the protection of minors, but also creating an environment in which younger people can effectively operate online, in the environment which will be their environment after I’m long gone. And the protection of minors is, of course, a key enforcement priority for the Commission, for the European Parliament, and of course for the… It doesn’t work. Yeah, sorry, I’ve been in this room a few times. It will cut out regularly, so we’ll just keep going. But with that consensus, however, there was quite a discussion about how to do things. Now, already, the Chair has given you a quick list of what the DSA does in terms of ensuring a high level of privacy, including the bans on what platforms cannot do, the bans on what they should not do, as well as responsibilities with regard to flagging illegal content and banning dark patterns. But we have also created a situation in which online platforms and search engines have to take responsibility. So they have to assess systemic risks that arise from the design and the use of their services. 
And that’s increasingly what we will see across all of the… a range of new technologies, but also new platform services as they come online. And that is a way of not over-regulating: through principles which are agreed, and in many cases corrected and drafted by the European Parliament, and which, if they are not respected, trigger enforcement action; then there is mandatory action that steps in. But in this case in particular, they have to look carefully at the features and the use of the services with risks that affect children, the physical and the mental well-being of the users. And that includes foreseeable negative effects, not just those that may already have occurred. So it has to be forward-looking and, again, looking at the mental as well as the physical well-being of this particularly vulnerable group, which we have a duty to protect. And when those risks are identified, the platforms have to put in effective mitigation measures. So we’ve already started implementation. I’ve got a long list, it’ll take me a quarter of an hour to read the entire list of all of those measures, but no, I won’t. But we have started four investigations, which, once they reach a certain maturity, are then published. The early stages are not, because of the confidentiality of the investigation. But it is our intention under the DSA to move as quickly as possible to the publication and the information, the transparency, of those proceedings, because in some cases it’s in the interest of the platforms if they can show that they have rapidly addressed a problem, but also to the community at large: this is a function or a feature that has been identified as just not acceptable. And therefore, this is part of their forward-looking work. If one of your competitors has been told to stop a practice, well, then you should be sure that that same practice is not acceptable on your platform. So that’s another way of reinforcing it. 
We have, for example, opened cases against TikTok, two cases, one against Facebook, one against Instagram. We’ve expressed our doubts about the way that the platforms assure themselves of the age of their users. So age verification, and I’ll come back to that in a moment, is essential. And we’ve even gone to further proceedings against TikTok, for example, on what is called the TikTok Lite Rewards Program, which was, in our view, something that could aggravate the addictive character of their service. And TikTok has actually committed to permanently withdrawing the program from the EU, commitments that we have made legally binding. Now here I just open a parenthesis, and I’d love to hear the members of Parliament, but also our youth forum representatives: is it acceptable that it’s only stopped in the European Union? Is there a different threshold for the protection of children in other regions? I wouldn’t think so. We don’t want to impose solutions on anyone. This is where cooperation, exchange, and learning lessons from one another is very important. Maybe I’ll stop there. But there is another area, though I won’t touch specifically on the CSAM: there are also provisions with regard to age-inappropriate content, in particular pornographic content, which can have a very significant effect. Of course, here I’m not talking about pedo-pornography, which is quite simply illegal; there is no question. What I’m talking about here is the facility with which any user, without the proper safeguards, can access what is considered to be legal, which is perhaps available to, and who knows, appropriate for adults. But it is certainly, in our view, not appropriate for minors and children. And that is, therefore, another element which is addressed in the DSA. So I’ll just stop here. We’re moving on to guidelines on the protection of minors under Article 28 of the DSA. 
One of the many areas in which we introduce these guidelines, and where, of course, we will have reviews, we will no doubt have discussions with the European Parliament. If they’re not working, we will then have to move on to stronger measures. I’ll stop there. Sorry if I’ve been too long.
Yuliya Morenets: No, perfect. Thank you. Thank you, Pearse, a lot, for outlining a few more very specific parts of the legislation and what is in the pipeline. I would also be curious to go back to our youth panel and hear what they think about the application and what we have already in place. Is it understandable? Is it reachable? Does it come to you? But before that, before we move there, Brando, did you want to take the floor?
Brando Benifei: Yeah, I can add one topic to the floor. I don’t want to take too much time. In fact, I arrived late because I was in another seminar talking about the same topic. So this discussion on the protection of minors and the empowerment of children’s presence in the digital space is very much under the spotlight. Just one thing on the AI Act, because the protection of minors was already touched upon, and how it interacts with the Digital Services Act, the child sexual abuse material legislation, et cetera. But I want to highlight one point, which I want to be sure we put on the table, which is the fight against cyberbullying, which is crucial. And that’s where the AI Act can give further support, because for sure we use the Digital Services Act to act against material that can provoke instances of cyberbullying. But we also have the issue of material that is more difficult to identify as offensive or violent, which can anyway provoke mental health issues and cyberbullying in a more subtle way, by showing people doing things or saying things they could be ashamed of, in a way that is very specific to that situation. And that’s difficult to catch with the existing norms. So that’s why I think it’s important to underline that the AI Act, which I want to underline was supported very broadly, I see different political groups here that supported our work on AI, because with this we have given more transparency that can be used to prevent cyberbullying. I’ll give you an example and I’ll stop. If you produce a deepfake that shows a person, a child, a minor, doing or saying things that can be mentally damaging for them, because they are not doing these things, they are not saying these things that they are shown to, and you send this material around, this can be dealt with through the Digital Services Act, but not necessarily; it depends. 
So it’s very important that we have basic transparency for a deepfake of this kind: we in fact discourage generative AI systems from developing the kinds of material that can be offensive, but we also want such material to be labeled with so-called watermarking, so that people can say, okay, this is fake, this is not real. And then it can be removed, then it can be treated. But we also say from the beginning, this is not real, and that can be helpful to avoid forms of mental health problems, of cyberbullying, of offensive material in different ways. So this is another safeguard the AI Act puts in place that I think is important to underline in the context of the generative AI that has created new challenges that we need to tackle. Thank you. Hello, thank you.
Dana Kramer: Dana Kramer, for the record. First off, I want to just say thank you for including youth in this discussion, and especially youth from different areas of the world outside of your jurisdiction, because with the Brussels effect that we know occurs in the digital area, it’s really important for us to be able to say: it’s a real power to have, and it requires global consultative efforts. And so I just want to extend that thanks. With that said, I think some of these policies, and the deliberation on them, require immense levels of reflection about how they would impact the rest of the world. For instance, on minors’ access to pornographic material: in Canada this past year, we actually had a bill proposed in our Senate, to then move through the different chambers, that would limit adolescents from being able to access porn. However, in so doing, there were concerns that the age verification would have privacy implications for understanding what somebody’s age was. As well, if certain business models were built, from an infrastructural perspective, on having cached content at internet exchanges or content delivery networks, that caching could then result in different businesses pulling out of our market out of fear of regulation. Netflix is a really good example of this. And so if the Brussels effect hit us in Canada, as an example, and we had to create different legislation for that, we could legitimately see our digital economy shrink because of poor implementation. And so this global consultation effort, I think, is really important for ensuring that such issues do not arise for third countries, just in an effort to have positive business and economic relations with Europe. I also want to touch on the issue of child sexual exploitation online. 
And in Canada, of course, we’ve had the Online Harms Act that was mentioned by the opener. Just in the past week it has been separated into two acts for political reasons. That act was specifically designed to have a child safety element and also a hate speech element, both of them, because the Online Harms Act came out of a four-year consultative effort, so there was a lot of engagement around it. We found that in Canada we needed to apply what we call a GBA Plus analysis, or gender-based analysis, with the “plus” added to recognize groups with intersectionalities. Intersectionality, broadly understood, means that different levels of social stratification, gender being one, racism another, ableism another, sexuality if you are LGBTQ+ another, can intersect so that a youth who experiences some type of exploitation online can have that harm compounded, which is why hate had to be included in the act. So I wanted to address that as well: the importance of taking intersectional approaches to understanding how a bill, or any type of legislation, can impact a young person, because, for example, a young white man is going to have a very different experience on the internet than a Black Muslim girl. Thank you.
Ivars Ijabs: Thank you very much. It is really a pleasure to have you all here, because I think this topic is extremely important. First of all, it deals with youth and children as a possible identity group, and there is a problematic dimension to that, because we grow up: the generation growing up right now is of course digitally much more skillful, much more native, and we expect the next generation, as digital natives, to be capable of things that my generation, I am 52, is simply not used to. That is why I think we should look at this issue of regulation from the perspective of learning. This applies not just to the digital sphere but also to the physical sphere, because there are strong parallels: in all digital legislation, things prohibited in real life, like sexual exploitation, bullying and so on, should also be banned or prohibited in digital life. But looking at the learning dimension, I think it is really important to create a safe learning environment for our younger generation and children, because of course we learn by making mistakes, and that is exactly why we prevent children from making very big ones. In physical life we create safe learning environments for our children, so when we think about how to regulate the digital sphere, the possibility to learn must remain, and that is why the EU always runs the risk of over-regulating, which in many ways also hampers our digital development compared with some other regions.
In that sense, I think we should keep that learning dimension safe for youngsters, who will be much more advanced in the digital sphere, because they are already the next generation when dealing with AI, with the Internet of Things, and so on. But at the very basis, we have to solve the issues already mentioned as basic norms: cyberbullying and peer-to-peer violence should be prevented, as should exposure to hate speech and violent content, exposure to content inciting self-harm or suicide, which as we all know is a big issue in many countries, and extremism and terrorism. At the same time, we have to keep in mind that we expect the next generations to be digitally more skillful and advanced than we are. Thank you very much.
Yuliya Morenets: Tsvetelina, can I maybe, we have a lot of questions online. My apologies for the headset; we were not expecting these to be required.
Tsvetelina Penkova: Sorry, Julia, I didn’t hear you quite well, I was just about to pass you the floor.
Yuliya Morenets: Yes, apologies for that. We have a number of questions online; we have solid participation, to be very honest, around 30 people or more. One of the questions that just came in is about the DSA, since that was the discussion just now: that age verification is hardly checked. Katrin Moresh is bringing the question, but maybe she would like to take the floor remotely and ask it herself. Can we give the mic to the online participants and allow Katrin to ask her question? Okay, if it doesn’t work, the question is: should platforms be forced by the DSA to have mandatory initial age verification? I think that is open to all participants.
Pearse O’donohue: Thank you. Hearing the inputs before with regard to that, yes, I can go into a bit more detail, but I’ll try to be brief. Some of the providers of pornographic content have been designated as very large online platforms, and so we have now begun an inquiry with them specifically on the measures they take to diligently assess, but also effectively mitigate, the risks relating to the protection of minors. That obviously starts with age verification, because in many cases we are talking about age-inappropriate content. We were particularly interested in the details on age verification; we have their responses, and now we are looking to take effective enforcement action. We haven’t had the legal means to do so until now, but we are coming to that point, because we don’t want to exclude children from the positive opportunities of the internet, but we do want to protect them from age-inappropriate content. So age verification specifically is a critical component. We now have the powers under the DSA to impose it as a protection, and what we are doing is, as well as insisting that the very large online platforms enforce it, we will ourselves come forward with the member states with a temporary solution, which we will then finalize once the European Union Digital Identity Wallet is fully functioning. In the meantime we will have a privacy-preserving and interoperable solution to age verification. I’m not talking about principles or a piece of paper; we are talking about a functioning piece of software which it will be obligatory for them to use if they do not have their own mechanisms of equal effectiveness. So that is the immediate way forward we see for age verification.
As I said in the long run we have the European Union Digital Identity Wallet which will be a way of ensuring on the basis of approved independently certified systems that the person is the age that they say they are and need to have in order to access these platforms. Thank you.
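The privacy-preserving approach described, proving an age attribute without revealing identity, can be illustrated with a toy attestation flow. This is a sketch under stated assumptions, not the actual EU Digital Identity Wallet design: a real system would use asymmetric signatures, selective disclosure, and anti-replay protection; an HMAC stands in here to keep the example short, and all names are invented.

```python
# Hypothetical sketch: a trusted issuer (e.g. a wallet provider) signs an
# "over 18" claim, and the platform verifies the signature without ever
# seeing the user's identity or date of birth.
import hmac, hashlib, json

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for the issuer's signing key

def issue_attestation(over_18: bool) -> dict:
    """Issuer side: sign a minimal claim containing only the age attribute."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}

def platform_verifies(att: dict) -> bool:
    """Platform side: accept only an untampered claim saying over_18 is true."""
    claim = att["claim"].encode()
    expected = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, att["sig"]):
        return False  # forged or tampered attestation
    return json.loads(att["claim"])["over_18"]

token = issue_attestation(over_18=True)
print(platform_verifies(token))  # access granted, no birth date disclosed
```

The design point the speaker makes is visible in the claim itself: the platform learns a single boolean, nothing more.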
Tsvetelina Penkova: Thank you. Thank you, Pearse. Julia, I would turn back to you if you have any more questions online, and then I would ask you to pass the floor to the audience here if they want to pose any questions to the panel.
Yuliya Morenets: I don’t think we can hear you, you’re on mute. Sorry. Yes. Thank you for that. I was saying that you never know how it goes between on-site and online, but it’s always a success. We actually prepared three cases for you to discuss, cases that came from the youth community. I think we’ll take just one, and I would like to reverse the situation: we spoke a lot about how to protect minors online, but now consider the following case, about the balance between freedom of expression and harm online. A young blogger wrote to us. He reports news daily on a social media platform and is regularly blocked for doing so. He always tries to be as neutral as possible and always checks the news he reports for misinformation, yet the platform tends to block him regularly. So the question is: blocking is a right and an obligation given to the platforms by regulation, yet at the same time it infringes his freedom of expression. I would like to turn to Vlad, who is in the room and comes from the youth community. Vlad, what is your opinion? Then we open the floor to the members of the European Parliament.
Speaker 1: Thank you. Well, overall, I would say that any kind of regulation of a child’s actions online is quite disturbing, because I have some experience: I am originally from Russia, and all the restrictions against civil society, and in fact against any member of the community, start with regulation in the child protection field, I would say, and end up with restrictions against all members of society. And my question here is: why do we no longer believe in the self-regulation that already exists on the platforms? You brought the example of marking AI-created content; on X this is already realized in the form of Community Notes. If there is disinformation on the platform, it can be marked by the platform itself, or community members can add notes to it, so you will know that it is misleading information. I think it is a very easy decision to simply prevent platforms from certain activities or oblige them to follow specific rules; it is the easy way of solving the problem, just restricting them from doing something. There are many other approaches that can be used, for example dialogue and influencing the platforms to behave in a more meaningful and respectful way, and supporting youth in their activities. To answer the question that was raised: of course, platforms should not prevent young people from posting anything, if it does not violate the rules of the platform itself. If it does, then of course the general rules should apply to these bloggers as well.
Speaker 3: If I could just add something on regulation versus self-regulation: at the Canadian IGF a few weeks ago, at our NRI, there was a comment about regulating harmful AI content by requiring platforms to develop AI blockers, similar to advertisement blockers. What if an AI blocker could be invented? It would give users the personal capacity to decide what to view online and preserve freedom of expression, but it would still need to be invented, and I know AI, at least generative AI, is very much in its beginning phases given how fast it is expanding. But I wanted to add that because it was an excellent point from our NRI that I think would be helpful to bring to international conversations: AI blockers as a potential tool for regulating platforms while preserving the personal freedom of expression that users can have in this space.
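The AI blocker idea is mechanically simple once content carries an AI-generated label: it is a client-side filter over the feed, directly analogous to an ad blocker's filter list. A minimal sketch, assuming a hypothetical `ai_generated` flag on feed items (no such standard field exists yet):

```python
# Sketch of the "AI blocker" raised at the Canadian IGF: a client-side filter,
# like an ad blocker, that hides items labeled as AI-generated when the user
# opts out. The field names are invented for illustration.
def filter_feed(items, block_ai: bool):
    """Return only the items the user has chosen to see."""
    if not block_ai:
        return list(items)
    return [item for item in items if not item.get("ai_generated", False)]

feed = [
    {"id": 1, "text": "Morning news roundup", "ai_generated": False},
    {"id": 2, "text": "Synthetic summary",    "ai_generated": True},
]
print([item["id"] for item in filter_feed(feed, block_ai=True)])  # → [1]
```

The sketch also shows why the panel keeps returning to labeling obligations: a user-side blocker can only work if platforms are required to mark synthetic content in the first place.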
Speaker 5: Can I give an example, though, since you mentioned AI blockers? That’s a good idea; I think we should work on it, as we did with ad blockers. But we regulate advertising with rules anyway; we do not rely only on ad blockers, because certain advertising is forbidden in Europe, for instance paid online advertising reaching children, and certain political advertising, where implementation of regulations is ongoing. My point is that we can combine the two dimensions; I don’t see a contradiction. You can have AI blockers, I like the idea, that’s an instrument of freedom, but alongside the already existing ad blockers we do regulate the advertising space. So I think we can do both. And to comment on what you were saying, because this was a large part of the debate when working on the AI Act: should we let the platforms do their best and simply encourage them? But that is what is already happening.
That’s good, that’s the ethical dimension of developing AI tools in this space, but we do not want, at least this was the thinking in Europe, everything to be based on the goodwill of the owners of a few very powerful American or Chinese, or now also some European, companies. Either something is recommended and they can ignore it, or it is law, and we think that on some aspects we need the law, while on others we live with soft regulation. Look at the AI Act: a lot of AI applications are almost unregulated by it, because the AI Act concentrates on high-risk applications and on transparency for generative AI; for much of AI, almost nothing applies beyond the general principles, the idea of an ethical approach to AI that the legislation pushes through. But when we deal with more sensitive areas, we do not want to wait for the CEOs of some big companies to be good; we want them to be obliged to act, based on an evaluation and on our democratic values. So the issue you raised was at the center of the debate around the AI Act, but also the Digital Services Act and others, and we think we need to balance it, because if you only go for voluntary actions you are in an uneven space, where the most powerful players can do some things, also for reputational reasons, while others have problems. Anyway, this is a contribution to the reflections.

Peter King: Just quickly, I don’t know whether I have time. Speaking from Africa and from the global context, for the record, my name is Peter King from Liberia. I feel that we as youth, youth around the world, will be looking for youth-centric internet regulation: regulation that addresses issues in a youth-friendly way, because tomorrow it is the youth that will lead on internet governance, cybersecurity, and data protection.
A very important point is also to ensure a balance between regulation and innovation. That speaks to the fact that regulation should be set at a level that does not harm the youth, because youth is not just a name on the list of multi-stakeholders; it is a stakeholder in the process. So we want to see youth-centric regulation, youth-friendly regulation that ensures the rights of youth, also in cases like that of the blogger who is trying to present issues in his content, in his country context, and is being blocked. What does it profit the world when some groups are denied? And what do we achieve if we cannot take a holistic approach that considers every community member needed in this space? That is my thought, and when I say “we” I am speaking on behalf of the youth, as I am also part of the youth system. We want to see something that involves all. And in some of our countries we do not have policies yet; in Africa our policies are still in draft, and that is a key issue. If they are to be completed, youth voices need to be added in the validation process. I’m sure in Europe most of the policies are already finished, or they are being modified and separate ones brought out to ensure direct controls. That is my point.
Tsvetelina Penkova: Perfect. Thank you, Peter King, for those insights. You wanted to take the floor, and then I would insist, because we’re running out of time, that we give the floor to our audience if they want to pose at least one question. So think of your questions; I’m giving you the floor now. Thank you.
Audience: Thank you. I just wanted to add something about the blocking of some blogs, and also about abusive content on some social media going against what we want for children. This recently happened to me on one of my pages on LinkedIn: I was combating misinformation about education, and the AI blocked it, saying it went against the community guidelines, while it did not. So when we design policy and regulation, we should not rely fully on AI to combat harmful content on the internet. Sometimes you see a piece of content and you know it goes against what children should see, but when you report it, just minutes later you receive a message saying: we have received your report, but this does not go against the community guidelines. Maybe the language used in that content was not English or French or another widely known language; sometimes it is a local language, so they don’t know it and simply say it doesn’t violate anything. So I think it should be mandatory, for all social media and all blogs, that when people report content as going against the community and the AI says it is not, a human being must review that content and decide whether it really goes against the community or not. We should not rely fully on AI, and that would add more transparency to content regulation. Thank you.
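The workflow the speaker asks for, AI triage first but a user appeal forcing human review, is a small change to a moderation pipeline. A minimal sketch, with invented class and field names, showing only the escalation rule:

```python
# Sketch of mandatory human escalation: the AI's "no violation" verdict is
# never final once a user appeals; the item goes to a human review queue.
# The classifier here is a trivial stand-in (it only matches one keyword,
# mirroring the speaker's point that AI fails on less common languages).
from collections import deque

class ModerationPipeline:
    def __init__(self):
        self.human_queue = deque()  # items awaiting a human moderator

    def ai_review(self, report: dict) -> str:
        """Stand-in for an automated classifier."""
        return "violates" if "banned-word" in report["text"] else "no_violation"

    def handle_report(self, report: dict) -> str:
        verdict = self.ai_review(report)
        if verdict == "no_violation" and report.get("appealed"):
            # Mandatory escalation: an appealed AI dismissal must reach a human.
            self.human_queue.append(report)
            return "escalated_to_human"
        return verdict

pipeline = ModerationPipeline()
report = {"text": "harmful post in a local language", "appealed": True}
print(pipeline.handle_report(report))  # → escalated_to_human
```

The design choice is exactly the speaker's: the AI may dismiss a report, but it cannot close an appealed one.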
Yuliya Morenets: Perfect. Another insightful comment here from the room, and now I’m looking at our audience, if anyone wants to ask a question or give a comment to what we’ve heard. Perfect. Thank you. We may have the mics, at least one of them. Can you hear me?
Audience: Okay, thank you. My name is Wouter, I’m from the Netherlands. Thank you for hosting this panel. It’s really nice to be involved as youth, even though I would not consider myself youth anymore; apparently I count here until 35. I think it’s important to consider the effect of this legislation on youth in that sense, but the perspective of youth is also important in making this legislation, because we were born with the internet; we were formed by it. And I think that’s maybe overlooked: we only discuss legislation that may have an influence on youth. So I’m asking the critical question to the members of parliament here: are you equipped enough to bring this youth perspective into the legislative process?
Tsvetelina Penkova: Well, a quick reflection here: it’s probably me and Brando who are borderline, already exiting the proper definition of youth above 35, but we’re close enough. That is of course a joke, but one of the reasons for having these discussions and this debate is precisely to have everyone’s perspective. We do make an effort, although I cannot say for sure that we succeed all the time, to take the viewpoint of all stakeholders, because, as I said when opening the discussion, the perspective of the people closest to the matter is much more important: you understand some of the challenges and difficulties, but you also see some of the opportunities better than we do. So we are trying to do our best, across all the political groups; I think the colleagues here from the different political groups will confirm that effort. Of course, if you don’t think we are doing it enough, you can contact us at any point; our contacts are quite public. I will pass the floor back to Julia now, because I know we are running out of time. Is there another question? Okay, Julia, if you allow me, one more question here from the audience.
Audience: You can hear me, right? Thank you for the opportunity. My name is Chris Junior, and I’m from Zimbabwe. Just a quick rundown: the internet has done a lot of good. Coming from the Global South, during COVID most of our education happened on WhatsApp. That’s what we used for university education, high school education, assignments and all that; everything was done on WhatsApp, because people couldn’t afford your Zoom or your Google Meet. So the internet has been doing a lot of good, depending on the perspective you take. Also, currently there’s a trend in Southern Africa where a lot of people are now content creators, and many people make a living from the internet, from your TikTok, your Facebook, X and all those applications; that is the income they use for their daily lives. And as someone said, we are the young people, we are the internet, we are its users. Even among university students, most use content creation, online trading and other platforms to generate extra revenue for themselves. Interestingly, moving to another point, there was a comment that the EU is setting up policies and so on. I just searched on the internet right now, and the answer was that no one owns the internet; it’s just infrastructure and frameworks. But interestingly enough, it all starts from the United States and Europe, meaning one way or another you are involved, and when you set regulations, by the time they reach what I’ll call the end user in the Global South, that regulation still applies.
So you setting a regulation still makes sense, but at the same time we should also include the serious stakeholders, for example the companies in the porn industry, and the owners of all these multi-stakeholder companies, your TikToks, your Facebook and so on, and engage them. I feel a platform like this one, where actual young people are engaging, should make it possible for young people to engage with Mark Zuckerberg himself, so that in regulating our internet we can identify the things we need and the things we might not need. And because we have a multi-generational panel, it is easier for the older people to share their perspective and for the youngest to share theirs. Lastly, regarding for example the Australian bill regulating internet use for those under 16: as I said, in the Global South we use the internet for our education, because there are people with serious access problems, and even in the bigger cities access to schools can be difficult. For extra lessons and so on, we still use the internet, meaning my little brother, who is starting his secondary education, still needs WhatsApp. It has become a major requirement for school: you need access to WhatsApp and your social media platforms, because they offer a cheaper alternative for engaging with teachers and receiving communication from the schools. Thank you very much.
Speaker 5: Thank you. Thank you for reminding us of that perspective: working together means actually working together, coordinating everything we do in a timely manner, and not post factum, after something has already been established or accepted by certain parts of the globe. Julia, I’m going to pass you the floor now for a few concluding remarks, and then we will have to close this panel. Before we finish, I would ask our fellow participants here to stay a bit after we close so we can have a common group picture all together, for whoever wants to join.
Yuliya Morenets: Thank you, Tsvetelina. I think you just did the great thing I wanted to announce about the photo, thank you so much. What I have to say is that we had an amazing, incredible discussion going on in parallel online; we had around 35 people. So I just wanted to let you know that, beyond the discussion on site, a great discussion was also happening online at the same time, and a number of big questions were asked that we were not able to bring to you, partly because of the headset issue. What people are saying is that we need to have this conversation not just once a year, but throughout the whole year. So we will try to take these comments and suggestions into next year. We’d like to thank, first of all, the European Parliament delegation for agreeing to organise this event and this talk with us. Thank you to all the members who were able to attend. Thank you to Per Stoner for being so supportive over the years and for being present for this conversation. And thank you to all the young leaders: Dana from Canada, Patrick from Liberia, Vlad, from whichever country you are based in now, you’ll tell the audience, and all the other young people I didn’t mention here. Thank you for the great conversation, and thank you to the audience online. A lot clearly remains to be discussed about how to find that balance, and if there is an appetite for the discussion, we will try to find an interesting and challenging format for next year. Thank you so much, and thank you for being with us. With that, stay for the group photo.
Tsvetelina Penkova: Perfect. Thank you. We would also like to thank the audience online from our side here in Riyadh. And if you allow me, I will conclude with the thought with which Dana from Canada opened the discussion: the digital economy will shrink if we don’t have a common approach to regulation. Let’s leave the room with that thought and that understanding. Thank you.
Tsvetelina Penkova
Speech speed
161 words per minute
Speech length
1466 words
Speech time
545 seconds
EU Digital Services Act imposes obligations on platforms to respect users’ rights and protect children
Explanation
The Digital Services Act (DSA) is a regulation that requires online platforms to respect users’ fundamental rights and protect children. It aims to balance protection with digital rights while fostering innovation.
Evidence
The DSA includes specific obligations such as the right to freedom of expression, protection for children, non-discrimination, and protection of personal data.
Major Discussion Point
Regulation of online platforms to protect minors
Agreed with
Pearse O’donohue
Ivars Ijabs
Agreed on
Need for regulation to protect minors online
Differed with
Ivars Ijabs
Speaker 1
Differed on
Approach to regulation
EU proposal on combating child sexual abuse and exploitation online
Explanation
The European Commission proposed a regulation in 2022 to combat child sexual abuse and exploitation online. This regulation provides specific proposals for detecting online sexual abuse related to minors and children.
Evidence
The regulation is currently under discussion between the European Parliament and the Council.
Major Discussion Point
Addressing online harms to youth
Agreed with
Pearse O’donohue
Agreed on
Importance of age verification
Pearse O’donohue
Speech speed
162 words per minute
Speech length
1459 words
Speech time
537 seconds
DSA requires platforms to assess risks to minors and implement mitigation measures
Explanation
The Digital Services Act mandates online platforms and search engines to assess systemic risks arising from their services’ design and use. They must implement effective mitigation measures when risks affecting children are identified.
Evidence
The Commission has started investigations on platforms like TikTok, Facebook, and Instagram regarding their practices affecting minors.
Major Discussion Point
Regulation of online platforms to protect minors
Agreed with
Tsvetelina Penkova
Ivars Ijabs
Agreed on
Need for regulation to protect minors online
Age verification is a critical component being enforced on large online platforms
Explanation
The European Commission is enforcing age verification measures on large online platforms, particularly those providing pornographic content. They are developing a privacy-preserving and interoperable solution for age verification.
Evidence
The Commission has opened cases against platforms like TikTok regarding their age verification practices.
Major Discussion Point
Regulation of online platforms to protect minors
Agreed with
Tsvetelina Penkova
Agreed on
Importance of age verification
Ivars Ijabs
Speech speed
144 words per minute
Speech length
490 words
Speech time
203 seconds
Regulation should balance protection with digital rights and innovation
Explanation
While protecting minors online is crucial, regulations should also consider the learning dimension and digital skills of younger generations. The EU should avoid over-regulating and hampering digital development.
Evidence
The speaker draws parallels between creating safe learning environments in physical and digital spheres.
Major Discussion Point
Regulation of online platforms to protect minors
Agreed with
Tsvetelina Penkova
Pearse O’donohue
Agreed on
Need for regulation to protect minors online
Differed with
Tsvetelina Penkova
Speaker 1
Differed on
Approach to regulation
Speaker 5
Speech speed
135 words per minute
Speech length
1388 words
Speech time
614 seconds
AI Act can help fight cyberbullying by requiring transparency for deepfakes
Explanation
The AI Act can support the fight against cyberbullying by providing more transparency, particularly for deepfakes. It requires labeling or watermarking of AI-generated content to help identify fake material that could be mentally damaging.
Evidence
The speaker gives an example of deepfakes showing minors doing or saying things they didn’t actually do, which could be mentally damaging.
Major Discussion Point
Addressing online harms to youth
Differed with
Audience
Differed on
Role of AI in content moderation
Speaker 3
Speech speed
152 words per minute
Speech length
807 words
Speech time
318 seconds
Need to consider intersectionality in understanding how legislation impacts different youth
Explanation
When developing legislation to protect youth online, it’s important to consider intersectionality. Different social factors like gender, race, and sexuality can affect how a young person experiences online exploitation or harm.
Evidence
The speaker mentions Canada’s Online Harms Act, which includes both child safety and hate speech elements due to the intersectional nature of online harms.
Major Discussion Point
Addressing online harms to youth
EU regulations have a “Brussels effect” influencing policies globally
Explanation
EU regulations on digital issues have a significant impact beyond Europe, influencing policies in other countries. This global influence requires careful consideration and consultation with stakeholders worldwide.
Evidence
The speaker mentions the potential impact of EU policies on Canada’s digital economy.
Major Discussion Point
Global impact of EU internet regulations
Need to consider how EU policies could impact digital economies in other regions
Explanation
When developing digital policies, the EU should consider their potential impact on digital economies in other countries. Poorly implemented regulations could lead to unintended consequences in third countries.
Evidence
The speaker gives an example of how certain regulations could potentially cause businesses to pull out of the Canadian market, shrinking their digital economy.
Major Discussion Point
Global impact of EU internet regulations
Speaker 1
Speech speed
133 words per minute
Speech length
356 words
Speech time
160 seconds
Platforms should not prevent youth from posting content that doesn’t violate rules
Explanation
The speaker argues against excessive regulation of young people’s actions online. They suggest that platforms should not restrict youth from posting content as long as it doesn’t violate the platform’s rules.
Evidence
The speaker mentions their experience from Russia, where restrictions against civil society often start with regulations in the child protection field.
Major Discussion Point
Addressing online harms to youth
Differed with
Tsvetelina Penkova
Ivars Ijabs
Differed on
Approach to regulation
Audience
Speech speed
164 words per minute
Speech length
1072 words
Speech time
390 seconds
Human review is needed, not just AI, when moderating potentially harmful content
Explanation
The audience member argues that platforms should not rely solely on AI for content moderation. They suggest that when AI misjudges content, whether by missing harmful material or wrongly blocking legitimate posts, human review should be mandatory.
Evidence
The speaker shares a personal experience of AI incorrectly blocking content that was combating misinformation on education.
Major Discussion Point
Addressing online harms to youth
Differed with
Speaker 5
Differed on
Role of AI in content moderation
Youth perspective is crucial in developing effective legislation
Explanation
The audience member emphasizes the importance of including the youth perspective in developing internet legislation. They argue that young people, having grown up with the internet, have unique insights that older legislators might lack.
Evidence
The speaker asks whether members of parliament are sufficiently equipped to incorporate the youth perspective into the legislative process.
Major Discussion Point
Regulation of online platforms to protect minors
Internet access and social media are crucial for education in Global South
Explanation
The audience member highlights the importance of internet access and social media platforms for education in developing countries. They argue that restricting access could negatively impact educational opportunities.
Evidence
The speaker mentions that during COVID-19, most university and high school education in Zimbabwe was conducted via WhatsApp.
Major Discussion Point
Global impact of EU internet regulations
Coordination needed between EU, tech companies, and youth globally on internet governance
Explanation
The audience member suggests that there should be platforms where young people can directly engage with tech company leaders and policymakers on internet governance issues. This would ensure that regulations consider the perspectives of all stakeholders.
Major Discussion Point
Global impact of EU internet regulations
Speaker 4
Speech speed
149 words per minute
Speech length
48 words
Speech time
19 seconds
Youth in Africa want “youth-centric” internet regulations
Explanation
The speaker argues for youth-centric internet regulations that are youth-friendly and consider the interests of young people. They emphasize the importance of including youth voices in the policy-making process.
Evidence
The speaker mentions that youth will lead future internet governance and cybersecurity efforts, and that many young people in Africa use the internet for income generation.
Major Discussion Point
Global impact of EU internet regulations
Agreements
Agreement Points
Need for regulation to protect minors online
speakers
Tsvetelina Penkova
Pearse O’Donohue
Ivars Ijabs
arguments
EU Digital Services Act imposes obligations on platforms to respect users’ rights and protect children
DSA requires platforms to assess risks to minors and implement mitigation measures
Regulation should balance protection with digital rights and innovation
summary
There is a consensus on the need for regulation to protect minors online, but with a balance between protection and digital rights.
Importance of age verification
speakers
Pearse O’Donohue
Tsvetelina Penkova
arguments
Age verification is a critical component being enforced on large online platforms
EU proposal on combating child sexual abuse and exploitation online
summary
Speakers agree on the importance of age verification measures to protect minors from inappropriate content.
Similar Viewpoints
Both speakers emphasize the importance of considering diverse youth perspectives in developing internet regulations.
speakers
Speaker 3
Speaker 4
arguments
Need to consider intersectionality in understanding how legislation impacts different youth
Youth in Africa want “youth-centric” internet regulations
Both recognize the global impact of EU regulations and the need for international coordination in internet governance.
speakers
Speaker 3
Audience
arguments
EU regulations have a “Brussels effect” influencing policies globally
Coordination needed between EU, tech companies, and youth globally on internet governance
Unexpected Consensus
Importance of human review in content moderation
speakers
Audience
Pearse O’Donohue
arguments
Human review is needed, not just AI, when moderating potentially harmful content
DSA requires platforms to assess risks to minors and implement mitigation measures
explanation
While the EU official focuses on platform responsibilities, there’s an unexpected alignment with the audience member’s call for human review in content moderation, suggesting a shared concern for effective and fair content moderation practices.
Overall Assessment
Summary
There is general agreement on the need for regulation to protect minors online, the importance of age verification, and the consideration of diverse youth perspectives in policy-making. However, there are differing views on the extent and implementation of these regulations.
Consensus level
Moderate consensus on broad principles, but divergent views on specific implementation strategies. This suggests a need for further dialogue and refinement of policies to address various stakeholder concerns while maintaining the core goal of protecting minors online.
Differences
Different Viewpoints
Approach to regulation
speakers
Tsvetelina Penkova
Ivars Ijabs
Speaker 1
arguments
EU Digital Services Act imposes obligations on platforms to respect users’ rights and protect children
Regulation should balance protection with digital rights and innovation
Platforms should not prevent youth from posting content that doesn’t violate rules
summary
While Penkova emphasizes the need for strict regulation, Ijabs argues for a balanced approach that preserves innovation, and Speaker 1 advocates minimal restrictions on youth content.
Role of AI in content moderation
speakers
Speaker 5
Audience
arguments
AI Act can help fight cyberbullying by requiring transparency for deepfakes
Human review is needed, not just AI, when moderating potentially harmful content
summary
Speaker 5 highlights the potential of AI in combating online harms, while the audience member argues for the necessity of human review in content moderation.
Unexpected Differences
Global impact of EU regulations
speakers
Speaker 3
Speaker 4
arguments
Need to consider how EU policies could impact digital economies in other regions
Youth in Africa want “youth-centric” internet regulations
explanation
While both speakers address global impacts, their perspectives differ unexpectedly. Speaker 3 focuses on potential negative economic impacts, while Speaker 4 emphasizes the need for youth-centric regulations in Africa, highlighting different priorities in different regions.
Overall Assessment
summary
The main areas of disagreement revolve around the extent of regulation, the role of AI in content moderation, and the global impact of EU regulations.
Difference level
The level of disagreement is moderate. While there is a general consensus on the need to protect minors online, speakers differ significantly on implementation strategies and the balance between protection and innovation. These differences have important implications for the development of effective and globally applicable internet regulations.
Partial Agreements
All speakers agree on the need to protect minors online but differ on the extent and method of regulation. Penkova and O’Donohue support stricter measures, while Ijabs emphasizes the need to balance protection with innovation.
speakers
Tsvetelina Penkova
Pearse O’donohue
Ivars Ijabs
arguments
EU Digital Services Act imposes obligations on platforms to respect users’ rights and protect children
DSA requires platforms to assess risks to minors and implement mitigation measures
Regulation should balance protection with digital rights and innovation
Takeaways
Key Takeaways
The EU is developing regulations like the Digital Services Act to protect minors online while balancing digital rights and innovation
Age verification is seen as a critical component for protecting youth from inappropriate content
There is a need to include youth perspectives in developing internet regulations
EU internet regulations have global impact, requiring consideration of effects on other regions
Addressing online harms to youth requires a multifaceted approach, including legislation, platform accountability, and user empowerment
Resolutions and Action Items
The EU will implement a temporary privacy-preserving age verification solution until the EU Digital Identity Wallet is fully operational
The European Commission has opened investigations into several large online platforms regarding their protection of minors
Unresolved Issues
How to effectively balance protection of minors with freedom of expression and innovation online
How to ensure global coordination on internet governance, especially including perspectives from the Global South
How to address the reliance on internet platforms for education and income generation in developing countries while also protecting youth
The appropriate role of AI in content moderation versus human review
Suggested Compromises
Developing AI blockers that users can choose to enable, similar to ad blockers
Combining voluntary actions by platforms with mandatory regulations for sensitive areas
Creating more opportunities for direct dialogue between youth, policymakers, and tech company leaders on internet governance issues
Thought Provoking Comments
Is it acceptable that it’s only stopped in the European Union? Is there a different threshold for the protection of children in other regions?
speaker
Pearse O’Donohue
reason
This question challenges the global implications of EU regulations and raises important ethical considerations about child protection standards across regions.
impact
It prompted reflection on the global impact of EU regulations and the need for international cooperation on child protection online.
We don’t want to exclude children from the positive opportunities of the internet but we do actually want to protect them from this age inappropriate [content].
speaker
Pearse O’Donohue
reason
This comment articulates a key challenge in regulating children’s internet access – balancing protection with opportunity.
impact
It framed much of the subsequent discussion around finding the right balance in regulation approaches.
Why we no longer believe in the self-regulations that already exist on the platforms?
speaker
Vlad
reason
This question challenges the premise of increased regulation and advocates for trusting existing self-regulation mechanisms.
impact
It shifted the discussion to consider the merits of self-regulation vs. government intervention.
What if we could have regulations for harmful content with AI, that platforms have to develop out AI blockers, similar to how you can have advertisement blockers.
speaker
Dana Kramer
reason
This introduces an innovative technical solution to content moderation that balances user choice with platform responsibility.
impact
It sparked discussion of novel technological approaches to addressing online harms while preserving user autonomy.
Are you equipped enough to have this youth perspective in this legislation process?
speaker
Wouter
reason
This question directly challenges policymakers on their ability to represent youth perspectives, highlighting a potential gap in the legislative process.
impact
It prompted reflection on the inclusion of youth voices in policymaking and the importance of diverse perspectives.
The Internet has done a lot of good. Coming from the Global South, during COVID, I think most of our education was on WhatsApp.
speaker
Chris Junior
reason
This comment provides crucial context from the Global South, highlighting how internet regulation can have vastly different impacts across regions.
impact
It broadened the discussion to consider global perspectives and the unintended consequences of regulation in different contexts.
Overall Assessment
These key comments shaped the discussion by broadening its scope from EU-centric policy considerations to global implications, technological innovations, and diverse stakeholder perspectives. They challenged assumptions about the universality of regulatory approaches and highlighted the complexity of balancing protection with opportunity in digital spaces. The discussion evolved from a focus on specific EU regulations to a more nuanced exploration of the global impact of internet governance decisions, the role of self-regulation versus government intervention, and the importance of including youth and Global South perspectives in policymaking processes.
Follow-up Questions
Is it acceptable that TikTok Lite’s rewards program is only stopped in the European Union? Is there a different threshold for the protection of children in other regions?
speaker
Pearse O’Donohue
explanation
This raises important questions about global standards for child protection online and the potential for regional disparities in safeguards.
How can we balance the need for regulation with maintaining a safe learning environment for digitally native youth?
speaker
Ivars Ijabs
explanation
This highlights the challenge of protecting minors while also allowing them to develop digital skills and learn through experience.
Could AI blockers be developed as a tool for regulating platforms while allowing personal freedom of expression?
speaker
Dana Kramer
explanation
This suggests an area for technological development that could provide a balance between platform regulation and user autonomy.
How can we ensure youth-centric internet regulation that considers the perspectives of young people globally?
speaker
Peter King
explanation
This emphasizes the need for inclusive policy-making that incorporates youth voices, particularly from regions where internet governance policies are still developing.
Are European Parliament members sufficiently equipped to incorporate youth perspectives in the legislative process?
speaker
Wouter (audience member)
explanation
This questions the ability of policymakers to fully understand and represent the interests of digital natives in crafting internet regulations.
How can we create platforms for young people to engage directly with tech industry leaders like Mark Zuckerberg in discussions about internet regulation?
speaker
Chris Junior (audience member)
explanation
This suggests a need for more direct dialogue between youth and tech industry decision-makers to inform policy and platform design.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.
Related event
Internet Governance Forum 2024
15 Dec 2024 06:30h - 19 Dec 2024 13:30h
Riyadh, Saudi Arabia and online