Lightning Talk #107 Irish Regulator Builds a Safe and Trusted Online Environment
25 Jun 2025 13:10h - 13:40h
Session at a glance
Summary
This discussion featured John Evans, Digital Services Commissioner at Coimisiún na Meán (Ireland’s media and online safety regulator), presenting how the organization contributes to media safety and aligns with the Global Digital Compact commitments. Evans explained that the regulator, established just over two years ago, has an unusually significant role in European digital regulation because many major tech platforms are headquartered in Ireland. The organization operates under six strategic areas: children’s protection, democracy, consumer trust, diversity and inclusion, culture, and public safety, all of which align closely with Global Digital Compact principles of human rights, internet governance, digital trust, and information integrity.
Evans detailed Ireland’s role as a Digital Services Coordinator under the EU’s Digital Services Act, explaining how this involves complex coordination at international, bilateral, and domestic levels. The regulator handles approximately 80% of complaints against online platforms due to Ireland’s status as their European base. He focused particularly on two strategic areas: democracy and children’s protection. Regarding democracy, Evans described extensive work during Ireland’s election year, including developing candidate protection packs and coordinating with other European regulators to address electoral integrity challenges. For children’s protection, he outlined both content-focused approaches through Ireland’s online safety code and systems-focused measures under the Digital Services Act.
The organization has grown rapidly from 40 to over 200 staff members, with plans to reach 300, demonstrating Ireland’s serious commitment to digital regulation. During the Q&A session, Evans addressed questions about resource allocation, policy implementation challenges, and coordination with other regulators, emphasizing the network-based approach of European digital regulation and Ireland’s responsibility to regulate not just for Irish citizens but for all Europeans.
Keypoints
**Major Discussion Points:**
– **Ireland’s unique regulatory role in Europe**: As home to many major tech companies (15 of 25 very large online platforms), Ireland’s media regulator Coimisiún na Meán has an outsized responsibility, handling approximately 80% of complaints against online platforms across Europe through the Digital Services Act framework.
– **Electoral integrity and democracy protection**: The regulator’s comprehensive approach to safeguarding elections, including developing toolkits with other European coordinators, creating candidate protection packs, and implementing measures to combat disinformation while supporting safe political participation online.
– **Child protection and online safety**: A two-pronged regulatory approach addressing both content (through Ireland’s online safety code prohibiting harmful content like self-harm promotion) and systems (through Digital Services Act provisions requiring platforms to protect minors’ safety, security, and privacy).
– **International coordination and network governance**: The complex web of relationships required for effective digital regulation, including cooperation with European Digital Services Coordinators, domestic agencies, NGOs, and international bodies like the Global Online Safety Regulators Network.
– **Resource allocation and enforcement challenges**: The regulator’s growth from 40 to over 200 staff (targeting 300), prioritization strategies based on risk assessment, and the balance between policy development speed and the urgency of addressing online harms.
**Overall Purpose:**
The discussion was a presentation by Ireland’s Digital Services Commissioner explaining how the country’s media regulator contributes to online safety and democratic values, both domestically and across Europe, followed by a Q&A session addressing practical regulatory challenges and enforcement approaches.
**Overall Tone:**
The tone was professional and informative throughout, with the presenter demonstrating confidence in Ireland’s regulatory approach while acknowledging significant challenges. During the Q&A, the tone became more conversational and collaborative, with the commissioner showing openness to dialogue and willingness to share experiences. There was a notably positive moment when an audience member complimented the regulator’s integrity, contrasting it favorably with Ireland’s data protection regulation, which seemed to energize the discussion around Ireland’s evolving regulatory reputation.
Speakers
– **John Evans**: Digital Services Commissioner at Coimisiún na Meán (Ireland’s media and online safety regulator)
– **Maria Farrell**: Irish citizen, digital and human rights activist
– **Audience**: Multiple audience members asking questions (roles/expertise not specified)
**Additional speakers:**
– **Niamh Hannafin**: Assistant Director for International Affairs at Coimisiún na Meán, Ireland’s media and online safety regulator
– **Paul**: Colleague of John Evans involved with new legislation on the democracy side (specific title not mentioned)
Full session report
# Comprehensive Discussion Report: Ireland’s Digital Services Regulation and the Global Digital Compact
## Overview and Context
This discussion featured John Evans, Digital Services Commissioner at Coimisiún na Meán (the Irish language name for Ireland’s media regulator), presenting how the organisation contributes to media safety and aligns with Global Digital Compact commitments. The session, introduced by Niamh Hannafin and including contributions from Maria Farrell (a fellow Irish citizen) and multiple audience members, provided an in-depth examination of Ireland’s unique role in European digital regulation and the practical challenges of implementing comprehensive platform oversight.
The discussion took place against the backdrop of Ireland’s distinctive position in the European digital landscape, where the country hosts 15 of the 25 very large online platforms regulated under the EU’s Digital Services Act. This geographical concentration of major tech companies has transformed Ireland’s media regulator into a body with responsibilities extending far beyond its national borders, effectively making it a key player in protecting digital rights across the entire European Union.
## Ireland’s Unique Regulatory Position and Organisational Structure
Evans began by explaining the extraordinary scope of Coimisiún na Meán’s responsibilities, emphasising that the organisation, just over two years old, handles approximately 80% of complaints against online platforms across Europe. This disproportionate responsibility stems from Ireland’s status as the European headquarters for major technology companies.
The regulator operates under six strategic areas that align closely with Global Digital Compact principles: children, democracy, consumer protection from exploitation and scams, diversity and inclusion, culture, and public safety. These areas correspond directly to the Compact’s focus on human rights, internet governance, digital trust, and information integrity, demonstrating how national regulatory frameworks can support international digital governance objectives.
The organisation’s rapid expansion reflects Ireland’s commitment to its new regulatory mandate. Evans detailed how the regulator has grown from 40 to just over 200 staff members, with plans to reach 300 employees within another six to nine months. Just over half of current staff support online safety work. This dramatic scaling represents a significant investment in regulatory capacity and signals a transformation from the organisation’s previous incarnation as the Broadcasting Authority of Ireland.
## Electoral Integrity and Democracy Protection
One of the most detailed aspects of Evans’s presentation focused on the regulator’s work protecting democratic processes, particularly during Ireland’s election year. The approach demonstrates the complexity of safeguarding electoral integrity in the digital age, requiring coordination across multiple levels and stakeholders.
The regulator developed comprehensive election guidelines and toolkits in collaboration with other European Digital Services Coordinators. These tools address electoral integrity challenges by requiring platforms to implement specific measures, including elevating official sources of information and limiting the spread of disinformation during critical electoral periods.
A particularly innovative initiative was the creation of candidate protection packs, developed in cooperation with Irish police. These resources help politicians understand how to respond when targeted online during elections, providing practical guidance for maintaining safe participation in public life. Evans noted that the regulator is currently conducting research to evaluate the effectiveness of these packs, indicating a commitment to evidence-based policy development.
The democratic protection work extends beyond individual elections to encompass broader concerns about political participation and media freedom. However, this area also highlighted some implementation challenges, particularly regarding new media privilege rules for journalistic content in platform content moderation systems, where platforms remain uncertain about execution requirements.
## Child Protection and Online Safety Framework
Evans outlined a sophisticated two-dimensional approach to protecting children online, addressing both content-specific harms and systemic platform design issues. This comprehensive framework demonstrates how modern digital regulation must operate across multiple regulatory instruments to achieve effective protection.
The content dimension operates through Ireland’s online safety code, which prohibits platforms from hosting harmful material such as content promoting self-harm or eating disorders. This approach focuses on removing specific types of dangerous content that could directly harm young users.
The systems dimension, implemented through the Digital Services Act, requires platforms to protect minors’ safety, security, and privacy through structural changes to their operations. This includes prohibiting addictive design features targeted at children and implementing robust age verification systems. Evans noted that guidance from the European Commission on Article 28 of the Digital Services Act, which specifically addresses minor protection, will emerge later in the year.
The regulator is pursuing coordinated enforcement actions regarding adult sites, following the European Commission’s investigations into four adult sites. Digital Services Coordinators are examining coordinated action for platforms below the 45 million user threshold, with Coimisiún na Meán serving as vice chair of the working group. Additionally, educational initiatives are being expanded, with the “Rights, Rules and Reporting Online” educational resource distributed to primary schools and plans for cinema-based awareness campaigns targeting parents during summer months, with Evans hoping for “a rainy summer in Ireland as usual” to maximize cinema attendance.
## International Coordination and Network Governance
A significant portion of the discussion addressed the complex web of relationships required for effective digital regulation in an interconnected world. Evans emphasised that the Digital Services Act creates a network-based approach to regulation, moving beyond failed self-regulatory models to establish meaningful coordination between member states and the European Commission.
This network approach involves multiple levels of coordination: international cooperation through bodies like the Global Online Safety Regulators Network, bilateral relationships with other European regulators, and domestic coordination with various agencies and civil society organisations. The complexity of these relationships reflects the inherently cross-border nature of digital platforms and the harms they can facilitate.
Evans highlighted the value of learning from other regulatory approaches, specifically mentioning Australia’s eSafety Commission and the UK’s Ofcom as examples of different models being tested globally. This international perspective suggests that effective digital regulation will emerge through experimentation and knowledge sharing rather than a single prescribed approach.
However, the network model also raises concerns about potential vulnerabilities. An audience member questioned how Ireland would handle coordination with member states that might have weak digital service coordinators or experience rule of law backsliding. Evans responded that the European Commission and Digital Services Board provide protective mechanisms through shared accountability and mutual support, though this remains an area requiring ongoing attention.
## Resource Allocation and Enforcement Challenges
The discussion revealed significant tensions around resource adequacy and prioritisation in digital regulation. An audience member’s direct question about resource allocation prompted Evans to provide detailed insights into how the regulator manages its enormous mandate with finite resources.
The regulator employs a risk-based prioritisation approach, considering factors such as platform reach, user demographics, and past enforcement actions across Europe. This systematic approach attempts to focus regulatory attention on areas where intervention can have the greatest impact on user safety and rights protection.
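To make the triage idea concrete, the sketch below shows one way such a risk-based scoring could be expressed. It is purely illustrative: the factor names, weights, and normalisation are hypothetical and are not drawn from Coimisiún na Meán’s actual methodology, which the session described only in terms of inputs such as platform reach, the share of young users, and takedown orders issued by competent authorities.

```python
from dataclasses import dataclass

@dataclass
class PlatformProfile:
    name: str
    monthly_users_m: float      # reach, in millions of monthly users
    share_young_users: float    # fraction of users who are minors (0-1)
    takedown_orders: int        # orders issued by competent authorities across Europe

def risk_score(p: PlatformProfile) -> float:
    """Hypothetical triage score: larger reach, more young users and a heavier
    enforcement history all push a platform up the priority list."""
    reach = min(p.monthly_users_m / 45.0, 1.0)       # normalise against the 45m VLOP threshold
    youth = p.share_young_users                      # proxy for child-protection risk
    history = min(p.takedown_orders / 100.0, 1.0)    # proxy for trust-and-safety performance
    return 0.4 * reach + 0.4 * youth + 0.2 * history # illustrative weights only

platforms = [
    PlatformProfile("platform_a", monthly_users_m=30, share_young_users=0.35, takedown_orders=60),
    PlatformProfile("platform_b", monthly_users_m=5, share_young_users=0.05, takedown_orders=2),
]
for p in sorted(platforms, key=risk_score, reverse=True):
    print(f"{p.name}: {risk_score(p):.2f}")
```

In practice a regulator would weigh many more service characteristics than this sketch does; the point is only that observable platform attributes can be combined into a ranking that focuses finite regulatory attention.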
However, the scale of the challenge remains substantial. Evans acknowledged the tension between the need for rapid action due to the severity of emerging harms and the time typically required for regulatory frameworks to prove their effectiveness. The resource challenge is compounded by Ireland’s responsibility to regulate not just for Irish citizens but for all Europeans using platforms headquartered in Ireland.
## Implementation Challenges and Practical Concerns
One area of discussion emerged around the gap between policy aspirations and regulatory reality. An audience member from the OECD raised concerns about policy discussions that propose interventions without adequate consideration of enforceability and implementation challenges.
This highlighted tensions in digital governance between the pressure to develop responses to digital harms versus the practical constraints facing regulators who must actually implement and enforce policies. The audience member argued that regulatory expertise is often inadequately integrated into policy development processes.
Evans acknowledged this challenge while defending the approach of working within existing frameworks and learning from implementation experience. He provided context about how internet regulation evolved from a “hands-off” approach to current targeted legislation like the Digital Services Act and Digital Markets Act, suggesting that regulatory frameworks must develop iteratively.
## Ireland’s Regulatory Reputation and Independence
A significant moment in the discussion came when Maria Farrell directly addressed Ireland’s regulatory approach. She said that, among digital and human rights activists across Europe, Coimisiún na Meán has already earned a reputation for “acting with strength and integrity as a regulator,” contrasting this with criticism of Ireland’s data protection regulation and its handling of the tech companies headquartered there.
Evans responded by emphasising the importance of having a clear strategic direction and mission that serves as a “North Star” for the organisation, suggesting that consistent principles help maintain regulatory independence despite changing contexts. He also noted the importance of political support for the regulator’s mandate.
This exchange highlighted the critical importance of regulatory credibility and independence in digital governance, particularly in jurisdictions where economic interests might otherwise influence regulatory effectiveness.
## Key Areas of Discussion and Consensus
The discussion revealed substantial agreement on several key points. All participants acknowledged the enormous scope and complexity of platform regulation, recognising that limited resources require careful prioritisation and strategic thinking about regulatory intervention.
There was strong agreement on the importance of cross-border regulatory coordination, even while acknowledging the challenges this creates when some member states may have weaker regulatory capacity. The network-based approach of the Digital Services Act was generally viewed as a positive development over previous self-regulatory models.
The discussion also revealed agreement on the urgency of regulatory action despite implementation challenges. While participants acknowledged significant constraints in current approaches, there was consensus that waiting for perfect solutions is not viable given the severity of emerging digital harms.
## Ongoing Challenges and Future Considerations
Several significant issues remain challenging, highlighting areas requiring continued attention. The challenge of coordinating with member states that may have weak digital service coordinators represents a potential concern in the European regulatory network.
The implementation of new media privilege rules for journalistic content in platform content moderation remains unclear, with platforms reportedly uncertain about execution requirements. This reflects broader challenges in translating policy objectives into practical platform operations.
Questions about long-term regulatory sustainability and maintaining effectiveness across changing political contexts represent ongoing considerations for digital regulators.
## Conclusion
This discussion provided valuable insights into the practical realities of digital regulation in contemporary Europe. Ireland’s experience as a Digital Services Coordinator demonstrates both the possibilities and constraints facing regulators tasked with protecting digital rights and democratic values.
The conversation revealed that effective digital regulation requires appropriate legal frameworks, adequate resources, political support, international cooperation, and ongoing adaptation to evolving challenges. The emphasis on evidence-based policy development, international cooperation, and maintaining regulatory independence provides a foundation for continued progress in this critical area of governance.
Evans’s presentation and the subsequent discussion highlighted both the significant challenges and the practical approaches being developed to address digital harms while protecting fundamental rights and democratic processes in an interconnected digital environment.
Session transcript
Niamh Hannafin: Hello, hi there. Good afternoon. Thank you very much for attending and thank you to IGF. My name is Niamh Hannafin. I’m Assistant Director for International Affairs at Coimisiún na Meán, Ireland’s newly established media and online safety regulator. I’m very pleased to introduce to you our commissioner, our Digital Services Commissioner, John Evans, who’s going to talk you through the ways in which we are contributing to a healthy media and online landscape in Ireland, but also towards some of the key commitments of the Global Digital Compact. Over to you, John.
John Evans: Thanks, Niamh. Okay. So I can see the slides. So I guess first of all, Coimisiún na Meán, it’s the Irish language for the media regulator. So we’re a new regulator. We’re just over two years old. We have a pretty broad mandate covering online safety through media development with a particular emphasis on Irish culture as well, which is an important part of our identity as an organisation. We have an unusual or sort of an outsized role in the European setting because so many of the large, the very large online platforms, so many of the large big tech companies are established in Ireland. And our mission, let me say a quick word about the companies we regulate. So I mentioned media development, I mentioned online safety, and I mentioned kind of broadcasting regulation. And you can see just quite a variety of recognisable brands in there. It means quite a span of work for the organisation. You’ll see that our mission here particularly recognises the role that the media plays in underpinning fundamental rights and in fostering an open and democratic pluralistic society. Coimisiún na Meán’s vision of a thriving, diverse, creative, safe and trusted media landscape and our strategic direction very closely align with the Global Digital Compact. The Global Digital Compact emphasises human rights, internet governance, digital trust and information integrity. And as you’ll see, as I’m talking through our strategy in a moment and then a few examples, these principles kind of shine through very clearly. So this is like a busy sort of a slide, but what I’ll talk to you very quickly is the six strategic areas or areas of emphasis. Children, so we want a media landscape that upholds the rights, wellbeing and development of children and their safe engagement with content. Democracy, a media landscape that supports democracy, democratic values and underpins civic discourse and reduces the impact of disinformation. We also want a media landscape that consumers can trust where they are protected from exploitation and scams. Diversity and inclusion, a media landscape that promotes the values of justice, equality and diversity. And then finally, culture, a media landscape that is sustainable, pluralistic and participative and that reflects who we are as a society. And again, as an Irish person, our culture is emphasised very importantly in our mission. The last one is public safety. This is kind of a broad one and it captures everything from terrorist content online through to, for example, a response to emergency situations. A word on our regulatory approach. Empowering people and ensuring that they have the tools to understand media, they have information to make decisions, to make good decisions, that’s part of our toolbox. Supporting and developing the Irish media landscape. We see that as very symbiotic between, on the one hand, the cultural aspect but on the other, navigating the online world.
So there, for example, journalism schemes that we would support are a very important part of our toolkit. We are a research and future-focused organisation so it’s important to understand from a market perspective and technology perspective what the future is going to look like and how we can expect things to change and how the regulatory response should adapt as we move on. At the core, however, is holding regulated entities to account. So our role is really moving beyond a self-regulatory model which, in many respects, hasn’t worked. I want to say a bit more about an internet governance ecosystem. So within our delivery tools, if you like, we have included, on the one hand, collaborating for impact and then also influencing the European framework. So what I’m going to talk a little bit about is sort of the C, the coordinator, in the Digital Services Coordinator role. So under the Digital Services Act, Coimisiún na Meán is the Digital Services Coordinator for Ireland. And that C is quite a complicated role, that coordinating role is quite complicated. So if you think at the international level, the fulcrum of what we do is really around participation in the European network along with other Digital Services Coordinators and the European Commission, but also in other media networks supporting regulation. Bilateral relationships with other Digital Services Coordinators across Europe, other media regulators frequently, but other kinds of regulators as well, are very important. The reason for this is that on an operational level, when someone wants to complain about a platform that is established in Ireland, they need to make the complaint to their local Digital Services Coordinator and that’s transmitted to the Irish regulator. So that means that the Irish regulator is responsible for dealing with upwards of 80% of the complaints against online platforms. So that’s a very operational role that we have. And then in the experience sharing space, we try to participate in other kinds of fora that go beyond Europe. So for example, the OECD, some of the UN organizations, but also the Global Online Safety Regulators Network of which we’re a founding member. Then domestically, it’s also reasonably complicated. So we’re not the only competent authority under the Digital Services Act in Ireland. Our Competition and Consumer Protection Commission is also one. So relationships with that organization are very important. Our police service, An Garda Síochána, we do need to cooperate with them as well. They have a role under the Digital Services Act and more broadly as a complementary agency in the online safety space. Other digital regulators. So for example, we have close relationships with our Telecommunications Regulator and importantly our Data Protection Commission with whom we’re also drafting a cooperation agreement. But then there’s a wider set again of agencies that are involved in the different areas of harm that I mentioned, say, or that are addressed by our strategic objectives. So for example, the Electoral Commission in the electoral integrity space, that’s one example. But there are many departments and agencies that fall into this category, from our Department of Health through to our Electoral Commission, and many NGOs. So non-government organizations are also very important in this area. So the coordinating role is really quite a demanding role in terms of internet governance bodies in this space.
Now of the six strategic objectives that I outlined earlier, I want to go into a bit more detail about two of them. They are the democracy one and the children one, okay? And the reason I want to spend a bit more time on these now is because, as I said right at the beginning, we’re a new agency and we’ve been developing our capacity to address these different strategic objectives. Two very important ones right from the outset were democracy and minor protection. So on the democracy side, last year was, you know, it’s often said now it was the year of elections and in Ireland it was no different. So last year we had European parliamentary elections, we had a referendum, we had local elections and we also had a general election, and then actually later this year we will have a presidential election so there’s no let-up. But across Europe, right across Europe, there were European elections obviously and many general elections as well in which we had some role. So we engaged intensively in our own elections but we played a supportive role within the network of digital services coordinators across Europe. So digital regulation, it works best when there is coordination across countries. Elections have become a lightning rod, if you like, for newly created governance structures, for example in the DSA, and DSCs last year and this year worked closely with the European Commission to share best practice, exchange election experiences and collectively solve problems and develop tools. And it’s a really good example of where the Digital Services Act and the network of agencies involved working together in a horizontal way to address problems can be quite effective. So while an election is not an emergency, it does require a degree of agility on the part of regulators, they need to be responsive to changing circumstances. So with the EU, the digital services coordinators have developed a toolkit, if you like, to help address some of the challenges that arise in the context of electoral integrity. So first, early last year, one of the first things that the Digital Services Board, the newly established board under the Digital Services Act, approved was the guidelines, the election guidelines. So the guidelines include measures aimed at platforms, recommendations aimed at platforms for measures which mitigate the risks to electoral integrity, such as elevating official sources of information around electoral processes, demonetizing and limiting the spread of disinformation, labelling political advertising and importantly an onus on platforms to build internal teams that are capable of addressing national and local elections. As most of the, actually 15 of the 25 very large online platforms are based in Ireland, we had to participate, we were privileged to participate in many of the pre-election preparations in different member states. So that involved attending workshops and scenario planning and then roundtables involving whichever local agencies and bodies were in the kind of local electoral ecosystem. So we did one of our own of these and so at that we would have had our Electoral Commission, we would have had the platforms, we would have had some fact-checking agencies, we had also An Garda Síochána very importantly, our police force. There’s a couple of extra points I just want to make in relation to electoral integrity. One, and this is very important I feel, is supporting the safe participation of politicians in public life.
We undertook a specific initiative with our police force, An Garda Síochána, last year to develop a candidate pack. So the candidate pack was developed first for the European elections and local elections and then further enhanced for our general election. The candidate pack was aimed at candidates participating in these elections, so they would know what to do and have to hand quickly information about how to respond when they were targeted, if they were targeted for whatever reason, online. We feel that made a difference but we’re conducting research at the moment to find out exactly how that helped and I think this is an area where we look to develop. The second area that I want to focus on is child rights or minor protection. So children’s rights and their protection online have become an issue of concern worldwide and it’s critical that action is taken. Different regulatory approaches are being tested in different parts of the world; social media bans, for example, are being proposed in several countries. In Ireland and within Europe we see this problem as having two dimensions, first is sort of a content dimension and the second is systems. So the legislative instruments that we have are the Digital Services Act, which we feel addresses principally the systems aspect, and then our online safety code, which comes from the Audiovisual Media Services Directive as part of the transposition of that directive into national law. So our online safety code, it clearly defines and lays out the kind of content that children need to be protected from. So regulated platforms must preclude the uploading and sharing of content that promotes self-harm or suicide, eating and feeding disorders and cyberbullying. It also requires the use of age assurance to ensure that children are not normally exposed to videos that contain pornography or depictions of gross or gratuitous violence. There are also provisions relating to parental controls. On the system side, the Digital Services Act on the other hand, it’s a content neutral instrument and instead it has provisions that mean that platforms need to take appropriate measures to protect the safety, security and privacy of minors, that’s the wording of Article 28 of the Digital Services Act, the safety, security and privacy of minors. How platforms are supposed to go about implementing that article of the DSA will be informed by guidance that the European Commission has recently consulted on and which will emerge later this year. The draft guidelines, you’ll have seen, are quite extensive, and they cover issues related to prohibiting addictive design features, age verification to prevent minors viewing age-inappropriate content, having child accounts or accounts aimed at teenagers set to the highest level of privacy, and recommender systems that do not result in the repeated exposure of content that could pose risks to their safety or security. It’s important to say as well that there are also key enforcement activities already underway, so just a couple of weeks ago the European Commission announced the opening of investigations into four adult sites, so these are very large online platforms and so they fall within the purview of the European Commission, but to complement that action the Digital Services Coordinators have responsibility for below threshold, so these are platforms, including adult platforms of which there are many, that have numbers of users below the 45 million threshold that defines the very large online platform threshold.
So to complement that action that the European Commission is taking, the Digital Services Coordinators across the Member States are looking into a coordinated action to address the similar problems arising on the below threshold adult sites. Coimisiún na Meán is quite active on that, we are the vice chair of the working group of the Digital Services Board that is looking to help develop a coordinated action. But it’s not all just about enforcement as well, aside from our regulatory powers Coimisiún na Meán also supports the rights of children through other initiatives such as raising awareness and media literacy efforts, so last year for example we published the Rights, Rules and Reporting Online educational resource and that has been distributed to primary schools throughout the country and later this year we’re looking to extend that to primary age children as well. Alongside that we will be doing kind of fairly extensive awareness-raising campaigns to support, one, the schools but also parents, and during the summer we’re actually going to run a media campaign in cinemas in the hope that we’ll have a rainy summer in Ireland as usual and parents will take their children to see movies and we’ll get to see that particular advertisement. Just a couple of comments to round up. The challenges that we face as digital regulators, whether it’s promoting children’s welfare or safeguarding democracy, they’re not isolated issues, they’re interconnected challenges that require a coordinated and innovative response that puts fundamental rights at the centre. Ireland’s unique position as home to some of the major tech companies means that we have quite a heavy responsibility but also an opportunity. We’re not just regulating for Ireland in many respects, we’re also regulating for European citizens. The Global Digital Compact and its vision of an inclusive, open, safe and secure digital space is not just an aspiration, it’s a practical framework and it’s reflected very, very clearly in our organisation’s strategic statement. As we look forward, Coimisiún na Meán remains committed to not just regulating the digital future but actively shaping it in service of an open democratic and pluralistic society. The work we do today will determine whether technology serves humanity’s aspirations or undermines them, so I think we are at a critical moment in Europe in particular and we will see whether or not the regulatory measures and systems and frameworks that we put in place, and which are now developing, will have impact. It’s an interesting time. Thank you. Any questions now, if anybody’s interested?
Audience: Thank you very much. It sounds like you have a lot of obligations and a huge task and not to put more pressure on you but we kind of count on you, you know, to take on the battle against the platforms for the rest of us in the EU. How do you make your choices? Do you have enough resources? What is your policy on prioritising with the resources that you have, given all the challenges that there are?
John Evans: Yeah sure, when we started we just had 40 people in the organisation, that was the Broadcasting Authority of Ireland, that was a kind of legacy organisation. They did traditional broadcasting regulation, so to that remit was added the online safety brief, which is huge, right, and we’re now at just over 200 people and we think within another six to nine months we’ll be at 300 people.
About a hundred, just over half I think, of those are kind of in one way or another supporting the online safety side of the work of the organisation. So Ireland has taken the responsibility quite seriously and really put the resources into that and any time we’re asked politically we always get the support that we’re looking for and it is an important mandate and society has rallied around it. We could even see it in the kinds of people that have come to work with us. We had a concern maybe that we might not be able to attract great people because we’re a public service organisation, but people have been really interested in the mandate. Second thing I’d say is that we’re not alone as a regulator here. I do want to emphasise the network nature of the regulatory approach in Europe. So you know my French colleagues often describe the Digital Services Act network approach as a sort of a team, with the European Commission being the captain. Ireland has a very important role to play, not least because so many of the platforms are based here, but we do have the support of the Commission and also other digital services coordinators. On prioritisation we have said publicly and we are developing the mechanisms in the background to see how we can focus our regulatory efforts more precisely. So at the very highest level I articulated, you know, if you just set aside the Irish culture one just for a second, the other five you can kind of invert and think of them as areas of harm. So online hate, the undermining of electoral processes and so on. Those are kind of five high-level areas that we try to focus on. But on top of that we’ve tried to layer, or we’re trying to layer, kind of this what we call this risk-based approach. So what kind of reach does a platform have? Does it, for example, have a lot of young users? If it does then it’s going to move up the rankings in terms of potential risk under the child protection strategic objective. And there’s many of those kinds of characteristics that you can observe, from service characteristics to how many takedown orders, for example, have been issued by competent authorities across Europe against a particular platform. That tells you how a particular platform is setting up its trust and safety business.
Audience: Thanks so much for this interesting overview. I have two questions that are somehow interrelated. And the first one is, so I work at the OECD in a policy space where very often, I think, if we go back to kind of like this power concentration and big tech and platform perspective, very often in the policy discussions there are calls for interventions that are maybe not necessarily enforceable, right. So very often then in this discussion we have people raising concerns about this is not implementable or enforceable from the legal side or from the regulatory side. So my first part of the question would be, do you feel that you, with your expertise and specific background knowledge on how complex these issues are, that this is also taken up on the other side of the spectrum in the policy and regulatory development, whether there’s kind of this interlinkage. And then the second part of the question, which is maybe too specific and please feel free to ignore it if it’s too specific, but I’d be interested to hear specifically your thoughts on this because you also mentioned the elevation of authoritative content in the election context, but now there is also the specific rule on media privileges, so this privilege of journalistic content on platform content moderation, which is also very contested and discussed and it will now be I think applicable as of August, and at least from the platforms that we spoke with they don’t really know how to do it yet, so I wonder if you have already prepared for that, if you already have some approach of how to approach it from a regulatory body and enforcement side.
John Evans: Thanks. Okay, yeah, the second one is quite specific but I’d be happy to talk to you, we’re getting ready as well, so I’d be happy to talk to you afterwards, and I have a colleague here, Paul, who’s kind of involved with some of the new legislation coming down the track on the democracy side, so I’ll be happy to chat. But on the skills, do we have, or does the policy side have, the requisite skills to carry out the mandate, okay, with new recommendations, new approaches being proposed all the time. Sometimes it is hard to keep up. If you cast your mind back 15-20 years, the approach to the regulation of the internet was let’s keep our hands off it for the moment, let’s see how it develops. Gradually, problems started to emerge. Very early it was perceived to be around concentration issues, so competition policy was seen to be maybe an appropriate measure, consumer protection measures to a degree, but it became apparent over a number of years that there were certain characteristics of the platform economy that were unique and were driving different dynamics that the regulatory systems were not capable of handling effectively. So enter the Digital Services Act and the Digital Markets Act in the late teens, and the Digital Services Act and the Digital Markets Act, they’re kind of twins if you like, were trying to address the emerging harms that were becoming apparent. Both of those pieces of legislation were pushed through in really quite a speedy way. European legislation takes time to develop and emerge and I think they stand out as having been done quite quickly. But also I think it’s recognized that they are a first step in developing a comprehensive, efficient, streamlined regulatory process. So you always have this tension between trying to work within the system that you have and people at the same time recommending actually that’s not going to work well, you need to try this. I think we need to try what we have first, see what works, learn from it and develop new things as we mature. But I think the problem is that the harms are perceived as really quite severe and we just don’t have time to wait and see how things mature. If you look back at a different regulatory system, say telecommunications for example, there the framework was developing and evolving over a period of 25 years as competition was embedded in European telecommunications markets. I don’t think we have that same privilege of waiting to see how things develop, we need to move quite quickly in the online safety space, I think. I don’t know if that answers the question, but I’m happy to chat, yeah.
Maria Farrell: Hi, I’m Maria Farrell, I’m a fellow Irish citizen and I have a compliment for you and a question. And the compliment is that amongst other digital and human rights activists around Europe, Coimisiún na Meán already has a reputation of acting with strength and integrity as a regulator, which has been completely lacking in our data protection regulator and how Ireland deals with tax and the tech cos. So you guys are completely changing the narrative on what we can do as a country to actually stand up to our responsibility to regulate these firms that are headquartered in Ireland. My question to you is what are you doing, and what can you do, to ensure that you continue to act with that strength, with that integrity, with that moral courage that says, you know, we are going to stand up to these firms and stand in defence, in active defence, of European democracy?
John Evans: I always give a two-part answer to this. The first is that we’re really quite proud of the strategy document that we put together, we think it’s a pretty solid North Star for us. So we were supposed to refresh and renew these things every two or three years or so. We don’t expect our mission and the strategic objectives to change dramatically over the next while. Those are going to be consistent North Stars, if you like. So we think we have the strategic direction quite well set and embedded in the organisation and we think that will endure. The other side of it is that, just resource-wise, we’re not pushing at closed doors, in the sense that there is an expectation that we will act and we’re happy to do that, you know, we’re happy to do that. But independent regulation is independent regulation, political context changes, but until somebody changes the law we’re going to enforce the law to the best of our ability.
Audience: I’ll try as best as I can to make it short. I was wondering, the DSA being sort of content agnostic, how do you see your role as a regulator in this network of other regulators, especially in relation to member states where there may be rule of law backsliding. So how would you relate as the Irish media commission to member states with perhaps a less strong digital services coordinator, or making decisions that you do not agree with from a rule of law perspective?
John Evans: I think part of the protection against that is the role that the European Commission has to play, but also the Digital Services Board. So we get to hold each other to account but also support each other within that network. And really I think the key pieces, you know, there are very important articles where the digital services coordinators have a shared responsibility with the European Commission, but the piece around the systemic articles, Article 34 and Article 35, those are really the core strategic pieces, the central planks of the Digital Services Act, and I think those are the best protections. We’re happy to chat, yeah, yeah, yeah, can I do another one, yeah? GOSRN, yeah, yeah, yes, we’re part of GOSRN, yeah, yeah. I kind of described participation in the Digital Services Board as strategic and there’s a kind of operational aspect to that. The Global Online Safety Regulators Network, that’s really excellent in trying to understand different regulatory approaches in different countries and what are best practices, because often Australia’s eSafety Commission is, kind of, further along the regulatory path than we are in certain respects, and there’s a lot for us to learn from that. Ofcom is a member of that as well and they’re a well-established, very expert regulator from whom we have a lot to learn, but also we’re happy to share the European experience in that network as well, yeah, yeah. Okay, I’d better go, sorry, but I’m happy to chat. Thanks.
John Evans
Speech speed
143 words per minute
Speech length
4218 words
Speech time
1765 seconds
Ireland’s Role as Digital Services Coordinator and Regulatory Framework
Explanation
Ireland has a disproportionately large responsibility in European digital regulation because many major tech platforms are headquartered there. This means the Irish regulator must handle approximately 80% of all complaints against online platforms across Europe, requiring significant coordination with other regulators and the European Commission.
Evidence
15 of the 25 very large online platforms are based in Ireland; complaints from other EU countries are transmitted to the Irish regulator; Ireland expanded from 40 to over 200 staff with plans to reach 300; the Digital Services Act creates a network approach with Ireland as Digital Services Coordinator
Major discussion point
Ireland’s unique position and responsibility in European digital regulation
Topics
Legal and regulatory | Jurisdiction | Data governance
Electoral Integrity and Democracy Protection
Explanation
Digital Services Coordinators across Europe developed comprehensive guidelines and toolkits to protect electoral integrity during the ‘year of elections.’ These measures require platforms to take specific actions like elevating official information sources, limiting disinformation spread, and building internal teams capable of addressing national elections.
Evidence
Ireland had European parliamentary elections, referendum, local elections, and general election; Digital Services Board approved election guidelines; platforms must elevate official sources, demonetize disinformation, label political advertising; pre-election workshops and scenario planning conducted with Electoral Commission, platforms, fact-checkers, and police
Major discussion point
Coordinated European approach to protecting democratic processes online
Topics
Sociocultural | Content policy | Human rights | Freedom of expression
Agreed with
– Audience
Agreed on
Gap between policy development and regulatory implementation
Child Protection and Online Safety
Explanation
Ireland employs a dual approach to child protection online, addressing both harmful content through specific safety codes and systemic issues through the Digital Services Act. This comprehensive strategy includes both regulatory enforcement and educational initiatives to protect minors from various online harms.
Evidence
Online safety code prohibits content promoting self-harm, suicide, eating disorders, and cyberbullying; requires age assurance for pornographic content; Digital Services Act Article 28 requires platforms to protect safety, security and privacy of minors; coordinated enforcement actions against adult sites; educational resources distributed to primary schools; cinema advertising campaigns planned
Major discussion point
Comprehensive approach to protecting children online through regulation and education
Topics
Human rights | Children rights | Cybersecurity | Child safety online
Regulatory Challenges and Resource Allocation
Explanation
The regulator uses a risk-based approach to prioritize enforcement efforts, considering factors like platform reach, user demographics, and past enforcement history. Unlike traditional regulatory sectors that had decades to develop, online safety regulation must move quickly due to the severity of emerging harms.
Evidence
Five high-level areas of harm identified; risk assessment considers platform characteristics like number of young users and takedown orders issued by authorities; comparison to telecommunications regulation which developed over 25 years; over half of 200+ staff support online safety work
Major discussion point
Balancing limited resources against urgent need for effective platform regulation
Topics
Legal and regulatory | Data governance | Consumer protection
Agreed with
– Audience
Agreed on
Gap between policy development and regulatory implementation
Ireland’s Regulatory Reputation and Independence
Explanation
The regulator maintains independence through consistent strategic direction and strong organizational mission that serves as a North Star regardless of political changes. The commitment is to enforce existing laws to the best of their ability until laws are changed through proper channels.
Evidence
Strategic document serves as North Star; mission and strategic objectives expected to remain consistent; political support consistently provided when requested; independent regulation means enforcing law regardless of political context
Major discussion point
Maintaining regulatory independence and integrity in politically sensitive tech regulation
Topics
Legal and regulatory | Human rights principles | Jurisdiction
Cross-Border Regulatory Coordination
Explanation
Protection against weak regulation in some member states comes through the shared responsibility structure of the Digital Services Act, where the European Commission and Digital Services Board provide mutual accountability. International networks facilitate sharing of regulatory best practices across different jurisdictions.
Evidence
European Commission serves as ‘team captain’ in network approach; Digital Services Board enables regulators to hold each other accountable; Articles 34 and 35 are core strategic pieces of DSA; Global Online Safety Regulators Network shares best practices; Australia’s eSafety Commission and UK’s Ofcom provide regulatory expertise
Major discussion point
Ensuring consistent regulatory standards across jurisdictions with varying capabilities
Topics
Legal and regulatory | Jurisdiction | Human rights principles
Agreed with
– Audience
Agreed on
Importance of cross-border regulatory coordination
Audience
Speech speed
164 words per minute
Speech length
382 words
Speech time
139 seconds
Electoral Integrity and Democracy Protection
Explanation
There are concerns about the gap between policy recommendations and practical enforceability in digital regulation. Many policy discussions propose interventions that may not be legally or technically implementable, raising questions about whether regulatory expertise is adequately considered in policy development.
Evidence
OECD policy discussions often feature calls for interventions that are not necessarily enforceable; concerns raised about implementability from legal and regulatory perspectives; specific mention of media privileges rule for journalistic content that platforms don’t know how to implement
Major discussion point
Gap between policy aspirations and regulatory implementation capabilities
Topics
Legal and regulatory | Jurisdiction | Human rights | Freedom of the press
Agreed with
– John Evans
Agreed on
Gap between policy development and regulatory implementation
Regulatory Challenges and Resource Allocation
Explanation
Questions arise about whether regulators have sufficient resources and capacity to handle the enormous scope of platform regulation, especially given Ireland’s responsibility for regulating on behalf of all EU citizens. There’s concern about the regulator’s ability to make appropriate prioritization choices given limited resources.
Evidence
Recognition of Ireland’s huge task and obligations; acknowledgment that ‘we kind of count on you to take on the battle against the platforms for the rest of us in the EU’; questions about resource adequacy and prioritization policies
Major discussion point
Adequacy of regulatory resources for the scale of platform oversight needed
Topics
Legal and regulatory | Data governance | Consumer protection
Agreed with
– John Evans
Agreed on
Gap between policy development and regulatory implementation
Cross-Border Regulatory Coordination
Explanation
Concerns exist about how to maintain effective coordination when some member states may have weak digital service coordinators or experience rule of law backsliding. This raises questions about the integrity of the network approach when some nodes in the network may be compromised.
Evidence
Specific concern about member states with rule of law backsliding; questions about relating to member states with weak digital service coordinators; concerns about disagreeing with decisions from a rule of law perspective
Major discussion point
Maintaining regulatory network integrity when some member states have governance challenges
Topics
Legal and regulatory | Jurisdiction | Human rights principles
Agreed with
– John Evans
Agreed on
Importance of cross-border regulatory coordination
Maria Farrell
Speech speed
176 words per minute
Speech length
155 words
Speech time
52 seconds
Ireland’s Regulatory Reputation and Independence
Explanation
Ireland’s media regulator has earned recognition among European digital and human rights activists for demonstrating strength and integrity in platform regulation. This represents a significant departure from Ireland’s previous reputation regarding tech company regulation, particularly in data protection and taxation areas.
Evidence
Reputation among digital and human rights activists across Europe for acting with strength and integrity; contrast with criticism of Ireland’s data protection regulator and tax treatment of tech companies; recognition that the regulator is ‘changing the narrative on what we can do as a country’
Major discussion point
Ireland’s transformation from regulatory haven to responsible platform oversight
Topics
Legal and regulatory | Human rights principles | Privacy and data protection
Agreements
Agreement points
Resource adequacy and prioritization challenges in digital regulation
Speakers
– John Evans
– Audience
Arguments
Regulatory Challenges and Resource Allocation
Regulatory Challenges and Resource Allocation
Summary
Both acknowledge the enormous scope and complexity of platform regulation, with limited resources requiring careful prioritization. There’s recognition that Ireland faces a disproportionate responsibility for EU-wide platform oversight.
Topics
Legal and regulatory | Data governance | Consumer protection
Importance of cross-border regulatory coordination
Speakers
– John Evans
– Audience
Arguments
Cross-Border Regulatory Coordination
Cross-Border Regulatory Coordination
Summary
Both recognize the critical need for effective coordination between regulators across jurisdictions, though they acknowledge challenges when some member states may have weaker regulatory capacity or governance issues.
Topics
Legal and regulatory | Jurisdiction | Human rights principles
Gap between policy development and regulatory implementation
Speakers
– John Evans
– Audience
Arguments
Regulatory Challenges and Resource Allocation
Electoral Integrity and Democracy Protection
Summary
Both acknowledge the tension between policy aspirations and practical enforceability. John Evans notes that online safety regulation must move quickly and lacks the luxury of the gradual development that telecommunications regulation enjoyed, while audience members raise concerns about the implementability of policy recommendations.
Topics
Legal and regulatory | Jurisdiction | Human rights
Similar viewpoints
Both recognize and emphasize Ireland’s transformation into a regulator that acts with strength and integrity, representing a significant departure from previous approaches to tech company oversight in Ireland.
Speakers
– John Evans
– Maria Farrell
Arguments
Ireland’s Regulatory Reputation and Independence
Ireland’s Regulatory Reputation and Independence
Topics
Legal and regulatory | Human rights principles | Privacy and data protection
Both acknowledge the complexity of protecting democratic processes online and the challenges of implementing media privileges and content moderation policies, though they approach from different perspectives of implementation versus policy development.
Speakers
– John Evans
– Audience
Arguments
Electoral Integrity and Democracy Protection
Electoral Integrity and Democracy Protection
Topics
Sociocultural | Content policy | Human rights | Freedom of expression | Freedom of the press
Unexpected consensus
Ireland’s regulatory transformation and credibility
Speakers
– John Evans
– Maria Farrell
Arguments
Ireland’s Regulatory Reputation and Independence
Ireland’s Regulatory Reputation and Independence
Explanation
It’s unexpected to see such strong consensus between a regulator and an activist about the regulator’s performance. Maria Farrell’s explicit praise for the regulator’s strength and integrity, contrasted with criticism of other Irish regulatory bodies, suggests genuine recognition of effective regulatory action rather than typical regulatory capture or weakness.
Topics
Legal and regulatory | Human rights principles | Privacy and data protection
Urgency of regulatory action despite implementation challenges
Speakers
– John Evans
– Audience
Arguments
Regulatory Challenges and Resource Allocation
Electoral Integrity and Democracy Protection
Explanation
Despite acknowledging significant implementation challenges and resource constraints, there’s consensus that waiting for perfect solutions is not an option due to the severity of emerging harms. This represents agreement on the need for imperfect but immediate action over delayed comprehensive solutions.
Topics
Legal and regulatory | Human rights | Data governance
Overall assessment
Summary
The discussion reveals strong consensus on the fundamental challenges facing digital regulation: resource constraints, implementation complexity, and the need for cross-border coordination. There’s also unexpected agreement on Ireland’s regulatory transformation and the urgency of action despite imperfect tools.
Consensus level
High level of consensus on challenges and approach, with constructive dialogue rather than adversarial positions. This suggests a mature understanding of regulatory realities and shared commitment to effective platform oversight, which bodes well for continued cooperation and development of regulatory frameworks.
Differences
Different viewpoints
Policy Development vs. Regulatory Implementation Gap
Speakers
– John Evans
– Audience
Arguments
Regulatory Challenges and Resource Allocation – The regulator uses a risk-based approach to prioritize enforcement efforts, considering factors like platform reach, user demographics, and past enforcement history. Unlike traditional regulatory sectors that had decades to develop, online safety regulation must move quickly due to the severity of emerging harms.
Electoral Integrity and Democracy Protection – There are concerns about the gap between policy recommendations and practical enforceability in digital regulation. Many policy discussions propose interventions that may not be legally or technically implementable, raising questions about whether regulatory expertise is adequately considered in policy development.
Summary
John Evans advocates for working within existing regulatory frameworks first and learning from implementation, while the audience member argues that policy development often proposes unenforceable interventions without adequate consideration of regulatory expertise and practical implementation challenges.
Topics
Legal and regulatory | Human rights | Freedom of the press
Unexpected differences
Regulatory Timeline and Urgency
Speakers
– John Evans
– Audience
Arguments
Regulatory Challenges and Resource Allocation – Unlike traditional regulatory sectors that had decades to develop, online safety regulation must move quickly due to the severity of emerging harms.
Electoral Integrity and Democracy Protection – There are concerns about the gap between policy recommendations and practical enforceability in digital regulation.
Explanation
While both parties acknowledge the urgency of digital regulation, they have opposing views on how to balance speed with effectiveness. Evans argues for rapid implementation despite imperfections, while the audience suggests that rushing may lead to unenforceable policies. This disagreement is unexpected because both parties want effective regulation but fundamentally differ on the risk-reward calculation of moving quickly versus ensuring implementability.
Topics
Legal and regulatory | Human rights | Jurisdiction
Overall assessment
Summary
The main areas of disagreement center on the balance between policy ambition and regulatory practicality, resource adequacy for the scale of platform oversight, and the effectiveness of current cross-border coordination mechanisms.
Disagreement level
Moderate disagreement with significant implications. While speakers share common goals of effective platform regulation and protection of democratic values, their different perspectives on implementation approaches could lead to tensions between policy development and regulatory execution. The disagreements suggest a need for better integration between policy-making and regulatory expertise to ensure that ambitious digital governance goals are matched with practical enforcement capabilities.
Takeaways
Key takeaways
Ireland serves as a critical hub for European digital regulation, handling approximately 80% of complaints against online platforms due to many major tech companies being headquartered there
The Digital Services Act creates an effective network-based regulatory approach requiring coordination between member states and the European Commission, moving beyond failed self-regulatory models
Ireland has demonstrated serious commitment to digital regulation by expanding from 40 to over 200 staff members, with plans to reach 300, showing that adequate resourcing is possible when there is political will
A two-dimensional approach to child protection (addressing both content through safety codes and systems through the DSA) provides a comprehensive framework for protecting minors online
Electoral integrity requires coordinated cross-border regulatory response, with tools like election guidelines, candidate support packs, and pre-election scenario planning proving effective
Ireland’s media regulator has established a reputation for acting with strength and integrity, contrasting positively with other Irish regulators’ handling of tech companies
Risk-based prioritization considering platform reach, user demographics, and enforcement history across Europe is essential for effective resource allocation
The urgency of online harms means regulators cannot wait decades for frameworks to mature as was possible with telecommunications regulation
Resolutions and action items
Ireland will continue developing risk-based prioritization mechanisms to focus regulatory efforts more precisely on high-harm areas
Coordinated enforcement actions against adult sites below the 45 million user threshold will be pursued by Digital Services Coordinators across member states
Ireland plans to extend educational resources to primary-age children and to run cinema-based awareness campaigns for parents during the summer
Research will be conducted to evaluate the effectiveness of candidate support packs provided during elections
The regulator committed to ongoing participation in international networks like the Global Online Safety Regulators Network to share best practices
Unresolved issues
How to effectively coordinate with member states that may have weak digital service coordinators or rule of law backsliding issues
The challenge of ensuring policy recommendations are actually enforceable and implementable from a legal/regulatory perspective
Implementation details for the new media privileges rule for journalistic content in platform content moderation, which platforms do not yet know how to execute
Long-term sustainability of regulatory independence and integrity as political contexts change
Whether current regulatory frameworks will prove sufficient or if additional legislative measures will be needed as harms evolve
How to balance the need for quick action on severe harms with the time required for regulatory frameworks to mature and prove effective
Suggested compromises
Using existing Digital Services Act and Digital Markets Act frameworks as a first step while learning and developing new approaches, rather than waiting for perfect solutions
Leveraging the European Commission and Digital Services Board as protective mechanisms against potential regulatory capture or weakness in individual member states
Combining enforcement actions with educational initiatives and media literacy efforts rather than relying solely on punitive measures
Accepting that regulatory frameworks will need to evolve iteratively rather than expecting comprehensive solutions immediately, while still acting urgently on severe harms
Thought provoking comments
How do you make your choices? Do you have enough resources? What is your policy on prioritising with the resources that you have given all the challenges that there are?
Speaker
Audience member (first questioner)
Reason
This question cuts to the heart of regulatory effectiveness by addressing the fundamental challenge of resource allocation in digital regulation. It acknowledges the enormous scope of the regulator’s mandate while recognizing the practical limitations that could undermine their effectiveness.
Impact
This question shifted the discussion from theoretical regulatory frameworks to practical implementation challenges. It prompted Evans to reveal concrete details about organizational growth (from 40 to 200+ people), resource allocation strategies, and the collaborative nature of European digital regulation. It also led him to discuss their risk-based prioritization approach, adding depth to understanding how modern digital regulation actually works in practice.
Very often in policy discussions there are calls for interventions that are maybe not necessarily enforceable… do you feel that you with your expertise and specific background knowledge on how complex these issues are that this is also taken up on the other side of the spectrum in the policy and regulatory development?
Speaker
OECD policy worker
Reason
This comment highlights a critical disconnect between policy aspirations and regulatory reality – the gap between what policymakers want to achieve and what regulators can actually enforce. It introduces the concept of implementability as a key constraint in digital governance.
Impact
This question prompted Evans to provide historical context about internet regulation evolution, explaining how the ‘hands-off’ approach gradually gave way to targeted legislation like the DSA and DMA. It led to a deeper discussion about the tension between the urgency of addressing digital harms and the time needed to develop mature regulatory frameworks, comparing it to the 25-year evolution of telecommunications regulation.
Amongst other digital and human rights activists around Europe, Coimisiún na Meán has already a reputation of acting with strength and integrity as a regulator, which has been completely lacking in our data protection regulator and how Ireland deals with tax and the tech cos… what are you doing and can you do to ensure that you continue to act with that strength, with that integrity, with that moral courage?
Speaker
Maria Farrell
Reason
This comment is particularly insightful because it directly addresses Ireland’s controversial reputation as a ‘regulatory haven’ for tech companies while acknowledging a positive counter-narrative. It raises the fundamental question of regulatory capture and independence in a jurisdiction where major tech companies are headquartered.
Impact
This comment created a moment of validation for the regulator while simultaneously challenging them to maintain their independence. It shifted the conversation toward questions of institutional integrity and political pressure. Evans’ response about having a ‘North Star’ strategy and political support revealed important insights about how regulatory independence can be maintained even in challenging political-economic contexts.
How would you relate, as the Irish media commission, to member states with perhaps a less strong digital service coordinator, or making decisions that you do not agree with from a rule of law perspective?
Speaker
Audience member (final questioner)
Reason
This question introduces the complex geopolitical dimension of digital regulation within the EU, specifically addressing how democratic backsliding in some member states could affect the coordinated regulatory approach that the DSA depends upon. It highlights potential systemic vulnerabilities in the network-based regulatory model.
Impact
While Evans’ response was brief, this question opened up discussion of the safeguards built into the DSA framework, particularly the role of the European Commission and Digital Services Board in maintaining standards across member states. It highlighted the tension between national sovereignty in regulation and the need for consistent enforcement of digital rights across the EU.
Overall assessment
These key comments transformed what could have been a straightforward regulatory presentation into a nuanced exploration of the practical, political, and systemic challenges facing digital governance. The questions moved the discussion from describing regulatory frameworks to examining their real-world implementation challenges, resource constraints, political pressures, and systemic vulnerabilities. Maria Farrell’s comment was particularly impactful in acknowledging Ireland’s unique position and the regulator’s emerging reputation, while the OECD questioner’s focus on enforceability highlighted the gap between policy ambition and regulatory reality. Together, these interventions created a more honest and comprehensive picture of digital regulation as an evolving, resource-constrained, and politically complex endeavor rather than a purely technical exercise.
Follow-up questions
How effective was the candidate pack initiative in supporting politicians’ safe participation in public life during elections?
Speaker
John Evans
Explanation
John Evans mentioned they are conducting research to find out exactly how the candidate pack helped politicians when targeted online, indicating this is an ongoing area of investigation to measure impact and improve future initiatives
How will platforms implement Article 28 of the Digital Services Act regarding protection of minors’ safety, security and privacy?
Speaker
John Evans
Explanation
John Evans noted that guidance from the European Commission on implementing this article will emerge later in the year, suggesting this is an area requiring further clarification and research on practical implementation
How to approach the new media privileges rule for journalistic content on platform content moderation from a regulatory enforcement perspective?
Speaker
OECD audience member
Explanation
The audience member noted that platforms don’t know how to implement this rule yet and asked about the regulatory body’s preparedness, indicating this is an area requiring further research and policy development
How can policy discussions better integrate regulatory expertise to ensure proposed interventions are actually enforceable?
Speaker
OECD audience member
Explanation
The audience member raised concerns about policy calls for interventions that may not be legally or regulatorily enforceable, suggesting need for better interlinkage between policy development and regulatory expertise
How can regulators maintain strength and integrity in the face of changing political contexts while ensuring independent regulation?
Speaker
Maria Farrell
Explanation
This question addresses the critical challenge of maintaining regulatory independence and moral courage over time, which is essential for effective platform regulation
How should digital services coordinators handle situations involving member states with rule of law backsliding?
Speaker
Audience member
Explanation
This question addresses potential conflicts within the European regulatory network when some member states may have compromised rule of law standards, requiring research into governance mechanisms and accountability measures
What are the best practices and different regulatory approaches being tested globally for online safety regulation?
Speaker
John Evans
Explanation
John Evans mentioned the value of learning from other regulators like Australia’s eSafety Commission and Ofcom through the Global Online Safety Regulators Network, indicating ongoing research into comparative regulatory approaches
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.