WS #152 A Competition Rights Approach to Digital Markets
25 Jun 2025 15:45h - 17:00h
Session at a glance
Summary
This workshop at the Internet Governance Forum explored the intersection between competition law and human rights in digital markets, focusing on the European Union’s Digital Markets Act (DMA) and its implications for the Global South. Bruno Carballa from the European Commission explained that the DMA, implemented in 2022-2023, targets “gatekeeper” platforms like Google, Meta, Apple, Amazon, Microsoft, and ByteDance that meet specific size and market position criteria. The regulation imposes obligations such as allowing businesses to operate outside platforms, preventing data combination without consent, ensuring interoperability, and prohibiting self-preferencing of services.
Camila Leite Contri from Brazil’s IDEC argued that economic concentration directly impacts human rights, particularly freedom of expression, citing examples like zero-rating practices that limit platform choices for lower-income users and Google’s interference in Brazil’s fake news bill debate. She emphasized the need to connect competition law with human rights discourse, noting that monopolistic power translates into political influence that affects democratic participation. Hannah Taieb from Speedio discussed how market concentration in digital platforms undermines media diversity and editorial authority, leading to filter bubbles and misinformation spread through opaque algorithms.
The panelists addressed questions about creating alternatives to dominant platforms, particularly in the Global South, suggesting solutions like public digital infrastructure (citing Brazil’s PIX payment system), open-source alternatives, and bolder regulatory approaches. They emphasized the importance of interoperability, algorithm transparency, and unbundling of platform services. The discussion concluded with calls for more interdisciplinary dialogue between human rights advocates and competition law experts to develop comprehensive approaches to platform regulation that protect both economic competition and fundamental rights.
Key points
## Major Discussion Points:
– **Digital Markets Act (DMA) and Economic Regulation**: The European Union’s DMA targets “gatekeeper” platforms (Google, Apple, Meta, etc.) with specific obligations to prevent abuse of market power, including allowing third-party payment systems, data portability, app uninstallation rights, and preventing self-preferencing. While primarily economic in focus, these regulations have indirect human rights implications.
– **Connection Between Economic Concentration and Human Rights**: Panelists explored how monopolistic control of digital platforms directly impacts fundamental rights like freedom of expression, access to information, and democratic participation. Examples included Brazil’s zero-rating practices that limit platform choice for lower-income users and Google’s interference in political discourse during Brazil’s “fake news bill” debate.
– **Global South Challenges and Alternatives**: Discussion of how developing countries like Brazil face infrastructure limitations and dependency on Big Tech platforms, with exploration of potential solutions including public digital infrastructure (like Brazil’s PIX payment system), open-source alternatives, and stronger competition authority powers to consider human rights impacts.
– **Business Models and Ethical Technology**: Examination of how current advertising-dependent models contribute to harmful content amplification and filter bubbles, with proposals for more ethical algorithms, transparent recommendation systems, and alternative monetization models that don’t rely solely on data exploitation and targeted advertising.
– **Regulatory Integration and Cross-Disciplinary Collaboration**: Strong emphasis on the need to bridge competition law, human rights advocacy, and technology policy, with calls for bolder regulatory approaches that consider human rights impacts in antitrust decisions and greater cooperation between different regulatory bodies.
## Overall Purpose:
The discussion aimed to explore the intersection between competition law/antitrust regulation and human rights protection in digital markets, specifically examining how economic concentration of power among Big Tech platforms affects fundamental rights and what regulatory and business model alternatives could better protect both competition and human rights.
## Overall Tone:
The discussion maintained a collaborative and constructive tone throughout, with participants expressing genuine enthusiasm for cross-disciplinary dialogue. The tone was academic yet accessible, with speakers acknowledging their different professional backgrounds while finding common ground. There was an underlying sense of urgency about addressing Big Tech dominance, but the approach remained solution-oriented rather than purely critical. The atmosphere became increasingly optimistic as panelists and audience members, particularly from Brazil’s youth delegation, engaged with concrete examples and potential pathways forward.
Speakers
**Speakers from the provided list:**
– **Raquel da Cruz Lima** – Human rights lawyer from Brazil, works at Article 19 Brazil and South America (human rights organization dedicated to protection of freedom of expression)
– **Bruno Carballa Smichowski** – Research officer at the European Commission’s Joint Research Centre, economist working in the digital markets research team
– **Camila Leite Contri** – Representative of IDEC (Institute for Consumer Defense) in Brazil, has background in competition law
– **Hannah Taieb** – Leading business development for Speedio (now part of Mediagenix), specializes in commercialization of recommendation algorithms, has background in consultancy for public institutions on ethical algorithm implementation
– **Jacques Peglinger** – From business side, teaches digital regulation at a Dutch university
– **Audience** – Multiple audience members including Laura (youth program participant from Brazil), João (youth delegation from Brazil), and Beatriz (assistant professor in law at University of Sussex, UK, teaches Internet law regulation and platform regulation)
**Additional speakers:**
– **Juan David Gutiérrez** – Mentioned as joining online but did not participate in the recorded discussion
Full session report
# Workshop Report: Competition Law and Human Rights in Digital Markets
## Introduction and Context
This workshop at the Internet Governance Forum brought together experts to explore the intersection between competition law and human rights in digital markets, with particular focus on the European Union’s Digital Markets Act (DMA) and implications for the Global South. The panel included Bruno Carballa Smichowski, a research officer at the European Commission’s Joint Research Centre (speaking in personal capacity); Camila Leite Contri from Brazil’s Institute for Consumer Defence (IDEC); Hannah Taieb from Speedio (now part of Mediagenix), specializing in recommendation algorithms for entertainment; and Raquel da Cruz Lima, a human rights lawyer from Article 19 Brazil.
The discussion featured active audience participation, including questions from Brazilian youth delegation members Laura and João, academic expert Beatriz specializing in internet law regulation, and Jacques Peglinger, who teaches digital regulation at a Dutch university.
## The Digital Markets Act: Framework and Implementation
Bruno Carballa Smichowski provided insights into the DMA, emphasizing that his views were personal and not official European Commission positions. The DMA entered into force in November 2022 and became applicable in May 2023, targeting “gatekeeper” platforms that meet specific criteria including revenues or market capitalization above 7.5 billion euros.
Six companies have been designated as gatekeepers: Google (Alphabet), Meta, Apple, Amazon, Microsoft, and ByteDance. These platforms face obligations including allowing businesses to operate outside their platforms, enabling data portability, permitting users to uninstall pre-installed applications, ensuring interoperability with third-party services, and prohibiting self-preferencing.
Enforcement has begun with cases against Apple and Meta. Bruno mentioned fines of 500 million euros for Apple and 200 million euros for Meta, noting that these decisions remain subject to appeal. The DMA coordinates with the Digital Services Act (DSA) through shared procedures while maintaining distinct objectives.
Bruno acknowledged questions about whether generative AI should be included in DMA categories, indicating this remains an evolving area of regulatory consideration.
## Economic Concentration and Human Rights: The Brazilian Perspective
Camila Leite Contri presented research on how digital platforms function as gatekeepers of human rights, with economic concentration directly impacting fundamental freedoms. Her research on how lower socioeconomic classes use the internet in Brazil revealed concerning patterns.
She highlighted zero-rating practices where users with prepaid mobile data plans can access certain platforms—primarily WhatsApp, Facebook, and TikTok—without consuming data allowances. For citizens with limited data budgets (typically four gigabytes monthly), this artificially constrains platform choices, directly impacting freedom of expression and access to diverse information.
Camila provided a striking example of Google’s political interference during Brazil’s “fake news bill” debate. During the crucial voting week, Google displayed a message on its main website, below the search bar, warning that the fake news bill could worsen the internet and increase confusion about what is true and what is false in Brazil. Additionally, searches about the fake news bill returned sponsored links promoting opposition to the “censorship bill.”
She argued for integrating human rights considerations directly into competition law analysis rather than treating them as separate domains, calling for competition authorities to adopt bolder approaches including potential company breakups.
## Media Diversity and Algorithmic Transparency
Hannah Taieb focused on how market concentration undermines media diversity, highlighting the shift from information consumption within defined editorial contexts to algorithmic feeds where logic is invisible and data collection intrusive. She noted that the creator economy and influencer culture increasingly dominate over trained journalism.
Hannah demonstrated that technical solutions exist for ethical approaches to content recommendation that maintain user experience while respecting privacy and providing transparency. She advocated for algorithm pluralism and interoperability as essential for diverse information ecosystems, emphasizing that users should understand recommendation logic and have choices about content curation.
## Public Alternatives and Digital Infrastructure
Bruno highlighted Brazil’s PIX payment system as an exemplary model of public digital infrastructure challenging private platform monopolies. PIX succeeded through public investment combined with mandatory interoperability requirements, forcing all financial institutions to integrate with the system.
This example provided evidence that alternatives to dominant platforms can achieve widespread adoption when properly designed and regulated. Key success factors included government backing, universal interoperability requirements, and user convenience matching existing alternatives.
The discussion extended to other potential public infrastructure areas, including cloud services and social media alternatives, with Bruno suggesting open-source solutions combined with public procurement requirements could promote alternatives across various sectors.
## Cross-Disciplinary Dialogue and Coordination
A recurring theme was the artificial separation between regulatory domains. Camila noted feeling isolated discussing human rights in competition law circles and equally isolated discussing market concentration in human rights spaces.
Raquel reinforced this from a constitutional perspective, arguing that states have duties to consider human rights implications in all regulatory decisions, including competition matters. This provides legal foundation for integrating human rights analysis into economic regulation.
The discussion explored practical coordination mechanisms, including shared procedures between regulatory frameworks and empowering civil society organizations to participate more effectively in market-oriented discussions.
## Global South Perspectives and Challenges
Audience members from Brazil’s youth delegation raised questions about how countries with limited technological infrastructure can develop competitive alternatives. Laura asked about Global South protagonism in platform regulation, while João questioned user incentives for switching platforms despite network effects.
The discussion revealed both challenges and opportunities. Infrastructure limitations create barriers, but examples like PIX demonstrate that strategic public investment with smart regulation can create successful alternatives in developing economies.
Panelists suggested focusing on digital public infrastructure development, supporting open-source alternatives through public procurement, requiring interoperability to reduce platform lock-in, and developing regulatory approaches considering human rights impacts in competition decisions.
## Regulatory Coordination and Implementation
Beatriz, an academic expert in internet law regulation, asked about coordination between different regulatory frameworks. The discussion revealed both opportunities and challenges in aligning competition law, data protection, and human rights approaches.
Bruno discussed coordination between DMA and DSA implementation, while panelists acknowledged that different jurisdictions may need adapted solutions based on legal systems, institutional capacities, and political contexts.
The conversation also addressed Brazil’s “revolving door” problem in cloud services, where officials move between regulatory positions and private companies, potentially creating conflicts of interest in infrastructure decisions.
## Key Areas of Agreement and Tension
Panelists demonstrated consensus that economic concentration has direct human rights implications and that interoperability is crucial for breaking platform monopolies. However, disagreements emerged regarding whether regulatory approaches should maintain primarily economic objectives with indirect human rights benefits, or directly integrate human rights considerations into competition law.
There were also differences regarding intervention intensity, with some emphasizing targeted approaches like the DMA while others advocated for more aggressive interventions including company breakups.
## Questions and Future Directions
The discussion identified ongoing challenges including network effects that maintain user loyalty to dominant platforms despite alternatives, infrastructure development needs in Global South countries, and sustainable funding mechanisms for ethical technology alternatives.
An audience question about generative AI regulation highlighted emerging challenges as technology evolves beyond current regulatory frameworks.
## Conclusion
The workshop successfully demonstrated the value of interdisciplinary dialogue in addressing platform governance challenges. By integrating perspectives from regulation, civil society, business, and academia, the discussion revealed interconnections between economic concentration and human rights while identifying potential solutions.
Concrete examples—from Brazil’s zero-rating practices to Google’s political interference to PIX’s success—grounded theoretical concepts in practical experience. The consensus between different stakeholder perspectives suggests maturing understanding of these challenges that could facilitate more integrated policy approaches.
Raquel concluded by referencing Article 19’s policy paper “Taming the Big Tech,” emphasizing the continued need for coordinated approaches addressing both economic competition and fundamental rights in digital markets.
Session transcript
Raquel da Cruz Lima: Hi, hello, everyone. It’s a great pleasure to welcome you all to this workshop called A Competition Rights Approach to Digital Markets. Before we start, I’d like to invite anyone who would like to join us here at the roundtable. You would have mics, so it makes it easier to ask questions at the end of the session. So please feel free to sit here with us, like André. I would like to thank especially our panelists for being here, first Camila and Hannah, who are here in person, and also Bruno and Juan David, who will be joining us online. Before I give the floor to our panelists, let me introduce myself. My name is Raquel da Cruz Lima. I’m a human rights lawyer from Brazil, and I work at Article 19 Brazil and South America, a human rights organization dedicated to the protection of freedom of expression. From the perspective of freedom of expression, diversity and pluralism are vital. For that reason, human rights bodies, such as the Inter-American Court of Human Rights, have long stated that when the means by which freedom of expression is exercised are owned by monopolies, the circulation of ideas and opinions is limited. Therefore, in order to protect freedom of expression and access to information, states have a duty to prevent excessive concentration. With that in mind, I would like to hear about the objectives of the DMA, how it proposes to address the issue of concentration of power in digital markets, and whether the protection of freedom of expression and other human rights was one of the goals pursued by the DMA. So, Bruno, I would appreciate it if you could start by introducing yourself and let us know a bit more about the DMA. Thank you so much for being here.
Bruno Carballa Smichowski: Hello, thank you very much for the invitation. Hello everyone, I’m Bruno Carballa. I’m a research officer at the European Commission’s Joint Research Centre, which is an institution of the European Commission that does research to support evidence-based policy, including the DMA. I’m an economist working in the digital markets research team. So, I will try to walk you through in a couple of minutes the spirit of the DMA, to explain the Digital Markets Act, also called DMA, and how I think it links to the broader issues being discussed today. So, perhaps a very small disclaimer about what the DMA is. Oh, can you hear me? Yeah, you’re back. Ah, okay, sorry. So, as a first clarification on the Digital Markets Act, or DMA, it’s a regulation that has, let’s say, a purely economic objective, which is precisely to reduce the market power of the so-called gatekeepers. I will come in a second to which platforms are called gatekeepers. But this obviously has, indirectly, an effect on the capacity of these platforms to abuse their power in non-economic ways, which is more the focus of discussion of this forum, such as all sorts of human rights violations. That said, obviously there are other regulations that have a specific target that is non-economic and has more to do with human rights. I’m thinking specifically of the DSA, the Digital Services Act, which is a kind of companion sister regulation that aims to curb issues such as disinformation or discrimination and so on. So, with that said, I’m going to try to walk you through what the spirit of the DMA is, how it works and what the expected effects of this new regulation are. First, in terms of timeline, it’s quite a recent regulation in legal time of application. It entered into force in November 2022 and became really applicable, in terms of articles, in May 2023. So we’re talking about two years now of the DMA, which, given the length of this type of case, is quite young. We’re starting to see, and I will talk about this in a couple of minutes, the first decisions on how companies are or are not following the rules of the DMA. So the idea of the DMA, as I said before, is to curb the power of the so-called gatekeeper platforms. For that, it first defines what the gatekeeper platforms are, using different criteria, both quantitative and qualitative. The first one is that they have to be big platforms, because the aim is not to regulate every single platform on the internet, which would be practically impossible, but those that do have a much stronger impact. In that sense, the first criterion is that these platforms have to have at least 7.5 billion in the last three years of revenues or market cap, to show they have big economic power in terms of size.
They have to be part of one of the so-called core platform services. These are services that are deemed to be particularly important in the digital space: online intermediation, which could be any sort of marketplace, search engines, social networks, video sharing platforms like YouTube, number-independent interpersonal communication services, which are basically messaging apps, virtual assistants, web browsers, operating systems, cloud computing, and online advertising. This is a first list that is going to be revised, and one of the discussions, for example, that is going on right now is whether we should include generative AI, such as GPT and so on, under a new category, or whether it actually fits into one of the existing categories, which is search engines. So these are basically platforms that are in critical areas, that are important in terms of size and therefore potential impact, and that have been in a durable position, meaning that these criteria have been met for at least the last three years, so it is not just by chance or seasonality that they had a lot of users; these platforms have been there, holding power, for at least three years. So what is the aim of this, why this new regulation? Well, the main reason is that the existing regulation, competition law, which is aimed at sanctioning anti-competitive behaviours, actually has, for many different technical reasons, difficulty being applied to certain conducts that are typical of these platforms, and it comes in too slowly. So the idea is to regulate ex ante: before any abuse of power can take place, in the economic sense of the word, to create new rules, new obligations for these platforms so they cannot abuse their position of power. So, once the platforms are designated as gatekeepers, and here you have the usual big platforms that you all have in mind, the designated ones being Alphabet, so that’s the Google conglomerate, Amazon, Apple, ByteDance, which has TikTok, Meta, with all the Facebook family of products, Instagram and so on, and Microsoft, so we’re talking here about, let’s say, the main platforms that have the most power on the internet, these gatekeepers, which have already been designated because they meet the criteria that I was mentioning before, have new obligations they didn’t have until two years ago. These obligations are different ways of trying to make the platforms not abuse their power. The first one is that they have to allow businesses to offer their products and services outside of the platform. There have been many cases where, for example, an app by a small developer, or even big developers, has the issue that it has to go through the app store, which takes a big cut, usually around 30%, and they cannot promote in any way a link to, say, pay outside of the platform, or leave the platform’s system and take business outside. So the platform is kind of abusing the fact that it’s precisely the gatekeeper between people who have phones and people who want to reach apps, because the only way to reach apps is through their store, and they’re using that to, say, extract all this value from the apps.
And that, in turn, obviously will end up not benefiting consumers, because then apps are going to be… And then there are other provisions related to access to data. Business users, meaning, for example, an app developer or a seller on Amazon, usually don’t have access to the data about the people they interact with, which they could use in their daily activity. So the new obligation is that they have to be given access to this data to better compete. A third obligation is to allow users to uninstall pre-installed apps. Many platforms have used the fact that they also run the operating system of a phone, for example, to pre-install apps you cannot uninstall. Safari on Apple is a classic example. You end up using their browser because they put it there and you cannot take it out. Now they’re obliged to allow you to take it out, so, again, there can be more competition and new browsers can come in. And if you want, for example, a privacy-preserving browser like DuckDuckGo, you can download it and even uninstall the other one, so it is not predetermined that you always remain within the ecosystem of the dominant platform. Another obligation is about refraining from combining personal data from different platforms. For example, Google obviously has a lot of different platforms about the same users. They know where you go looking for food in Maps, they know what you look for in the search service, and they can combine that personal data. If you don’t consent to that, they should not be able to do it, and they should not use the power of merging data from many markets so that nobody can challenge them in any market. Because obviously it’s very difficult to replicate the fact that a few gatekeepers have access to multiple sides of our lives as users, or think of this even for business users: people who use the Microsoft suite, who have the cloud and the operating system. They collect all the data, and it’s very difficult for someone from a non-gatekeeper platform to replicate that. And in the same spirit, another important obligation is that of ensuring interoperability for third-party software. That’s a classic problem that a lot of complementors in the ecosystem are facing: because the platforms are not interoperable, they can’t add services on their own. Then, about the advertising market, the same applies to having more transparency about the data and the pricing of advertising, because those markets are very concentrated into basically Google and Facebook, so Alphabet and Meta, which control pretty much the whole value chain of online advertising. So again, it’s very difficult to compete with that, which in turn leads to higher prices for advertising and eventually to higher prices for us consumers of anything that uses advertising online. Finally, other obligations include allowing app developers to use third-party payment systems. In the same spirit of letting them do business outside of the gatekeeper, or not going through the gatekeeper, they should be able to use a payment system other than, for example, Google Pay or Apple Pay. And there is the so-called anti-self-preferencing obligation: when a platform is also a seller on the platform, or has another product, it has an obligation not to push users to use it. So, for example, when you used to look for something on Google, you might find that the link always goes to Google Maps. Now, in Europe, it doesn’t do that anymore because of this.
Or Amazon, which allegedly tries to push its own products, so you end up buying the Amazon Basics on the platform instead of from independent sellers. So these types of things are now being scrutinized by the Commission to make sure the platforms, again, do not use their power as gatekeepers to make competition less fair and therefore harm consumers through less competition. Then, lastly, there are obligations about not preventing users from switching between apps by making it technically difficult to change provider, and about informing the Commission of any potential acquisition that might impact this. So these are the obligations. As you see, they are all aimed at regulating platforms that are big, that have a lot of impact in specific markets that are critical, and the new rules and obligations are asymmetric, in the sense that only these big platforms have them and not the small ones, so that they cannot abuse their power and harm competitors and consumers. And where are we with this now? Again, this is young, it’s only two years, but in these two years we already have four cases open, three against Apple and one against Meta. Against Apple, we have one on this anti-steering or anti-self-preferencing, this idea that the platform might benefit its own products using its power as a gatekeeper, and so far Apple has been hit with a fine of 500 million. This is all public information; you can check the decisions and the whole process. A case has also been opened against Apple on the issue of non-compliance with the choice screen, the idea that to give users other options you should show a choice screen, for example, when you want to open a link: do you want to use Apple’s browser or do you want to use other browsers? And the Commission found they are not being compliant in the way they are implementing this, because they might be trying to trick users into still using their own browser despite the choice screen. Another case being opened against Apple, the third one, is about the specification decisions on connected apps. This is a more technical one, but it is basically about how Apple is implementing interoperability, and the case is about the Commission saying you are not really making this as interoperable as it should be to make it easier for any third party who wants to add products to your ecosystem. And finally, the last ongoing case is one against Meta. It is basically challenging the consent-or-pay model, the idea that you cannot use a product unless you consent to what are deemed abusive terms regarding access to and use of your personal data, because basically the Commission is saying, well, Meta is still not offering a free, equivalent and less data-intensive alternative. So basically it is either you use my product for free and we abuse, let’s say, the data we collect and exploit from you as a user, or you have to pay me. So the Commission is saying there has to be some middle point. And so far, Meta has been fined 200 million on this, and this is obviously under appeal. But as you see, in two years we already have four cases open, and more will probably be opened or scrutinized in the future.
And hopefully, if the application of the DMA is effective, we should be seeing digital markets in which the dominant platforms have less capacity to abuse their gatekeeping power, which should, again, benefit consumers and, in turn, leave the platforms with less power to abuse consumers or users in other, non-economic ways. Thank you very much.
Raquel da Cruz Lima: Thank you so much, Bruno, for bringing this great perspective on the DMA. You were quite clear that the objectives were really related to the economic field, but we heard some concepts that are really close to human rights, such as preventing the platforms from abusing their power, and also the concern about not harming consumers. So I’d like to turn to the Global South and ask you, Camila: hearing about this idea of gatekeepers that the DMA had in mind, do you think that, from the perspective of Brazil or the Global South, we could consider the major digital communication platforms as gatekeepers of human rights? And do you think there is a link between economic concentration and fundamental rights? And also, Camila, if you could start by introducing yourself, I’d appreciate it. Thank you.
Camila Leite Contri: Of course. Thank you so much, Raquel and Article 19. It’s a pleasure to be here on this panel with you. Short answer: yes, and I’ll go for it. But it’s a pleasure to be here representing IDEC. IDEC is the Institute for Consumer Defense. It’s based in Brazil and has more than 35 years of experience in protecting consumers through advocacy, campaigning and strategic litigation, including against big techs. I have a background in competition law as well, so that is a disclaimer. But I have always felt kind of isolated in both fields: in competition law, when you’re talking about human rights in the digital sphere, and in civil society, on the human rights side, when talking within the language of the market. So I would say my personal goal, my personal wish, is to try to connect both fields to answer this question and to have more people breaking this barrier, to understand that, yes, monopoly and competition issues are key to human rights and we should analyze them together. But the reality is that this is, I believe, the only panel at the IGF where we are talking about competition or anti-monopoly, and I don’t say this as a personal criticism, but it shows the need we have to discuss this more. And I think this is a consequence of the pervading narrative that the market is a separate sphere, when we should understand that competition authorities currently have the attribution and the power to consider these kinds of consequences, maybe in a mediate, indirect way. Monopolies, the concentration of economic power, are foundational to most of the issues that we see, not the only ones, but most of the problems. We currently have a society that is tech-mediated, our citizenship is tech-mediated, and I can personally talk about Brazil, sharing some experiences on how Brazilians deal with the internet, especially the lower classes. IDEC has research on how the lower classes use the internet, and in Brazil we still have these zero-rating practices, in which people who use prepaid mobile data, so people who have data caps, mostly use the applications that don’t consume their mobile data cap. We currently have people who have, for example, four gigabytes per month, and WhatsApp, Facebook and TikTok don’t consume their data, so why would someone have an incentive to use another platform? And this matters for how the debate develops, for how people express themselves. So we currently have an issue of concentration of the discourse, of the possibilities for how we can express ourselves, and I think this is one good example. The second thing is the way that we use platforms; it goes beyond not being a choice that we currently have. The platforms are profiting from, sorry, I forgot the word in English, but from political disputes, extremism, and this kind of discourse, and unfortunately this is how it affects freedom of expression. Meanwhile, platforms are profiting from that. So this is very concerning. And the third point, which for me is an example of how economic power also translates into other kinds of power and indirectly or directly affects human rights, is how they also interfere in the political dimension, in the political discourse. And for that I would like to bring a concrete example of how one specific big tech influenced the public discourse in Brazil. But first let me get back to what Bruno said about the DMA.
In Brazil we are currently discussing not a DMA but the possibility of developing a new regulation, or rather a way to improve the attribution of the competition authority to deal with digital markets. And although human rights are not embedded in there, some examples of how the DMA could be interpreted as having good consequences for human rights could also be imported into Brazil and adapted. The limitations on data sharing and the prohibition of the pay-or-consent model, so the prohibition on making people decide whether to have their rights respected or to pay for it, is a good example. The second thing is that creating possibilities for users to choose the platforms they use could also mean having platforms with moderation rules that are less restrictive of freedom of expression and could also promote other rights and, for example, limit misogynistic speech. And the third example, and this is where I will enter into the concrete example from Brazil, is about limitations on self-preferencing. Bruno mentioned the example that in Europe Google cannot, when you search for a place, take you directly to Google Maps, because this is a way of self-preferencing another Google service. And in Brazil we had an interesting case, presented before the competition authority CADE, that was about maybe a political self-preferencing. During the week of the vote on the Brazilian DSA, the Brazilian Digital Services Act, which was publicly called the fake news bill, Google put a message on its main website, below the search bar, saying how the bill, the fake news bill, could worsen your internet; sorry, that the fake news bill could increase the confusion about what is true and what is a lie in Brazil. And this phrase directed users to a blog post saying that this bill could worsen the internet as you know it and could change the internet for the worse. And when you searched for the fake news bill, the first link that appeared was a promoted, sponsored link by Google saying no to the censorship bill. So how can we have a free space for debate when the, I wouldn’t say the only search platform, but basically the only search platform that people use in practice puts this up and changes the whole debate? Is this a free way to interact on platforms? So, trying to move on to what we can do about that: I believe we have a common understanding that this power is exercised in different ways, that economic power can be translated into political power, and that this has consequences for human rights. So what we can do, maybe as civil society, is empower ourselves to also talk in this market language. We need not to… One proposal that Article 19 has put forward, related to taming the Big Tech, is the unbundling of content curation and hosting services. I’m happy to continue talking. Thank you so much.
Raquel da Cruz Lima: Thanks, Camila. This was so great and spoke so close to my heart, because I’m only a human rights lawyer. That’s my only background, and for me everything is new now, discussing competition law and antitrust, and I think what is really powerful about being here at the IGF is exactly that idea of bringing people together from different sectors: the opportunity to talk to business and state, civil society and academia. This is exactly what we need, and I don’t think that we need to go back and forget our backgrounds, but rather put them together and make them more powerful. And I think that something you said was quite important: the idea that businesses and also other authorities are all bound by the constitution. In the human rights field, especially in the international community, we have long discussed the duty of conventionality control by every organ of the state. So whatever their conduct is, they have the duty to take into consideration the international treaties that were ratified by the states. So why are human rights not taken into consideration when competition is discussed, and also when other actors are involved, especially in this field? As you said, when tech is mediating the access to every kind of right, we need human rights to be taken into consideration, and we from human rights backgrounds also need to learn more from business, from competition law and so on. So with that, I’d like to turn to you, Hannah, because, as Camila said, there are discussions in the private sector that have an impact on our rights. And I think you now have a great experience to share with us about what we can expect in an environment with more competition, as Bruno described, what kind of business opportunities can emerge, and how those opportunities may take us to businesses that are more aligned with human rights goals and standards. And also, please, if you can introduce yourself at the beginning. Thank you so much.
Hannah Taieb: Thank you. Thank you very much for having me here. So I’m leading business development for Speedio, which is now part of Mediagenix, a company that is specialized in the commercialization of recommendation algorithms that we want to be ethical, controllable and accessible. We specialize in the entertainment sector, working with players like Claro in Brazil, Sky, DirecTV Latam, Canal+, Globo, TV5, and I also have a background in consultancy for public institutions on how to implement more ethical algorithms globally. So, as you were saying, indeed, private companies, whether solution providers like we are, but also traditional media outlets or even social media platforms, while pursuing, of course, profit and protecting their own interests, also bear the responsibility not only to respect the law, but also to set the standards for ethical and transparent AI, especially in media and entertainment. Our influence, shall I say, goes beyond business operations. These companies help shape the very technologies through which millions of people engage with culture and information every day. So the way those companies, including us, do business directly affects our rights, and I mean our rights as civil society: to information, free speech, media freedom and privacy, which are, I think, the human rights we are discussing today. One trend we can observe is a rise in the distribution of information without anchors, which brings media fragmentation and a decline of editorial authority. As we all know, content and information now circulate primarily through social media, which is leading to, indeed, a monopoly of big tech on the distribution of information. The presentation of this content is governed by algorithms that remain opaque and inaccessible to most users, and personalization is done, as you know, largely by collecting private data with little or no regard for actual transparency or contextual understanding. This lack of contextual framing contributes to the spread of misinformation, weakens the audience’s ability to detect bias and undermines the visibility of sources. Earlier generations encountered information within a very well-defined context, in environments such as newspapers, like reading the New York Times or watching the BBC, which implies assumptions about style, tone and political orientation, which in other words we can call context. But today, for many young users, the only information they encounter comes through feeds, where the logic is invisible and intrusive. Media become more ambient and anonymous. The user is exposed, but not oriented by an editorial line. I think here we are all familiar with the theory of the filter bubble, and its impact on democracy over the past 10 years no longer needs to be proven. It affects public discourse, political life and access to shared truth. From another angle, traditional media organizations are increasingly burdened by economic pressures and the difficulty of achieving profitability. This undermines their position and their capacity to provide quality information. Many of the traditional media outlets that we are navigating towards today are financed either by wealthy owners or by public funding.
Even media platforms that in the past were thriving on the attention economy, relying on advertising, are today facing financial difficulties, because everyone, and the advertisers’ budgets, are going towards individuals, and by individuals I mean influencers, and the creator economy has become a dominant force. The data is here, and at the same time the branded content of the past decade has blurred the line between advertising and journalism. I think these shifts raise important concerns about access to reliable information, especially given the monopoly of social media we have today. Of course, we have seen the introduction of various international regulations designed to address this issue. So the question that we might raise today is: is this an opportunity to rethink business models in support of human rights? Individuals are gaining traction over institutions as established media outlets face mounting difficulties in reinventing themselves and preserving their relevance. Of course, this transformation is not necessarily negative, because social networks have allowed new, less dominant voices to emerge and have given visibility to creators who produce original and sometimes very relevant work. However, with lower entry barriers, the distinction between influence and expertise becomes blurred, and in many cases creators with little journalistic background attract more attention, both economically and in terms of audience, than professionals who are trained to verify and contextualize information. I’m not even mentioning the rise of Gen AI, which will, of course, add more and more non-verified content to the already massive ocean of content that we have today. So, to summarize, facing this abundance of content, it becomes essential to imagine new economic models that are ethical, content-centric, less dependent on advertising revenue, and designed to restore clarity and control to both users and producers. But there are some promising directions, of course. First, for the traditional media outlets, calling them that in opposition to social media, what we believe should be suggested is to push for voluntary, proprietary platforms and sovereign algorithms. What it means is that, when it comes to preserving access to information, one strategy is to support or develop independent platforms that blend algorithmic curation and editorial supervision.
Raquel da Cruz Lima: Perfect, that’s so great and so powerful, and an important idea to keep in mind. And something else you mentioned, Hannah, I think is important not only for digital markets but for the media as a whole. I heard a lot yesterday and the day before about trust. And I think about when you mentioned that users have to know the logic, that it must be explained to them what is there. At least in Brazil, that also applies to traditional media, because often the positions of the traditional news outlets are not quite clear. They also do not make transparent to us users why we read different stories from that perspective. So I think transparency is always a key issue in building trust and enabling freedom of expression and access to information. Right now we should have our fourth panelist, but he couldn’t join us. So I would open the floor now for any questions or interventions you’d like to make. We have around 12 minutes, so it’s actually quite a good time to hear from you, online and also here. You can talk from the mic or come to the round table, and please introduce yourselves when you’re making your question.
Audience: It’s working? Yes. Can you… it’s a silent section, we can hear you. I’m Laura, I’m in the youth program, I’m from Brazil too, and I loved what we discussed here, your panel was amazing. But I wanted to know, in a competition scenario, how the Global South could increase its protagonism when we don’t have the infrastructure to have our own means. You have a monopolization from Google, from Meta; we could start our own social media, our own platforms, we can have them, but, as you said, Google is the main one used. How do we get some protagonism in this? Thank you.
Raquel da Cruz Lima: You can go.
Audience: Thank you for this workshop, it was really interesting. I’m João, I’m from Brazil too, I’m in the youth delegation. I would like to ask, especially thinking about the DMA in the European Union: we see some changes, for example in the App Store, in iOS in general, like there are alternative marketplaces, and when we look at the European Union we see alternatives being created. But I would like to ask how to overcome obstacles regarding incentives for users, because although alternatives might be available, like, in opposition to big techs, how can incentives to use big tech services be diminished or overcome in a context where it’s sometimes easier to use big tech services or platforms? These regulations, especially in the European Union, try to deconstruct that and try to change the institutional arrangement, but how, in practice, can people feel incentivized not to use big tech platforms and services? Thank you.
Raquel da Cruz Lima: Thank you. You can go.
Jacques Peglinger: My name is Jacques Peglinger. I’m… A bit louder, please. My name… My name is… Thank you. Good. You can go. No? Yes, you can go, please. My name is Jacques Peglinger. I’m from the business side, but I’m also teaching digital regulation at a Dutch university. So my question is primarily for the first speaker, who elaborated very well on the Digital Markets Act from the EU. But what we see in Europe, of course, are these very fragmented local markets. The DMA basically addresses European-wide big platforms, but what about the local champions? And then the question is, how is Brazil handling local champions, and are there any, or are they really just nationwide platforms? Thank you.
Raquel da Cruz Lima: I’m going to pass it over to Beatriz.
Audience: Thank you. Is it okay?
Raquel da Cruz Lima: Yes, please.
Audience: Hi, my name is Beatriz, I’m also from Brazil, but I’m currently an assistant professor in law at the University of Sussex in the UK, and one of the things I teach is Internet law regulation, platform regulation. I’m interested to hear from the panel what you see, from the perspective of the government and from the regulatory perspective as well, regarding the need to empower organisations, human rights organisations, people involved in platform governance more broadly, to join the conversation on these more economic-oriented or market-oriented aspects of regulation. I’m curious to hear from the panel, and maybe Bruno, but I know other members of the panel have also been studying that. Let’s start with you and then, Ritja. Informing the adjudicators is part of the market-oriented conversation, but I do believe that connecting these fields is really about engagement and a more holistic conversation about how to regulate platforms, not only from this market or economic perspective, but also from the perspective of data protection. So, let’s start with you, Ritja. You mentioned data protection and the GDPR, and there is some, at least in Europe, relevant case law about how considerations of data protection could inform and kind of delimit the boundary between what is acceptable and what becomes anti-competitive behaviour. Do you see these issues being given equal or equitable status as well, or are they just used to help draw the boundaries in terms of abuse of dominance in competition? Are there limits to the decision-making process? Thank you. And I think, more broadly, this would also help to counter some narratives we see that there is a conflict between the two. When the digital markets bill in the US was being proposed, there was a debate among some academics that breaking up the digital public sphere into small players would make it harder to control in terms of, I don’t know, hate speech or platform regulation models. So this relationship between market structure and how to hold platforms accountable is not an easy one to tackle. But I would say that it’s important for regulators as well to have this perspective of how the things join together. So, yeah, I’m curious to hear from you. How do you see that?
Raquel da Cruz Lima: Great. I think we don’t have any other questions, so I’ll just add to that before I give the floor back to our panelists. The first question I would add is for all of you: do you see any priorities in terms of regulation now to increase competition and also make the market more respectful of human rights? And the second question, I think, is more directed to Hannah and Bruno. Hannah mentioned a bit about advertising, and I would like to know if, from a European perspective, you see any changes already, because we also have concentration in the advertising market. So do you see any changes in breaking up the advertising market a bit as well and making it more aligned with human rights? So I think we start with Bruno. We have around seven minutes for each of you to answer the questions and also make your closing remarks. So Bruno, you can start, please.
Bruno Carballa Smichowski: Thank you very much. Lots of good questions that I’m going to try to squeeze answers to into a short time. Again, this will be pretty much my own personal opinion, and not an official Commission one. So perhaps starting with the first question about alternatives, my personal view is that there is no magic one-size-fits-all solution to this, especially, let’s say, for countries like Brazil. I am myself Argentinian as well, so I understand, let’s say, where you’re coming from. I think different ingredients can be added to alternatives. One is that, for certain more infrastructure-like parts of the digital world, some public alternatives, and I think Brazil is quite exemplary with PIX in this case, can counterbalance market power in a very strong way. But of course, these have to come with a proper regulation that makes them a real alternative, as actually happened in the case of PIX. We were discussing this recently in a workshop in Rio with people from CGI.br and the Ministry of Finance. And when I asked them, why do you think it was a success, this public alternative, they told me, well, basically because we forced all the companies, digital and non-digital, to be interoperable with PIX. Obviously, the technical solution had to be good, useful, practical, and I think the big tech are already very good at doing this, and the public sector could replicate it from a technical point of view. But it also came with a good regulation that made it a real competitive constraint on any other service. So people can use whatever service they want, but the fact that this one exists and is interoperable with all of them gives much less power to any platform that could have imposed itself as the gatekeeper of digital payments. So that’s one part of the solution. I see this in some areas, not in all; I don’t think you can do this for social media in a very effective way, for example, but for certain public infrastructure layers. We can think of the same in terms of cloud, in terms of certain parts of the digital chain. Then, let’s say for the government itself, at least for critical things, I think open source alternatives are to be promoted. For instance, to me it should be clear that government offices should be using open source by default, and this could be transmitted to public procurement requirements, for example. And then for those things that, from an economic point of view, don’t make much sense to be, let’s say, publicly owned, there is good regulation. I think that’s what we’re experimenting with. The DMA perhaps was the first one, but I see Brazil is making nice advances in that respect, and with regulation of all sorts, I’m talking not only about the economic one. Here I think there is a trial and error going on throughout the world. Perhaps the European Union was the first one, but we’re seeing the UK, where very similar legislation was already put in place, Australia is discussing the same, and many other countries are following. So I think here we’ll have a nice laboratory of what works and what doesn’t. And in that way, perhaps being second movers is an advantage for countries like Brazil, because they can already learn from the mistakes we will make for sure in the European Union. About local champions, I think you’re right. Indeed, the DMA, not by design, contrary to the DSA, the Digital Services Act, doesn’t require that the platforms are active in many local markets, in many countries, to be more specific.
But obviously, given the thresholds of size and the particularities of digital markets, it ends up usually being European-wide types of platforms, or even international ones. That doesn’t mean there couldn’t be any fostering of local champions. There’s a parallel discussion going on, as you may have seen, in the world but also in the European Union, about industrial policy and digital industrial policies. For example, for AI there’s a battery of new legislation, and the pipelines and strategic plans are already in place for how you could foster those champions in the AI chain. So I see those two as complementary types of regulation: industrial policy on one side, and regulation of the existing big ones on the other. Again, it is more of a practicality point of view. Why not at each market level? Because it takes a lot of time and effort to regulate, so you’ve got to aim for those that have the higher impact, which end up being those that are very international. Then, Beatriz’s question on regulation. Nice to see you again, Beatriz. About the dialogue between types of regulation, it’s actually something that is happening from the inside already, for example between the DMA and the DSA. The DSA is about systemic risks like misinformation, and some platforms are obviously regulated by both regulations. And I think, indeed, there’s a dialogue in two ways. One is in terms of procedures. For example, the DSA, in terms of procedures, is very similar to competition cases. Actually, colleagues from DG COMP are helping colleagues in DG CONNECT with how to carry out these investigations from a procedural point of view, although the object is very different: it’s not economic, but about fundamental rights. The procedure is very similar, so I think there’s a lot to learn from the longer experience of competition law. And vice versa, it goes both ways. I foresee, and already see from some colleagues’ work, cross-fertilization in terms of the methods. For example, self-preferencing, which we mentioned already in two presentations, is a classic example, because the way you could monitor self-preferencing from an algorithmic point of view could be enhanced by the techniques that the colleagues working on the DSA are developing to monitor harm to users. So I see a cross-fertilization between those types of regulations in both methods and procedures. And then at the more, let’s say, political level, coming back to the first question, I think there’s everything to gain between different jurisdictions in learning from the different institutional designs. Obviously, in the Commission they ended up saying the DSA should be one piece of legislation and the DMA should be another. At the beginning they were actually thinking of making one same big regulation; then they said, let’s go for one that is economic and one that is about fundamental rights, although they overlap in the types of platforms they are going to regulate. But that’s an institutional design choice. It could be the case that some other jurisdiction decides to put them under the same umbrella, and that wouldn’t necessarily be bad. And I think there’s everything to gain in dialogue between jurisdictions about the institutional design and what worked, what didn’t work, what we can learn from previous mistakes and previous successes. On advertising, to finish on this, I think it’s still too soon to tell, because the decisions in advertising are still ongoing.
These are highly technical matters and they require time, just like competition law cases. In my personal opinion, regulation obviously came too late, in the sense that the whole advertising chain, as put in a very nice report by the CMA when they did a market study on this, is highly concentrated in two firms, and that is a problem. But at this stage, what we can aim for is good regulation. And if anything, for harms that are non-economic, I think this is where the DSA, the Digital Services Act, should kick in, in the sense that if those targeted advertisements, for instance, promote eating disorders to minors, because that's a…
Raquel da Cruz Lima: Thank you, Bruno. Just a small footnote for everyone here who is not from Brazil or not so familiar with Brazil: Bruno mentioned an important experience from Brazil, PIX, which is a payment method, P-I-X. Also just mentioning this for our reporters who are not from Brazil. So now, Camila, you have the floor, you have seven minutes.
Camila Leite Contri: Thank you. First, on the questions, uniting both Laura's and João's: I'm happy to see people interested in this issue, come join us in these discussions. I think you brought a good example of how network effects work in practice. We are on these platforms because our friends use them, and because there is content created for us. How can we let go of this if everyone is there? It seems like a chicken-or-egg problem. Laura was asking how we can move to alternatives when we don't have these alternatives and everyone is on the incumbent platforms. This is challenging, but having alternatives can at least make people think more about the possibilities they could have; otherwise we remain enclosed in this kind of platform. There is also literacy work, digital literacy work, that we have to do. But on the questions related to alternatives, there are some things we can do right now and some alternatives we can still promote in the longer term. The first thing competition authorities could do is adopt bolder theories of harm. To use competition law jargon, theories of harm are basically how competition authorities frame and judge a competition case. And breaking up companies: as we see that these companies have an immeasurable impact on our lives, maybe the solution is that they don't have to be that big. That is why I praise the solution presented by Article 19 on unbundling hosting and curation services, for example. Another concrete example is the judgment I mentioned on data protection and competition law in the EU. In this Facebook case, the decision was made by the German competition authority and the case went up to the European Court of Justice. The judgment of the European Court of Justice was about whether the competition authority in Germany could treat a data protection violation as a competition breach, and it gave good parameters for how authorities can consider a breach of another law within their own attributions. In this case, the solution was that the competition authority has to check whether the data protection authority has reached a similar decision. If yes, it cannot depart from that decision, but it can still draw its own competition law conclusions. If there is no similar decision by the data protection authority, it can consult them and seek cooperation, and if they raise no objection, it can continue with its own case. So yes, we are thinking about data protection and competition. Why can't we think the same way about human rights? Why can't we understand the human rights impact and bring it into competition authorities? But this demands bold public servants. So Bruno, I know you are in the European Commission; I'm really happy we can share this panel, and I see your willingness to have these discussions with us. I hope other authorities, such as the Brazilian ones, have the same openness, and I do believe they will.
Raquel da Cruz Lima: Thank you, Camila, you were so precise with your time. Now, Hannah Taieb, if you can give your answers and closing remarks, please.
Hannah Taieb: Sure. First, to jump on what Camila was saying: technically speaking, from an algorithm and personalization standpoint, doing personalization the way it's done on social media, not for advertising but for the user experience, meaning having a personalized feed on whatever social media we're using, is absolutely possible while respecting the GDPR and still having a very good user experience. The idea that many big tech and big social media platforms have integrated, that you need sensitive data, gender, age, any demographic, in order to have a good user experience, is actually not true. They rely on that for advertising, not for the user experience. So from a private-sector point of view, I think regulation on that could be a bit stronger, and it would probably still not harm some parts of the business, especially if we're looking for a more virtuous way of monetizing media anyway. Relying heavily on advertising the way it's done today of course brings other problems, such as the openness of platforms, because, let's be realistic, most of those platforms are free because they rely on advertising. So not that subscription or shared participation is an ideal solution, but it's an avenue for reflection, for sure. Then the question is also what we're looking for: are we looking for information, or for interacting with our friends? Platforms that try to combine the best experience for everything in one place are probably not viable in the future. For instance, when talking about Brazil not having its own infrastructure, there are also layers between choosing a big provider like Google and other options for the big media companies. I'm thinking of Globo, for instance, which I would call a local champion. In the way Globo decides to push its information to its users on GloboPlay, it could rely on the Google algorithm, or it could rely on a proprietary algorithm, brought either by small vendors like us or developed in-house. But for that, of course, you need subsidies, both for small, more ethical tech vendors and for the media outlet itself. I also think there is room to incentivize private companies to do more open source, because today, honestly, it's very complicated for smaller vendors, and vendors that care about ethics, to create innovative open source solutions that scale. To do that, there should be an incentive in terms of regulation or subsidies. I don't have the answers, I'm not a regulator myself, but for now it's just a matter of goodwill, and I think that's not enough to encourage it. This is what I think could be interesting for innovation at scale. And then, in terms of advertising, of course the market is consolidating, and I think we're still watching the decline of cookies and still looking forward to a new way of doing contextual advertising, meaning a proper way of explaining why an ad is suggested to a user, and technology will help with that, on the same principle as explaining why a piece of content is pushed to a user.
But still, I think today it's not enough, and as long as the model relies on advertising, it will be very complicated to fight that kind of lobbying from advertisers without killing the advertising market and the media market. I think we still have a lot to do before that. So hopefully, yes, I'm in favour of stronger regulation on that part.
Raquel da Cruz Lima: Thank you. I'll give the floor back to Camila; each of you will have one minute for a closing remark. You can start, Camila.
Camila Leite Contri: Thank you. Just one thing I wanted to react to, related to what Hannah and Bruno said about clouds. In Brazil we are still very dependent on big tech clouds, and this is also a matter of data sovereignty. So Brazil should focus on this and also pay attention to revolving doors, because, for example, in the public health sector in Brazil we had a case in which a person working in the Brazilian government went to the cloud company and then came back to the government, which can raise concerns about how we create alternatives. So my final point would be to have this kind of discussion in Brazil about funding alternatives, digital public infrastructures for example, and about how we can create alternatives from small companies but also from the public sector, beyond regulation, of course. It was a pleasure to be here. I'm very excited and happy to continue these kinds of discussions. Thank you so much.
Raquel da Cruz Lima: Thank you, Camila. Bruno, would you like to say some final words?
Bruno Carballa Smichowski: Not much, just that I would like to repeat the words about the interest in dialogue across both disciplines and jurisdictions. I'm very happy to continue. Thank you very much, thank you everyone for your attention, and thank you Article 19 for the invitation.
Raquel da Cruz Lima: Thank you. Hannah?
Hannah Taieb: Just to add one thing: pushing a little harder on interoperability and algorithm pluralism would, I think, be great in order to have a better distribution of information, and the technical solutions, again, are there. It's not a matter of technicalities, of open APIs or whatever you call it. It's a matter of regulation and of goodwill from big tech to do it. And if we have to count on that goodwill, well, I think you know.
Raquel da Cruz Lima: Perfect, brilliant. And just to finish, I would like to invite you all to read our policy paper by Article 19. It's called Taming the Big Tech, and we have a Portuguese version for everyone who comes from Brazil; it's hosted on our website. As Camila briefly mentioned, we explore the idea of unbundling, in social media, the services of hosting and curation. That would also be made possible by more interoperability, and it would help create incentives for users to leave the big platforms, and for business as well: there could be other business models working with curation and offering other kinds of standards for how we interact with our friends and the content that we see, with more transparency. So you can check it out on our website. And thank you all so much. I think we can end with this idea of being a bit radical, a bit more bold: maybe we can tackle the power of big tech and have a more diverse Internet. Thank you all so much for joining us today.
Bruno Carballa Smichowski
Speech speed
157 words per minute
Speech length
3648 words
Speech time
1387 seconds
DMA targets gatekeepers with economic objectives to reduce market power and prevent abuse
Explanation
The Digital Markets Act is a regulation with pure economic objectives aimed at reducing the market power of so-called gatekeepers. While it has indirect effects on platforms’ capacity to abuse power in non-economic ways like human rights violations, its primary focus is economic regulation of dominant platforms.
Evidence
DMA entered into force in November 2022, became applicable in May 2023, and targets platforms with at least €7.5 billion in annual EU turnover in each of the last three financial years or a market capitalisation of at least €75 billion, operating in core platform services like search engines, social networks, messaging apps, etc.
Major discussion point
Digital Markets Act (DMA) and Competition Regulation
Topics
Economic | Legal and regulatory
Agreed with
– Camila Leite Contri
– Raquel da Cruz Lima
Agreed on
Economic concentration directly impacts human rights and requires integrated regulatory approaches
Disagreed with
– Camila Leite Contri
– Raquel da Cruz Lima
Disagreed on
Primary regulatory approach – economic vs. human rights focus
DMA creates new obligations for platforms including allowing business outside platforms, data access, app uninstallation, and interoperability
Explanation
The DMA imposes asymmetric obligations only on large gatekeeper platforms to prevent abuse of their dominant position. These include allowing businesses to operate outside the platform ecosystem, providing data access to business users, and ensuring technical interoperability with third-party services.
Evidence
Examples include allowing app developers to promote payment outside app stores (avoiding 30% cuts), mandatory data sharing with business users, allowing uninstallation of pre-installed apps like Safari, prohibiting combination of personal data across platforms without consent, and ensuring interoperability for third-party software
Major discussion point
Digital Markets Act (DMA) and Competition Regulation
Topics
Economic | Legal and regulatory | Human rights
Agreed with
– Hannah Taieb
Agreed on
Interoperability is crucial for creating competitive alternatives to dominant platforms
Disagreed with
– Camila Leite Contri
Disagreed on
Scope of regulatory intervention – targeted vs. comprehensive approach
Four enforcement cases already opened against Apple and Meta with significant fines imposed
Explanation
Despite being only two years old, the DMA has already resulted in active enforcement with multiple cases opened against major platforms. The European Commission has imposed substantial financial penalties for non-compliance with DMA obligations.
Evidence
Three cases against Apple (anti-steering/self-preferencing with 500 million fine, choice screen non-compliance, interoperability issues) and one against Meta (consent or pay model with 200 million fine)
Major discussion point
Digital Markets Act (DMA) and Competition Regulation
Topics
Economic | Legal and regulatory
Brazil’s PIX payment system demonstrates successful public alternative through mandatory interoperability requirements
Explanation
PIX serves as an exemplary case of how public digital infrastructure can effectively counterbalance market power when combined with proper regulation. The success came from forcing all companies to be interoperable with the public payment system, creating a real competitive constraint.
Evidence
PIX forced all digital and non-digital companies to be interoperable, providing a technical solution that was good, useful, and practical while being backed by regulation that made it a real alternative to private payment gatekeepers
Major discussion point
Alternative Platforms and Market Solutions
Topics
Economic | Infrastructure | Development
Agreed with
– Hannah Taieb
Agreed on
Interoperability is crucial for creating competitive alternatives to dominant platforms
Disagreed with
– Hannah Taieb
Disagreed on
Role of public vs. private solutions in addressing platform dominance
Open source solutions and public procurement requirements can promote alternatives to big tech dominance
Explanation
Governments can promote alternatives to big tech dominance by defaulting to open source solutions in government offices and extending these requirements to public procurement. This approach can help reduce dependency on proprietary platforms in critical infrastructure.
Evidence
Government offices should use open source by default and extend this requirement to public procurement
Major discussion point
Alternative Platforms and Market Solutions
Topics
Economic | Legal and regulatory | Infrastructure
Agreed with
– Hannah Taieb
Agreed on
Technical solutions exist for ethical platform alternatives but require regulatory support
Cross-fertilization between DMA and DSA regulations through shared procedures and monitoring techniques
Explanation
The DMA and DSA regulations complement each other through shared procedural approaches and cross-learning between economic and fundamental rights enforcement. Competition law experience helps inform DSA procedures, while DSA algorithmic monitoring techniques can enhance DMA enforcement.
Evidence
DSA procedures are similar to competition cases, with DG COMP colleagues helping DG CONNECT colleagues in investigations. Self-preferencing monitoring can be enhanced by DSA techniques for monitoring user harm
Major discussion point
Regulatory Coordination and Enforcement
Topics
Legal and regulatory | Human rights
Agreed with
– Camila Leite Contri
Agreed on
Cross-regulatory coordination between different legal frameworks is necessary
Camila Leite Contri
Speech speed
143 words per minute
Speech length
1800 words
Speech time
752 seconds
Brazil is discussing similar digital market regulations adapted to local context
Explanation
Brazil is currently discussing improvements to competition authority powers to deal with digital markets, though not exactly replicating the DMA. The discussion focuses on adapting successful DMA elements like data sharing limitations and consent-or-pay prohibitions to the Brazilian context.
Evidence
Brazil is developing ways to improve competition authority attribution for digital markets, potentially importing DMA concepts like limitations on data sharing and prohibition on pay-or-consent models
Major discussion point
Digital Markets Act (DMA) and Competition Regulation
Topics
Economic | Legal and regulatory
Digital platforms act as gatekeepers of human rights, with economic concentration directly impacting fundamental rights
Explanation
Major digital communication platforms function as gatekeepers of human rights because economic concentration in digital markets directly affects fundamental rights access. The concentration of economic power translates into control over how people exercise their rights in tech-mediated society.
Evidence
Society is tech-mediated, citizenship is tech-mediated, and monopolies/concentration of economic power are foundational to most digital rights issues
Major discussion point
Connection Between Economic Power and Human Rights
Topics
Human rights | Economic
Agreed with
– Bruno Carballa Smichowski
– Raquel da Cruz Lima
Agreed on
Economic concentration directly impacts human rights and requires integrated regulatory approaches
Disagreed with
– Bruno Carballa Smichowski
– Raquel da Cruz Lima
Disagreed on
Primary regulatory approach – economic vs. human rights focus
Zero rating practices in Brazil limit platform choice for lower-income users, concentrating discourse possibilities
Explanation
Zero rating practices in Brazil create artificial incentives for lower-income users to use only certain platforms, effectively concentrating discourse and limiting freedom of expression. Users with data caps naturally gravitate toward platforms that don’t consume their limited data allowance.
Evidence
IDEC research shows people with prepaid mobile data (4GB monthly caps) primarily use WhatsApp, Facebook, and TikTok because these do not count against their mobile data caps, creating disincentives to use alternative platforms
Major discussion point
Connection Between Economic Power and Human Rights
Topics
Human rights | Economic | Development
Google’s political interference during Brazil’s fake news bill debate demonstrates how economic power translates to political influence
Explanation
Google’s intervention during Brazil’s Digital Services Act debate exemplifies how dominant platforms use their gatekeeper position to influence political discourse. By placing anti-bill messaging on their main search page and promoting sponsored links, Google shaped public debate on legislation that would regulate their own conduct.
Evidence
During the fake news bill vote, Google placed ‘How can the fake news bill worsen your internet?’ on their main page, directed users to anti-bill blog posts, and promoted ‘no to censorship bill’ sponsored links in search results
Major discussion point
Connection Between Economic Power and Human Rights
Topics
Human rights | Economic | Legal and regulatory
Human rights considerations should be integrated into competition law analysis and enforcement
Explanation
Competition authorities should adopt bolder theories of harm that incorporate human rights impacts, similar to how data protection violations can inform competition cases. This requires breaking down silos between different regulatory fields and recognizing their interconnected nature.
Evidence
EU Facebook case where German Competition Authority could consider data protection violations as competition breaches, with parameters for cooperation between different regulatory authorities
Major discussion point
Connection Between Economic Power and Human Rights
Topics
Human rights | Legal and regulatory | Economic
Agreed with
– Bruno Carballa Smichowski
Agreed on
Cross-regulatory coordination between different legal frameworks is necessary
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Explanation
Given the immeasurable impact of big tech platforms on people’s lives, competition authorities should develop more aggressive enforcement approaches, including company breakups. Current theories of harm are insufficient to address the scale of platform dominance and its societal effects.
Evidence
Platforms have an immeasurable impact on lives, and breaking up companies could be a solution; Article 19’s unbundling proposal for hosting and curation services is cited as an example
Major discussion point
Regulatory Coordination and Enforcement
Topics
Legal and regulatory | Economic
Disagreed with
– Bruno Carballa Smichowski
Disagreed on
Scope of regulatory intervention – targeted vs. comprehensive approach
Data protection violations can inform competition law enforcement as demonstrated in EU Facebook case
Explanation
The European Court of Justice established parameters for how competition authorities can consider data protection breaches within their competition analysis. This creates a framework for cross-regulatory enforcement that could extend to other areas like human rights.
Evidence
German Competition Authority case against Facebook went to ECJ, which ruled that competition authorities can interpret data protection violations as competition breaches, with specific cooperation procedures between different regulatory authorities
Major discussion point
Regulatory Coordination and Enforcement
Topics
Legal and regulatory | Human rights | Economic
Agreed with
– Bruno Carballa Smichowski
Agreed on
Cross-regulatory coordination between different legal frameworks is necessary
Hannah Taieb
Speech speed
136 words per minute
Speech length
1676 words
Speech time
736 seconds
Private companies can develop ethical, controllable recommendation algorithms while maintaining good user experience
Explanation
Companies like Speedio demonstrate that it’s possible to create recommendation algorithms that are ethical, controllable, and accessible while still providing good user experience. This challenges the narrative that effective personalization requires extensive data collection or unethical practices.
Evidence
Speedio specializes in commercialization of ethical recommendation algorithms, working with players like Claro Brazil, Sky, DirecTV, Canal+, Globo, TV5
Major discussion point
Alternative Platforms and Market Solutions
Topics
Economic | Human rights | Sociocultural
Agreed with
– Bruno Carballa Smichowski
Agreed on
Technical solutions exist for ethical platform alternatives but require regulatory support
Disagreed with
– Bruno Carballa Smichowski
Disagreed on
Role of public vs. private solutions in addressing platform dominance
Algorithm opacity and lack of contextual framing contributes to misinformation and undermines source visibility
Explanation
The shift from traditional media with clear editorial context to algorithm-driven feeds without transparent logic creates an environment where users cannot properly evaluate information sources. This lack of contextual understanding weakens users’ ability to detect bias and contributes to misinformation spread.
Evidence
Earlier generations encountered information with well-defined context (New York Times, BBC with known editorial lines), while today’s users get information through feeds where logic is invisible and intrusive, making media more ambient and anonymous
Major discussion point
Media, Information, and Algorithmic Transparency
Topics
Sociocultural | Human rights | Legal and regulatory
Traditional media faces economic pressures while creator economy and influencers gain dominance over trained journalists
Explanation
Traditional media organizations struggle with profitability as advertising budgets shift toward individual creators and influencers. This transformation raises concerns about access to reliable information, as creators with little journalistic background often receive more attention than trained professionals who verify and contextualize information.
Evidence
Traditional media increasingly financed by wealthy owners or public funding; advertisers’ budgets going toward influencers; creator economy becoming dominant force; branded content blurring lines between advertising and journalism
Major discussion point
Media, Information, and Algorithmic Transparency
Topics
Economic | Sociocultural | Human rights
Technical solutions exist for personalization without sensitive data collection, but stronger regulation needed
Explanation
From a technical standpoint, effective personalization and good user experience can be achieved while respecting GDPR and without using sensitive demographic data. The current reliance on extensive data collection is driven by advertising needs rather than user experience requirements.
Evidence
Personalization for user experience (not advertising) can be done while respecting GDPR; big tech integration of sensitive data for good user experience is not technically necessary – it’s for advertising purposes
Major discussion point
Alternative Platforms and Market Solutions
Topics
Human rights | Legal and regulatory | Economic
Agreed with
– Bruno Carballa Smichowski
Agreed on
Technical solutions exist for ethical platform alternatives but require regulatory support
Transparency in algorithmic logic essential for building user trust and enabling informed choices
Explanation
Users need to understand the logic behind algorithmic recommendations to make informed decisions about their media consumption. This transparency is crucial for building trust and enabling users to choose platforms that align with their values and needs.
Major discussion point
Media, Information, and Algorithmic Transparency
Topics
Human rights | Sociocultural | Legal and regulatory
Interoperability and algorithm pluralism needed for better information distribution
Explanation
Forcing greater interoperability and promoting algorithm pluralism would improve information distribution and reduce platform monopolization. The technical solutions exist, but implementation requires regulatory intervention and willingness from big tech companies to comply.
Evidence
Technical solutions exist for interoperability and open APIs; it’s not a matter of technicalities but of regulation and goodwill from big tech
Major discussion point
Media, Information, and Algorithmic Transparency
Topics
Legal and regulatory | Infrastructure | Human rights
Agreed with
– Bruno Carballa Smichowski
Agreed on
Interoperability is crucial for creating competitive alternatives to dominant platforms
Jacques Peglinger
Speech speed
124 words per minute
Speech length
118 words
Speech time
57 seconds
Local champions and national platforms need different regulatory approaches than international gatekeepers
Explanation
The DMA addresses European-wide big platforms but doesn’t adequately address local champions that may dominate specific national markets. This raises questions about how different jurisdictions should handle platforms that are dominant locally but don’t meet international gatekeeper thresholds.
Evidence
DMA targets European-wide platforms due to size thresholds, but fragmented local markets may have local champions that need different regulatory treatment
Major discussion point
Digital Markets Act (DMA) and Competition Regulation
Topics
Economic | Legal and regulatory
Raquel da Cruz Lima
Speech speed
156 words per minute
Speech length
1450 words
Speech time
555 seconds
States have constitutional duty to consider human rights in all regulatory decisions including competition matters
Explanation
All state authorities, including competition regulators, have a constitutional obligation to consider international human rights treaties in their decision-making processes. This duty of conventionality control means human rights should be integrated into competition law analysis and enforcement.
Evidence
The international community has long discussed the duty of conventionality control by every organ of the state; whatever their conduct, state authorities have a duty to consider the international treaties ratified by the state
Major discussion point
Connection Between Economic Power and Human Rights
Topics
Human rights | Legal and regulatory
Agreed with
– Bruno Carballa Smichowski
– Camila Leite Contri
Agreed on
Economic concentration directly impacts human rights and requires integrated regulatory approaches
Disagreed with
– Bruno Carballa Smichowski
– Camila Leite Contri
Disagreed on
Primary regulatory approach – economic vs. human rights focus
Audience
Speech speed
139 words per minute
Speech length
680 words
Speech time
291 seconds
Infrastructure development and funding for digital public alternatives essential for Global South countries
Explanation
Global South countries face challenges in developing protagonism in digital markets due to lack of infrastructure to create their own platforms and services. Even when alternatives exist, the dominance of platforms like Google makes it difficult to gain traction without adequate infrastructure support.
Evidence
Question from Laura about how Global South can increase protagonism when lacking infrastructure for own social media/platforms while facing monopolization from Google and Meta
Major discussion point
Alternative Platforms and Market Solutions
Topics
Development | Infrastructure | Economic
Users need incentives and alternatives to reduce dependence on big tech platforms despite network effects
Explanation
Even when regulations create alternatives to big tech services, users face practical challenges in switching due to network effects and convenience factors. Overcoming these obstacles requires addressing both institutional arrangements and practical incentives for users to adopt alternative platforms.
Evidence
Question from João about how to overcome obstacles regarding incentives to users, noting that although alternatives might be available, it’s sometimes easier to use big tech services
Major discussion point
Alternative Platforms and Market Solutions
Topics
Economic | Sociocultural | Human rights
Civil society organizations need empowerment to participate in market-oriented regulatory discussions
Explanation
There’s a need to empower human rights organizations and civil society groups to engage meaningfully in market-oriented aspects of platform regulation. This requires bridging the gap between economic regulation and human rights advocacy to create more holistic platform governance approaches.
Evidence
Question from Beatriz about empowering organizations to join economic-oriented regulation conversations and connecting platform governance perspectives beyond just market/economic focus
Major discussion point
Regulatory Coordination and Enforcement
Topics
Human rights | Legal and regulatory | Economic
Dialogue between different regulatory disciplines and jurisdictions essential for effective platform governance
Explanation
Effective platform regulation requires coordination between different regulatory approaches (data protection, competition, human rights) and learning between jurisdictions. This interdisciplinary dialogue is crucial for addressing the complex challenges posed by platform dominance.
Evidence
Discussion about connecting data protection considerations with competition law, and learning between different jurisdictional approaches to platform regulation
Major discussion point
Regulatory Coordination and Enforcement
Topics
Legal and regulatory | Human rights | Economic
Agreements
Agreement points
Economic concentration directly impacts human rights and requires integrated regulatory approaches
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
– Raquel da Cruz Lima
Arguments
DMA targets gatekeepers with economic objectives to reduce market power and prevent abuse
Digital platforms act as gatekeepers of human rights, with economic concentration directly impacting fundamental rights
States have constitutional duty to consider human rights in all regulatory decisions including competition matters
Summary
All speakers agree that economic power concentration in digital markets has direct implications for human rights, and that regulatory approaches should acknowledge this connection even when primarily focused on economic objectives
Topics
Human rights | Economic | Legal and regulatory
Technical solutions exist for ethical platform alternatives but require regulatory support
Speakers
– Bruno Carballa Smichowski
– Hannah Taieb
Arguments
Open source solutions and public procurement requirements can promote alternatives to big tech dominance
Private companies can develop ethical, controllable recommendation algorithms while maintaining good user experience
Technical solutions exist for personalization without sensitive data collection, but stronger regulation needed
Summary
Both speakers acknowledge that technical solutions for more ethical platform alternatives already exist, but successful implementation requires supportive regulatory frameworks and policy interventions
Topics
Economic | Legal and regulatory | Infrastructure
Interoperability is crucial for creating competitive alternatives to dominant platforms
Speakers
– Bruno Carballa Smichowski
– Hannah Taieb
Arguments
Brazil’s PIX payment system demonstrates successful public alternative through mandatory interoperability requirements
DMA creates new obligations for platforms including allowing business outside platforms, data access, app uninstallation, and interoperability
Interoperability and algorithm pluralism needed for better information distribution
Summary
Both speakers emphasize that mandatory interoperability requirements are essential for breaking platform monopolies and creating viable alternatives, as demonstrated by successful cases like Brazil’s PIX system
Topics
Economic | Infrastructure | Legal and regulatory
Cross-regulatory coordination between different legal frameworks is necessary
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
Arguments
Cross-fertilization between DMA and DSA regulations through shared procedures and monitoring techniques
Human rights considerations should be integrated into competition law analysis and enforcement
Data protection violations can inform competition law enforcement as demonstrated in EU Facebook case
Summary
Both speakers advocate for breaking down regulatory silos and creating coordination mechanisms between different legal frameworks (competition, data protection, human rights) to address platform dominance comprehensively
Topics
Legal and regulatory | Human rights | Economic
Similar viewpoints
Current regulatory approaches are insufficient and need to be more aggressive, while also ensuring broader participation from civil society in shaping these approaches
Speakers
– Camila Leite Contri
– Audience
Arguments
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Civil society organizations need empowerment to participate in market-oriented regulatory discussions
Topics
Legal and regulatory | Economic | Human rights
Platform opacity and algorithmic control enable manipulation of information and political discourse, demonstrating how technical design choices have political consequences
Speakers
– Hannah Taieb
– Camila Leite Contri
Arguments
Algorithm opacity and lack of contextual framing contributes to misinformation and undermines source visibility
Google’s political interference during Brazil’s fake news bill debate demonstrates how economic power translates to political influence
Topics
Human rights | Sociocultural | Economic
Public digital infrastructure can effectively compete with private platforms when properly designed and regulated, and this approach is particularly important for Global South development
Speakers
– Bruno Carballa Smichowski
– Audience
Arguments
Brazil’s PIX payment system demonstrates successful public alternative through mandatory interoperability requirements
Infrastructure development and funding for digital public alternatives essential for Global South countries
Topics
Development | Infrastructure | Economic
Unexpected consensus
Need for bolder regulatory enforcement including potential company breakups
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
Arguments
Four enforcement cases already opened against Apple and Meta with significant fines imposed
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Explanation
Unexpected because Bruno represents the European Commission (regulatory authority) while Camila represents civil society, yet both acknowledge that current enforcement may need to be more aggressive, including considering company breakups
Topics
Legal and regulatory | Economic
Technical feasibility of ethical alternatives without compromising user experience
Speakers
– Hannah Taieb
– Bruno Carballa Smichowski
Arguments
Technical solutions exist for personalization without sensitive data collection, but stronger regulation needed
Open source solutions and public procurement requirements can promote alternatives to big tech dominance
Explanation
Unexpected consensus between a business representative and a regulatory researcher that ethical alternatives are technically viable and don’t require sacrificing user experience, challenging industry narratives about necessary trade-offs
Topics
Economic | Legal and regulatory | Human rights
Overall assessment
Summary
Strong consensus emerged around the interconnection between economic concentration and human rights impacts, the technical feasibility of ethical alternatives, the importance of interoperability, and the need for cross-regulatory coordination
Consensus level
High level of consensus with significant implications for policy development. The agreement between regulatory, business, and civil society perspectives suggests a mature understanding of platform governance challenges and potential solutions. This consensus could facilitate more integrated policy approaches that address both economic and human rights concerns simultaneously.
Differences
Different viewpoints
Primary regulatory approach – economic vs. human rights focus
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
– Raquel da Cruz Lima
Arguments
DMA targets gatekeepers with economic objectives to reduce market power and prevent abuse
Digital platforms act as gatekeepers of human rights, with economic concentration directly impacting fundamental rights
States have constitutional duty to consider human rights in all regulatory decisions including competition matters
Summary
Bruno emphasizes DMA’s purely economic objectives with indirect human rights effects, while Camila and Raquel argue for direct integration of human rights considerations into competition law and regulatory frameworks.
Topics
Legal and regulatory | Human rights | Economic
Scope of regulatory intervention – targeted vs. comprehensive approach
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
Arguments
DMA creates new obligations for platforms including allowing business outside platforms, data access, app uninstallation, and interoperability
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Summary
Bruno supports targeted regulatory obligations for gatekeepers, while Camila advocates for more aggressive intervention including company breakups as necessary solutions.
Topics
Legal and regulatory | Economic
Role of public vs. private solutions in addressing platform dominance
Speakers
– Bruno Carballa Smichowski
– Hannah Taieb
Arguments
Brazil’s PIX payment system demonstrates successful public alternative through mandatory interoperability requirements
Private companies can develop ethical, controllable recommendation algorithms while maintaining good user experience
Summary
Bruno emphasizes public infrastructure solutions like PIX as effective alternatives, while Hannah focuses on private sector innovation and ethical algorithm development as viable market solutions.
Topics
Economic | Infrastructure | Alternative Platforms and Market Solutions
Unexpected differences
Effectiveness of current regulatory timeline and enforcement speed
Speakers
– Bruno Carballa Smichowski
– Camila Leite Contri
Arguments
Four enforcement cases already opened against Apple and Meta with significant fines imposed
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Explanation
Unexpectedly, Bruno presents DMA enforcement as relatively successful with four cases and significant fines in just two years, while Camila argues this approach is insufficient and calls for much more aggressive action including breakups. This suggests a fundamental disagreement about whether current regulatory pace is adequate.
Topics
Legal and regulatory | Economic
Overall assessment
Summary
The main areas of disagreement center on regulatory philosophy (economic vs. human rights focus), intervention intensity (targeted obligations vs. company breakups), and solution approaches (public infrastructure vs. private innovation). Despite shared concerns about platform dominance, speakers differ significantly on implementation strategies.
Disagreement level
Moderate to high disagreement on methods and approaches, but strong consensus on the fundamental problem of platform dominance. The disagreements reflect different professional backgrounds and jurisdictional perspectives, which could complicate coordinated global responses but also provide diverse policy options for different contexts.
Partial agreements
Partial agreements
Similar viewpoints
Current regulatory approaches are insufficient and need to be more aggressive, while also ensuring broader participation from civil society in shaping these approaches
Speakers
– Camila Leite Contri
– Audience
Arguments
Competition authorities need bolder theories of harm and should consider breaking up dominant companies
Civil society organizations need empowerment to participate in market-oriented regulatory discussions
Topics
Legal and regulatory | Economic | Human rights
Platform opacity and algorithmic control enable manipulation of information and political discourse, demonstrating how technical design choices have political consequences
Speakers
– Hannah Taieb
– Camila Leite Contri
Arguments
Algorithm opacity and lack of contextual framing contributes to misinformation and undermines source visibility
Google’s political interference during Brazil’s fake news bill debate demonstrates how economic power translates to political influence
Topics
Human rights | Sociocultural | Economic
Public digital infrastructure can effectively compete with private platforms when properly designed and regulated, and this approach is particularly important for Global South development
Speakers
– Bruno Carballa Smichowski
– Audience
Arguments
Brazil’s PIX payment system demonstrates successful public alternative through mandatory interoperability requirements
Infrastructure development and funding for digital public alternatives essential for Global South countries
Topics
Development | Infrastructure | Economic
Takeaways
Key takeaways
Economic concentration in digital markets directly impacts human rights, particularly freedom of expression and access to information
The EU’s Digital Markets Act (DMA) provides a regulatory model that other jurisdictions like Brazil can adapt, focusing on preventing gatekeeper platforms from abusing their market power
Technical solutions exist for ethical algorithms and personalization without extensive data collection, but stronger regulation and incentives are needed to implement them
Public alternatives like Brazil’s PIX payment system can successfully challenge big tech dominance when combined with mandatory interoperability requirements
Cross-disciplinary dialogue between competition law, human rights, and technology experts is essential for effective platform governance
Traditional media faces economic pressures while algorithm-driven platforms concentrate information distribution, requiring new business models that prioritize transparency and user control
Resolutions and action items
Civil society organizations should learn market language and engage more actively in competition law discussions
Competition authorities should adopt bolder theories of harm and consider breaking up dominant companies
Governments should promote open source alternatives through public procurement requirements
Brazil should focus on developing digital public infrastructure and cloud alternatives to reduce dependency on big tech
Regulators should integrate human rights considerations into competition law analysis and enforcement
Stronger regulation needed to require algorithmic transparency and interoperability
Article 19’s policy paper ‘Taming the Big Tech’ should be consulted for unbundling solutions for social media platforms
Unresolved issues
How to overcome network effects that keep users on dominant platforms despite availability of alternatives
How Global South countries can develop technological infrastructure to compete with established gatekeepers
What specific incentive structures would effectively encourage users to adopt alternative platforms
How to balance the need for platform regulation with concerns about fragmenting the digital public sphere
What institutional design works best – separate regulations for economic and human rights issues versus integrated approaches
How to address the revolving door problem between government and big tech companies
What funding mechanisms can support ethical tech vendors and open source solutions at scale
Suggested compromises
Jurisdictional learning approach where countries can be ‘second movers’ and learn from EU’s DMA implementation mistakes and successes
Layered approach to alternatives – public infrastructure for some services, open source for government use, and regulation for private markets
Cross-fertilization between different regulatory frameworks (DMA and DSA) sharing procedures and monitoring techniques while maintaining distinct objectives
Cooperation between data protection and competition authorities to address overlapping concerns without conflicting decisions
Supporting both public alternatives and private ethical tech vendors rather than choosing one approach exclusively
Contextual advertising models that provide transparency about ad targeting while maintaining media funding mechanisms
Thought provoking comments
We currently have a society that is tech-mediated, our citizenship is tech-mediated… in Brazil we still have this zero rating practices, in which people that use prepaid mobile data… they mostly use the platform, the applications that don’t spend their mobile cap. So, we currently have people that have, for example, per month, four gigabytes, and WhatsApp, Facebook, and TikTok don’t spend internet, so why would someone have an incentive to use another platform, and how this is important to how the debate is developed on how people express themselves.
Speaker
Camila Leite Contri
Reason
This comment brilliantly connects economic inequality to digital rights violations, showing how market structures create barriers to free expression for lower-income populations. It demonstrates how seemingly neutral business practices (zero rating) actually entrench platform monopolies and limit democratic discourse.
Impact
This shifted the discussion from abstract regulatory concepts to concrete examples of how economic concentration affects human rights in practice. It grounded the theoretical framework in real-world inequality and influenced subsequent discussions about alternatives and infrastructure needs in the Global South.
During the week of the votation of the Brazilian DSA… Google put in the main website… ‘how can the bill, the fake news bill can worsen your internet?’… And when would you search for a fake news bill? The first link that would appear would be a sponsored link by Google saying no to censorship bill. So how can we have a free space of debating when the… basically the only search platform that people use in practice put this and change the whole debate.
Speaker
Camila Leite Contri
Reason
This example powerfully illustrates how economic dominance translates into political power, showing concrete evidence of how platforms can manipulate democratic processes. It demonstrates the concept of ‘political self-preferencing’ – extending the DMA’s economic self-preferencing rules into the political sphere.
Impact
This comment introduced a new dimension to the discussion by showing how competition law violations can directly undermine democratic processes. It elevated the conversation from market efficiency concerns to fundamental questions about democracy and political manipulation, influencing how other panelists framed the urgency of regulation.
I always felt kind of isolated in both fields, both in competition law, where you’re talking about human rights in the digital sphere, and both in civil society, in the human rights side, talking within the language of market… monopoly competition issues are key to human rights and we should analyze them together.
Speaker
Camila Leite Contri
Reason
This meta-commentary on the artificial separation between competition law and human rights advocacy identified a crucial structural problem in how these issues are typically addressed. It challenged the siloed approach that weakens both fields.
Impact
This comment set the tone for the entire discussion by explicitly calling for interdisciplinary dialogue. It validated the workshop’s premise and encouraged other participants to think beyond their traditional disciplinary boundaries, leading to more integrated analysis throughout the session.
The way personalization is done… largely by collecting private data with no or little regard for actual transparency or contextual understanding… as earlier generations were encountering information with very well-defined context in environments such as newspapers… today, many young users, the only information they encounter comes through feeds, where logic is invisible and intrusive.
Speaker
Hannah Taieb
Reason
This insight connected the loss of editorial context to the rise of algorithmic curation, showing how the shift from traditional media to platform-mediated information fundamentally changes how citizens engage with information and democracy.
Impact
This comment deepened the discussion by introducing the concept of ‘contextual framing’ as a democratic necessity. It influenced the conversation toward solutions focused on transparency and alternative business models, and connected technical algorithmic issues to broader questions about informed citizenship.
PIX… can counterbalance market power in a very strong way. But of course, these have to come with a proper regulation that makes them a real alternative… we forced all the companies, digital and non-digital, to be interoperable with PIX.
Speaker
Bruno Carballa Smichowski
Reason
This example provided a concrete model for how public digital infrastructure can successfully challenge private platform dominance, showing that alternatives are possible when combined with smart regulation requiring interoperability.
Impact
This comment shifted the discussion from purely regulatory approaches to hybrid public-private solutions. It gave concrete hope to participants from the Global South who were asking about alternatives to Big Tech dominance, and influenced the conversation toward practical policy solutions rather than just theoretical frameworks.
Why can’t we understand the human rights impact and bring this into competition authorities? But this demands bold public servants… I hope other authorities, such as the Brazilian ones, have this same openness.
Speaker
Camila Leite Contri
Reason
This direct challenge to regulatory authorities to expand their mandate and consider human rights impacts in competition cases was both a call to action and a recognition that institutional change requires individual courage within bureaucratic systems.
Impact
This comment personalized the regulatory challenge and created a direct dialogue between civil society and regulatory officials (Bruno). It moved the discussion from abstract policy to the human agency required for institutional change, and set up a framework for ongoing collaboration between different stakeholder groups.
Overall assessment
These key comments fundamentally shaped the discussion by breaking down artificial barriers between economic and human rights analysis. Camila’s interventions were particularly transformative, providing concrete examples that grounded theoretical concepts in lived experience and democratic practice. Her comments about zero-rating and Google’s political interference demonstrated how market concentration directly undermines human rights, while her call for interdisciplinary dialogue set the collaborative tone for the entire session. Hannah’s insights about algorithmic curation and contextual framing added crucial technical depth, showing how business model changes could support democratic values. Bruno’s PIX example provided hope and practical direction for Global South participants. Together, these comments created a conversation that was both analytically rigorous and practically oriented, successfully bridging the gap between competition law, human rights advocacy, and technical innovation. The discussion evolved from separate disciplinary perspectives to an integrated framework for understanding digital platform power as fundamentally both an economic and democratic challenge.
Follow-up questions
Should generative AI be included under a new category in the DMA or does it fit into existing categories like search engines?
Speaker
Bruno Carballa Smichowski
Explanation
This is an ongoing discussion about how to regulate emerging AI technologies within the existing DMA framework, which is crucial for determining regulatory scope and enforcement.
How can the Global South increase protagonism in competition scenarios when lacking infrastructure to create alternatives to dominant platforms?
Speaker
Laura (audience member from Brazil)
Explanation
This addresses the fundamental challenge of developing competitive alternatives in regions with limited technological infrastructure and resources.
How can incentives to use big tech services be diminished when alternatives are available but big tech platforms remain easier to use?
Speaker
João (audience member from Brazil)
Explanation
This explores the practical challenge of user adoption of alternatives despite regulatory changes that create more options.
How is Brazil handling local champions, and are there nationwide platforms that could be considered local champions?
Speaker
Jacques Peglinger
Explanation
This examines how competition policy addresses domestic market leaders versus international platforms, which is important for understanding comprehensive market regulation.
How can human rights organizations be empowered to join economic-oriented regulatory conversations about platform governance?
Speaker
Beatriz (University of Sussex)
Explanation
This addresses the need for interdisciplinary collaboration between human rights advocates and competition/market regulators for more holistic platform governance.
What are the priorities in terms of regulations to increase competition and create markets more respectful of human rights?
Speaker
Raquel da Cruz Lima
Explanation
This seeks to identify the most important regulatory interventions needed to achieve both competitive markets and human rights protection.
Are there changes in breaking up the advertising market concentration and making it more aligned with human rights?
Speaker
Raquel da Cruz Lima
Explanation
This examines whether regulatory efforts are successfully addressing the concentrated advertising market dominated by major platforms.
How can competition authorities develop bolder theories of harm to address the unmeasurable impact of big tech on society?
Speaker
Camila Leite Contri
Explanation
This explores how competition law enforcement could be strengthened to better address the broader societal impacts of platform dominance beyond traditional economic harms.
How can Brazil focus on data sovereignty and address concerns about revolving doors between government and big tech cloud companies?
Speaker
Camila Leite Contri
Explanation
This addresses the need to examine conflicts of interest and dependency issues in critical digital infrastructure decisions.
How can digital public infrastructures be funded and alternatives created from both small companies and the public sector?
Speaker
Camila Leite Contri
Explanation
This explores practical mechanisms for developing competitive alternatives through public investment and support for smaller market players.
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.