WS #179 Privacy Preserving Interoperability and the Fediverse
24 Jun 2025 13:30h - 14:30h
Session at a glance
Summary
This panel discussion focused on privacy preservation within the Fediverse, also known as the open social web, which enables interoperability between different social media platforms through protocols like ActivityPub. The session was moderated by Mallory Knodel from the Social Web Foundation and featured panelists from various organizations including the Data Transfer Initiative, Meta, and academic researchers specializing in digital competition law.
The discussion explored fundamental challenges in maintaining user privacy when data flows between interconnected but independently operated platforms. A key technical issue highlighted was that when users delete posts on one platform, those deletions may not propagate across all federated services that have already received the content. This creates complications for compliance with privacy regulations like GDPR, which grants users rights to delete their personal data.
Panelists examined how existing privacy laws apply to decentralized systems, noting that current regulations were designed for centralized platforms and may need adaptation for federated environments. The European Union’s Digital Markets Act was discussed as potentially expanding interoperability requirements from messaging services to social media platforms in future reviews.
The conversation addressed governance challenges in distributed systems, emphasizing the need for shared standards and trust mechanisms across multiple platforms and instances. Speakers highlighted the importance of user education about how federated systems work, as many users may not understand that their posts can travel beyond their original platform.
The panel concluded that while the Fediverse offers promising opportunities for user agency and platform choice, realizing privacy-preserving interoperability requires ongoing collaboration between platforms, standards bodies, regulators, and civil society to develop appropriate technical and governance frameworks.
Key points
## Major Discussion Points:
– **Privacy challenges in interoperable social media systems**: The panel explored how federated platforms like the Fediverse can preserve user privacy while enabling cross-platform interaction, including technical issues like post deletion across multiple instances and the complexity of maintaining user control over data that flows between different services.
– **Regulatory frameworks and compliance in decentralized environments**: Discussion focused on how existing privacy laws like GDPR and emerging regulations like the Digital Markets Act apply to interoperable social media, with particular attention to the challenges of enforcing data subject rights (deletion, portability, access) across distributed networks.
– **Technical implementation and standardization of interoperability**: The conversation addressed the practical aspects of building interoperable systems, including the role of standards bodies like W3C and IETF, the importance of shared governance models, and the need for trust mechanisms between different platforms and instances.
– **User experience and education in federated systems**: Panelists discussed the challenge of making complex interoperable systems accessible to average users, including onboarding processes, setting appropriate defaults, and helping users understand what happens to their data when it crosses platform boundaries.
– **Governance and responsibility distribution across diverse stakeholders**: The panel examined how to manage a decentralized ecosystem involving thousands of instances and platforms, including the development of shared norms, trust registries, and coordination mechanisms that don’t centralize control but provide necessary oversight.
## Overall Purpose:
The discussion aimed to explore how privacy can be preserved and user rights protected in interoperable social media environments, particularly the Fediverse, while addressing the technical, legal, and governance challenges that arise when data and interactions flow across multiple platforms and instances.
## Overall Tone:
The discussion maintained a collaborative and constructive tone throughout, with panelists building on each other’s points rather than debating. The conversation was technical but accessible, with speakers acknowledging the complexity of the issues while expressing cautious optimism about solutions. The tone became more interactive and engaged during the Q&A portion, with audience questions adding practical perspectives and real-world concerns to the theoretical framework established by the panel.
Speakers
– **Mallory Knodel** – Executive Director and co-founder of the Social Web Foundation, focused on building a multipolar Fediverse with emphasis on human rights and community building
– **Chris Riley** – Works on interoperability and digital policy issues, focused on user experience in federated systems
– **Delara Derakhshani** – Director of Policy and Partnerships at the Data Transfer Initiative (DTI), focused on data portability and user agency in data transfers
– **Ian Brown** – Technologist and academic, security and privacy expert, co-author of “Regulating Code,” works on digital competition law reform and interoperability, particularly with EU Digital Markets Act
– **Audience** – Various audience members who asked questions during the Q&A session
– **Melinda Claybaugh** – Director of AI and Privacy Policy at Meta, working on Threads product and Fediverse interoperability
**Additional speakers:**
– **Edmund Chung** – From .Asia organization
– **Dominique Zelmercier** – From W3C (World Wide Web Consortium)
– **Caspian** – Audience member who asked about 10-year vision
– **Winston Xu** – From One Power Foundation in Hong Kong
– **Gabriel** – Audience member who asked about trust between instances
– Various other unnamed audience members who asked questions
Full session report
# Privacy Preservation in the Fediverse: Panel Discussion Report
## Introduction and Panel Setup
This panel discussion was moderated by Mallory Knodel from the Social Web Foundation as part of an Internet Governance Forum (IGF) session examining privacy challenges in federated social media systems. The session brought together experts to discuss how user privacy can be maintained when data flows between independently operated but interconnected platforms using protocols like ActivityPub.
The panelists and principal contributors included:
– Ian Brown, discussing regulatory frameworks and interoperability requirements
– Melinda Claybaugh from Meta, representing industry perspectives on federated systems (her full introduction was cut off in the recording)
– Dominique Zelmercier from the W3C, bringing standards body expertise from the floor
– Delara Derakhshani from the Data Transfer Initiative, focusing on user data portability
– Chris Riley, contributing insights on user experience and interoperability myths
The discussion emerged against the backdrop of growing interest in decentralised social media alternatives, particularly following user migrations from centralised platforms, with Chris Riley noting people “noisy quitting X” and subsequently “quiet quitting Mastodon.”
## Core Technical Challenges
### Data Deletion and Cross-Platform Propagation
Mallory Knodel highlighted a fundamental technical problem: when users delete posts on one platform, those deletion requests may not effectively propagate across all federated instances that have already received the content. This creates complications for privacy regulation compliance, particularly with GDPR requirements for data deletion.
Ian Brown provided a concrete example of this challenge, describing a scenario where a user sets their Mastodon posts to auto-delete after two months but uses a bridge service (Bridgy Fed) to share content with Bluesky. The user’s privacy preferences from Mastodon do not automatically transfer to Bluesky, creating gaps in privacy protection across the federated network.
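To make the propagation gap concrete, here is a minimal sketch of how an origin server might federate an ActivityPub `Delete` activity wrapping a `Tombstone`, the mechanism the protocol provides for retractions. The actor URL, post ID, inbox list, and the omission of HTTP signature handling are illustrative simplifications rather than any particular server’s implementation; the point is that delivery only reaches inboxes the origin server knows about, so copies held by relays, bridges, or further downstream servers may never see the request.

```python
import json
import urllib.error
import urllib.request

# Hypothetical identifiers; a real server would use its own actor and status URLs.
ACTOR = "https://social.example/users/alice"
POST_ID = "https://social.example/users/alice/statuses/12345"

# In a real implementation this comes from the follower/shared-inbox collections;
# here it is a stub. Servers not on this list are simply never told about the delete.
KNOWN_INBOXES = [
    "https://mastodon.example/inbox",
    "https://bridge.example/inbox",  # a bridge may or may not relay the Delete onward
]

# An ActivityPub Delete activity wrapping a Tombstone for the removed post.
delete_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": POST_ID + "#delete",
    "type": "Delete",
    "actor": ACTOR,
    "object": {"type": "Tombstone", "id": POST_ID},
}


def deliver(inbox: str, activity: dict) -> bool:
    """POST the activity to a remote inbox (HTTP signature handling omitted)."""
    request = urllib.request.Request(
        inbox,
        data=json.dumps(activity).encode("utf-8"),
        headers={"Content-Type": "application/activity+json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=10):
            return True
    except (urllib.error.URLError, TimeoutError):
        # Delivery is best-effort: an unreachable or uncooperative server keeps its copy.
        return False


if __name__ == "__main__":
    for inbox in KNOWN_INBOXES:
        delivered = deliver(inbox, delete_activity)
        print(f"{inbox}: {'delivered' if delivered else 'not delivered'}")
```

In practice this is why deletion in the Fediverse is best-effort: well-behaved servers honour the Tombstone, but nothing in the transport guarantees that every copy disappears.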
### Bridge Services and Privacy Preferences
The discussion revealed significant challenges with bridge services that connect different federated platforms. Ian Brown noted that Bridgy Fed, which bridges content between platforms, doesn’t currently recognise or apply user privacy preferences across different systems. This technical limitation demonstrates the gap between user expectations and current implementation reality.
### Standards and Implementation Complexity
Dominique Zelmercier from the W3C raised concerns about the granularity of privacy controls in standardised systems, observing that standardisation processes typically produce simple binary signals (“do or do not”) that are far too coarse for the complexity of social media interactions, where users want nuanced control over their digital presence.
## User Experience and Understanding
### User Comprehension Challenges
Melinda Claybaugh emphasised that many users may not understand what it means to post content on federated platforms or comprehend how their content might be shared across networks. She noted that users posting on Threads, for instance, may not realise their content could be shared with other services in the federation.
The discussion revealed that different user types—power users, technical users, and average users—require different levels of explanation and control options. Claybaugh stressed the importance of meeting people where they are and designing systems that accommodate varying levels of technical literacy.
### The “Myth of the Superuser”
Chris Riley made a significant intervention challenging what he called the “myth of the superuser.” He argued that the goal should be making interoperability invisible to users rather than requiring them to become technical experts. Riley emphasised that most users want good defaults rather than complex configuration options, creating tension between user education approaches and user experience simplicity.
## Regulatory Framework Discussion
### Existing Privacy Laws and Federation
Ian Brown discussed how existing privacy regulations like GDPR apply to decentralised systems, noting these laws provide important legal backstops against bad actors who ignore user data preferences. However, he acknowledged that current regulations were designed with centralised platforms in mind.
Brown cited what he recalled as GDPR Article 22, noting he was speaking from memory and would need to verify the reference; the obligation he described, requiring an organisation that has shared personal data to inform downstream recipients when the data subject withdraws consent or requests erasure, in fact sits in GDPR Articles 17 and 19, while Article 22 concerns automated decision-making. His broader point was that existing legal frameworks need fresh implementation approaches for federated environments.
### Digital Markets Act Implications
Ian Brown provided insights into the European Union’s Digital Markets Act (DMA), which currently includes messaging interoperability requirements for designated gatekeepers. He presented slides showing the complexity of the regulatory requirements, which he acknowledged were “too complicated” and which the moderator described as “impossible to read,” illustrating the challenge of implementing nuanced regulatory frameworks.
Brown suggested that regulatory frameworks are evolving to require dominant firms to enable interoperability, representing a shift from traditional competition law approaches.
## Governance and Instance-Level Management
### Defederation as Governance Tool
Mallory Knodel highlighted defederation—where instances disconnect from others—as a mechanism for protecting users from bad actors whilst maintaining overall system openness. This represents a form of distributed governance that allows communities to maintain their safety standards without requiring centralised oversight.
### Government and Institutional Roles
Knodel made a specific policy recommendation, suggesting that “maybe governments should have instances where they’re the source of truth that’s verified.” This reflects broader discussions about how institutional actors might participate in federated systems whilst maintaining credibility and user trust.
Ian Brown supported this direction, arguing that governments should share information across multiple platforms rather than forcing citizens to use single platforms for accessing public services.
## Audience Questions and Key Responses
### Digital Identity and Democratic Institutions
An audience member raised questions about digital identity ownership and its relationship to democratic institutions. The panelists discussed how federated systems might support more democratic approaches to digital identity management, though specific solutions remained largely theoretical.
### Technical Standards Bodies Role
Questions about the role of standards organisations like W3C and IETF revealed their crucial function in developing interoperability standards and verification processes. The discussion highlighted both the importance of these bodies and the limitations of current standardisation approaches for complex social media interactions.
### Youth and Internet Interoperability
An audience question about youth involvement in internet interoperability prompted discussion about generational differences in approaching federated systems and the importance of including diverse perspectives in governance discussions.
### Cultural Context and Platform Migration
A significant audience question addressed how cultural context changes when content moves between platforms with different community norms. This highlighted that interoperability involves not just technical challenges but also social and cultural considerations that affect user safety and community dynamics.
### Trust and Security Between Instances
Questions about trust between instances and security concerns revealed ongoing challenges in federated systems. The panelists acknowledged that security vulnerabilities, particularly around direct messages and encrypted content, require continued attention as federation scales.
## Data Portability and User Migration
### Beyond Technical Transfer
Delara Derakhshani provided crucial insights into the complexity of platform migration, noting that it involves not just data transfer but joining new communities with different safety considerations. For marginalised groups particularly, the safety of the destination platform matters as much as the ability to bring content.
This reframed interoperability from a purely technical challenge to a human-centred one, highlighting that data portability is fundamentally about community safety and user agency.
### Trust Registries and Verification
Derakhshani emphasised the need for trust registries and verification processes to reduce duplication and streamline onboarding across platforms. These mechanisms could provide coordination without centralising control, addressing fundamental challenges of distributed governance.
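As a purely hypothetical sketch of the kind of coordination mechanism described here (the field names, registry contents, and lookup flow below are invented for illustration and are not DTI’s actual registry), a destination check during a user-initiated transfer might look something like this:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RegistryEntry:
    """Hypothetical shape of a shared trust-registry record (invented fields)."""
    service_domain: str
    verified: bool
    supports_deletion_propagation: bool
    moderation_policy_url: str


# Stub standing in for a shared, multi-stakeholder registry service.
TRUST_REGISTRY = {
    "newhome.example": RegistryEntry(
        service_domain="newhome.example",
        verified=True,
        supports_deletion_propagation=True,
        moderation_policy_url="https://newhome.example/policies",
    ),
}


def check_destination(domain: str) -> Optional[RegistryEntry]:
    """Consult the shared registry before starting a user-initiated transfer."""
    entry = TRUST_REGISTRY.get(domain)
    if entry is None or not entry.verified:
        # Unverified destination: surface a warning to the user or refuse the transfer.
        return None
    return entry


if __name__ == "__main__":
    destination = check_destination("newhome.example")
    if destination:
        print(f"Verified destination; moderation policy: {destination.moderation_policy_url}")
    else:
        print("Destination not found or not verified in the shared registry.")
```

The design intent discussed on the panel is that a single shared verification record replaces each platform running its own duplicate vetting of the same destination.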
## Future Vision and Ongoing Challenges
### Long-term Interoperability Goals
Mallory Knodel described a future where users never have to sign up for new services whilst still being able to follow people across platforms. Ian Brown envisioned social media becoming as configurable and flexible as the internet at the IP layer.
### Persistent Technical Issues
Several technical challenges remain unresolved, including:
– Propagation of deletion requests across federated networks
– Security vulnerabilities in direct messages and encrypted content
– Maintaining consistent user experiences across platforms with varying capabilities
– Handling privacy controls and auto-deletion features across different systems
### Scaling Governance
As federated systems potentially scale to thousands of instances, new coordination mechanisms will be needed to maintain trust and shared standards. The panelists acknowledged that current approaches may not be adequate for the scale of federation envisioned.
## Key Takeaways
The discussion revealed both the promise and complexity of building privacy-preserving federated social media systems. While technical solutions for basic interoperability exist, significant challenges remain in:
1. **Privacy Preservation**: Ensuring user privacy preferences and deletion requests propagate effectively across federated networks
2. **User Understanding**: Helping users comprehend the implications of posting in federated environments without overwhelming them with technical complexity
3. **Governance at Scale**: Developing coordination mechanisms that maintain standards and user protections across distributed systems
4. **Regulatory Adaptation**: Implementing existing privacy laws in federated contexts while potentially developing new regulatory approaches
The panelists emphasised that solving these challenges requires sustained collaboration between platforms, standards bodies, regulators, and civil society organisations. Success depends not just on technical innovation but on creating governance structures and user experiences that serve diverse communities while maintaining the openness and user agency that make federation attractive.
As Mallory Knodel noted with humour, the discussion completed her “bingo card” of IGF topics, reflecting how federated social media intersects with many core internet governance challenges that the community continues to address.
Session transcript
Mallory Knodel: Hi, everyone. Welcome to the session. I want to first start with a short bit of housekeeping, which is that if you’ve joined us in person, we’re so grateful, and we also have loads of seats up here at the roundtable that we would invite you to occupy if you’d like to. It means you get a microphone that’s always on, so you could ask questions directly. Otherwise, we’ll take questions towards the end, and there are mics also out in the audience, so it’s your choice. We’re also monitoring, for those of you who’ve joined online, we’ll be monitoring the questions that you drop in there as well. So welcome. We’re talking today about the Fediverse, also known as the open social web. By design, imagine this as the openness means that there’s an interoperability element that’s very critical, and we are curious in this panel about a lot of things, but particularly about the ways in which privacy can be preserved in these interoperable environments. So you’re all internet experts. You understand interoperability, and you’re here because you’re excited about the web, and we are too. So this is the panel up before you. We don’t have any remote panelists. I’m going to allow the panelists to really present this idea to you in stages. So if you’re not sure what the Fediverse is, if you’re not sure about what privacy and interoperability have to do with each other, I’m confident that the questions that we’ve posed to one another are going to help build that narrative for you, and of course come with your questions. As well, I’m going to ask the panelists, please, the first time they speak, if they can just give a brief intro into where they work, their interest in the topic. And so that way, I don’t have to read bios aloud to you all. But I should start with myself, actually, right? So they’re all switched on all the time. You have to wear a headset, though. OK, OK, that would help. So yeah, I’m Mallory Knodel. I am the executive director, one of the founders of the Social Web Foundation, which sees itself really as a steward of a multipolar Fediverse. And so that does have elements of protocol. There’s the ActivityPub protocol. There are other open protocols that are hopefully, towards the future, interoperable with each other. It also has this very strong element of community building that we all can build and construct the communities we want online, move them around, subscribe, follow one another without having impediments to that. It’s a really great vision. And I think one of the crucial pieces for me, as someone who’s been in this space with you all for many years, is this element of human rights and of we’re building a Fediverse, or we’re building an open internet, but for what? And so human rights and that sort of thing brings me to this work. So that’s me. I wanted to pose the initial question to the panel, where each person can introduce themselves. Just what do you think we need to know as a level set for this issue of privacy interoperability? It can be a short intervention, but just to try to introduce the audience and the participants to what are the edges of this issue, and why are we sort of here today, what do we care about? So keep the very first initial question maybe two minutes, when the rest of our questions will be maybe around more four or five minutes. So Delara, can I start with you, please?
Delara Derakhshani: Yes, absolutely. Thank you all so much for being here and for having me. So my name is Delara Derakhshani. I serve as Director of Policy and Partnerships at the Data Transfer Initiative. Our entire mission is to empower users with regard to data portability. I think it’s helpful to give just a little bit of context about what we do, and you’ll see how it’s connected to our work in the Fediverse. When we’re talking about data portability in this context, it essentially means that at the user’s request, companies should move bulk transfers of users’ personal data from one platform or service to another. Our work at DTI is centered on translating legal principles into real world practice. On the product side, we are building open source tools with private sector companies. And on the policy side, where I sit, we work closely with regulators and policymakers to serve both as a resource and help implement portability solutions that are both meaningful and workable in practice. Ultimately, our goal is an ecosystem that promotes user agency, competition and innovation. And I am heartened to see so many folks at the table today, because a lot of these issues are absolutely relevant to the Fediverse, and I’m looking forward to diving in deeper.
Mallory Knodel: Great, awesome. Let’s go to you next, please.
Melinda Claybaugh: Hi, thanks for having me and it’s a pleasure to be here. I’m Melinda Claybaugh. I’m a Director of AI and Privacy Policy at Meta. And I’m here today, our interest in the Fediverse and interoperability has really crystallized around our product we launched a couple years ago called Threads, which most people are familiar with Facebook, Instagram, WhatsApp, all of those apps. Threads is our newest app and it is…
Ian Brown: I’ve been working on this since 2008 with my co-author, who just happens to be sitting over there, Professor Chris Marsden. We wrote a book in 2013 called Regulating Code, where we talked about interoperability as a solution to a number of difficult policy trade-offs and issues, and I’m really delighted to see it so high on the radar of many regulators. I spent a lot of the last five years talking to EU policymakers, because they have actually put it into a law called the Digital Markets Act, which passed a couple of years ago. And I’m going to show you later a couple of tables. I’m not going to talk through them, I’m not going to do death by PowerPoint, don’t worry, but just point you to a lot more information if you want detailed technical background. I’m a technologist, I’ve written a lot about it on my website, ianbrown.tech. I should disclose that earlier this year Meta commissioned from me an independent assessment, and it was independent, of certain aspects of how the Digital Markets Act interoperability obligation is being applied to iOS and to iPadOS, and the privacy and security implications of that. I’m actually a security and privacy person by background, although I’ve done a lot on digital competition law reform over the last five years. And I should also disclose that I’m about to, fingers crossed, start consulting for the UK Competition and Markets Authority on precisely this topic.
Mallory Knodel: Yeah, so you can see the spread, we’ve got an awesome panel. I can tell also from the audience we’ve got some really important stakeholders in this room that know a lot about this topic, so I’m counting on you to come in later on with your questions, and there are mics there. There are a whole bunch of mics up here if you want to sit and take one, you are welcome. But let me get on with it. So in my view, this is an obvious question, right, and we see it a lot. There’s an intuition that users have, so the same kinds of users that are excited about using open social protocols for, you know, consuming content, articles, sharing pictures and so on, are the same users that are going to care a lot about their privacy. And so when this comes into play, you know, if we have more of a permissionless ecosystem where different entities are now not just maybe in business relationships, which is something I think we’re attuned to in the old social media, but in the new social media it’s more of, you know, instances allowing, allow listing, block listing, that sort of thing. I think there’s an intuition that some users have that they might take those actions at the instance level, maybe because of privacy concerns or maybe because they don’t want to be in relationship with or associate with other corners of this very open, interoperable ecosystem. So that’s the sort of general reason why I think we’ve gathered today, and hopefully we’ve started out strong by explaining why each of us are here and why we care about that. So first to Delara, and Ian, I want you to respond also to this question, but first: how can interoperable systems respect user privacy and give users control over their data?
Delara Derakhshani: So I’ve thought about this question a lot, and one of the things I want to start with is what I think success would look like in an ideal world, and I think at its core it starts at user agency. Users should know what’s being shared, when, with whom, and why, and interoperability shouldn’t necessarily mean default exposure. You know, there are technical design issues, real-time post visibility across services, activity portability, and, you know, social graph portability as well. I mean, I think there’s an education component that could be furthered to the benefit of all in the ecosystem. But maybe most important is, you know, the idea of what is the best way to share data. And maybe most important of all is actually a cultural question, a commitment to shared governance and iterative collaboration. Working in the open is a start, but true privacy-respecting interoperability demands an ongoing habit of listening, engaging with standards bodies with respect, and staying within the space of a secure dialogue that reaches out to everyone. These all strike me as some of the ideals that we should have. I will also note there are, at its core, some challenges that come with interoperability. I mean, it’s all about freedom of movement, enabling users to take their posts, profiles, and relationships across platforms. It challenges lock-in, a goal the Fediverse aspires to, but some may argue has not fully delivered on yet. And of course, interoperability is not without its challenges. For example, people may want to move from one server to another and bring all of their content and relationships with them. That may not always be possible. I can speak later to some of the solutions that we’re looking at at DTI. And something that you’ve touched on earlier, and I think it’s relevant to human rights. You know, there’s another challenge, and that’s that decentralized systems like the Fediverse can empower users to find and build communities. But inevitably, that introduces friction. Different services have different norms, policies, controls. It can lead to inconsistent experiences. But beyond that, migration is not just about exporting and importing data. It’s about joining new communities. We’ve heard repeatedly in my experience and our work at DTI that for many users, particularly marginalized groups, the safety of the destination matters just as much as the ability to bring the content. So trust-building tools must also not just address privacy, but also moderation, consent, and survivability. And in recognition of time, I’ll stop there. I’m happy to expand further on any of those points down the line.
Mallory Knodel: Yeah, especially if there are questions from the audience about that. Ian, I want you to respond, but because you’re a technologist, I’m going to ask you if you can slightly explain, maybe for folks who don’t know, that like on ActivityPub, for example, if you delete a post, that doesn’t necessarily mean it’s deleted everywhere. And you could maybe try to hook in some of the actual ways in which these things work and the limits there for privacy and so on. But whatever you were going to say, do that too.
Ian Brown: That’s good. That’s a challenging question, but I like challenging questions. So, let me say, could our technical friends at the back, I wonder if you could put my first image up on screen, the XKCD cartoon. I’ll just leave you to look at it at your convenience. So, interoperability, I think, first of all, it sounds, to people who aren’t deep in the weeds of this, it sounds a very abstract, weird, geeky, techy notion, and it’s not an end in itself. It’s a means to greater competition, greater user choice, diversity of services, ability of people to talk across communities. I love that idea. It helps people join new communities, try out new communities. You can see this. I wonder if we could have an element of audience interactivity, as we always are told to do in universities to make sure all the students are awake. Put your hands up if you’re on Blue Sky. Yeah, a lot of people. I thought in this IGF community that would be the case. Via a bridge. Well, I was going to come to that. That’s good. That’s also good. Who’s also or differently on Mastodon or another part of the Fediverse? And still quite a few people, which, again, I imagine, I’m not going to go and ask people individually. More of the technical community are on Mastodon for various reasons. More of the policy community on Blue Sky. As Mallory said, there are services now called bridges, which let people, I think the opt-in approach is absolutely right. People who want to can choose to say, OK, from my Mastodon account, I want people on Blue Sky to be able to follow me on Blue Sky and to like my posts and to repost them and to reply to me. So I’ll see them on Mastodon and vice versa. Mallory got straight to one of the one of the important technical questions, which I think this XKCD cartoon, which someone in a tech company just sent me yesterday evening, knowing I was talking on this panel today, the details are very important for human rights, for protecting privacy, for enabling freedom of expression and opinion. for making sure people don’t get harassed by being exposed to communities they might not want to. So to give you another example, there are bridges between X, X Twitter, and some of these other services and I quite understand why many people on Blue Sky and Mastodon would not want people on X to be able to interact with them because there’s a lot of harassment that goes on on that service sadly and I deleted my own account a month or two ago. The very specific question Mallory asked, okay you might be on one service, so on Mastodon for example I know because I do it myself, you can set your posts to auto delete after X weeks or months and I do after two months because I want it to be a conversational medium, not a medium of record. I don’t write social media posts with the care that I write academic articles or submissions to parliamentary inquiries and so on. I want to talk to people, I want to share ideas, I want to be able to make mistakes. That’s a key thing that privacy enables. You can try new things out. It’s actually one of the really critical elements of children growing up, those of you who are parents or have nephews, nieces or are familiar with educational psychology, that people can make mistakes. That’s how we learn. You don’t want people to think if I make a mistake it’s going to be imprinted forevermore on my permanent record and employers, universities, governments, when I enter the United States for example, might be checking my social media posts as is the current US government policy. 
So that very specific question Mallory asked I think is a really good one, because I think it gets to the crux of some of these issues. If, say, Mallory has, let’s use me as the example, so I delete my Mastodon posts after two months. I use a bridge, it’s called Bridgy Fed, I love the name, which lets people on Bluesky read my Mastodon posts. Bluesky is not necessarily currently going to, you know, I have a Bluesky account, but Bluesky currently does not have an option to auto-delete all my posts that I type myself into Bluesky, nor does Bridgy Fed currently pick up my preference from Mastodon to auto-delete everything after two months and then apply that on the other side of the bridge. It could; from a technical perspective it’s relatively straightforward. Also, from a legal perspective, I should mention, in Europe at least and in many other countries, so up to 130 other countries now who have implemented GDPR-like privacy laws, and included in the GDPR, I think it’s Article 22 but I’d have to check from memory, GDPR says if an organisation publishes your information in some way, if it shares it, like a bridge, like a social media service, and then you withdraw your consent for the use of that data, the company has to stop using it, but also the company has to make best efforts to tell other organisations it’s shared your personal data with that they should stop using it as well. And I think that’s a good legal way of dealing with this issue, Mallory.
Mallory Knodel: Good, well and so you set us up really nicely. I think the other, before I move on to the GDPR question, I wanted to just say I think our intuition around how social media works is clearly evolving and I think that while we rely a lot in the privacy realm and the security realm for that matter on how users think about their own data, how users think about their own experience when there’s an absence of regulation, I think this is really really critical and I think why we have to have this conversation now because the new social media is introducing so many different dimensions, it’s introducing so many ways to interact. So let’s talk about regulation because even with the existing regulation we have, even in a sort of old social media regime, there’s still these questions. So Melinda, can you tell us about what challenges you face regarding compliance with national and regional privacy regulations like GDPR and how in a decentralized environment how those challenges can either be exacerbated, alleviated, but you know essentially you know how should Fediverse platforms In instances better aligned with privacy laws in general.
Melinda Claybaugh: Yeah, it’s a really great question. And I guess this is a perfect tee-up from you before me. And so I think, you know, these fundamental concepts that are enshrined in GDPR, and now in many, many, many other laws around the world, are ones around your data subject rights: your rights to access information that you’ve shared, your right to correct, and to transfer, and to delete. I mean, these have been enshrined for a long time now, and people have come to expect them and rely on them and operate accordingly. And that works really well for things like, you know, if you post something on Instagram and then you want to delete it, it’s deleted. It works less well in this kind of Fediverse concept. So if you post something on Threads and then you delete it, we will send that request along the chain to wherever, you know, to Mastodon or wherever it may have also flowed. But we don’t have a mechanism for ensuring that that post is deleted down the chain. And I think this is where it’s so important that we need to take a fresh look at the existing, you know, data protection regimes. Not to say that those rights aren’t important. They’re important, and they should stay. But the implementation, I think, is where we need to come together, also to Delara’s earlier point, come together with some norms and expectations within industry around how this should work. Equally, I think there’s a real challenge as this new social media proliferates. I think, you know, those in the tech community are already well-versed in these communities, but to your average user who maybe is starting on Threads or starting on Bluesky or, you know, wherever people went after Twitter, you know, they may not understand this network. They may not understand really what it means to be posting something on Threads and then have it go to other services. And so that’s where I think the user education piece, and really understanding user expectations, importantly, is going to be really critical. So at Threads, what we do is when you’re first onboarded onto Threads, we, you know, kind of explain with pictures what the Fediverse is and what it means for what you’re posting and what happens when you delete. And so I think we have to really meet people where they are and understand kind of who’s a power user, who’s a tech user, and then who’s just your average user who maybe is poking around and trying out new things online, which is great. So I think the collision of the new social media and the legal regime is one that needs to be worked out in the regulatory space, in the industry space among companies, and then at the user experience level as well.
Mallory Knodel: Absolutely. And thank you. Yeah, thanks for that. Hopefully there are questions from the audience about that. I think one thing I wanted to just point out is one part of how the ecosystem is evolving now, although it could always change because it’s interoperable and open and, you know, who knows what can happen, is that it feels like a lot of there are kind of two ways to get on the Fediverse. Like maybe you join a new platform for the first time because you want to try it out, like maybe Pixel Fed seems cool. And so you’re new there and you get on boarded there or you’re on already a platform that you’ve been on for a while and that platform may decide to start adopting open protocols. And so the onboarding process and that movement process looks different. And that’s kind of the beauty is that it’s it’s very diverse in how people arrive in this space. And we’re still trying to develop and communicate with end users about how that works. I’m going to move over to another regulation that you’ve already mentioned, Ian. This one’s to you. And then, Melinda, I’d love you. to respond as well. So what are the opportunities to influence the Digital Markets Act? I mean, it’s out, it’s baked, but we want to see if there’s a possibility to define or enforce interoperability requirements for the existing social media platforms. That’s, as far as I know, not part of how the DMA is structured now, but there could be some opportunities and it sounds like you’ve been working on that. So tell us.
Ian Brown: I’ve been working endlessly on that for the last five years. I must have bored all my friends to tears by now on that subject. So the DMA has a messaging interoperability requirement. So that currently applies to WhatsApp and Facebook Messenger and actually Meta, and again, to emphasise, we didn’t practise this before, we did not line up questions, we’ve never met before and my assessment ended at the end of February. But actually, I’m not always positive about Meta, but actually what Meta has done on WhatsApp privacy and interoperability is really interesting and great. I think a lot of user research and fine tuning, in the same way Melinda has talked about the Threads experience, exactly the same with WhatsApp, and I think the rest of the industry could really, it would be great if Meta shares that as much as possible with other industry players and open source projects, so they can all take advantage of that great research Meta has done in that area. Civil society groups got this close to also requiring social media, the biggest, so the DMA only applies to the very, very largest companies, which the DMA calls gatekeepers, there are only seven companies it applies to, the obvious, you can probably guess which they are, I can read the list out later if you care, but Meta is one, Apple is another, Microsoft is a third, ByteDance, TikTok is covered. Civil society got this close to saying social media from the very, very largest companies must also be interoperable in the way that Threads is. I mean, I’m not using, actually, I am using Threads currently, but mainly I follow people on Threads from Mastodon, not from Threads. I follow Yann LeCun, for example, who’s Meta’s chief AI scientist, because he says very interesting, quite independent-minded stuff. I’m sure Zuckerberg sometimes rolls his eyes at, you know, the things LeCun says, and that’s a good sign of independence, I would say. What can we do? You know, did civil society miss its chance? Because it persuaded the European Parliament, but it couldn’t persuade the member states, and therefore the final legislation includes it for messaging, but not for social networking services. But this is the cliffhanger. Next year, and actually every three years, the European Commission has to review the operation of the legislation. That’s very standard in European law, and they have to very specifically every three years consider should Article 7, which is the messaging interoperability obligation for gatekeepers, be extended to gatekeeper social media services. And they, you know, we can say pretty much what they would be, probably, because you can look at who are the designated gatekeepers. I already said Meta is one. You know what social media, they’re called online social networking services in the DMA. You know what online social networking services with 45 million users in the EU, 10,000 small business users, those are the two criteria. You could figure that out for yourself if you wanted to. If the Commission recommended this and the European Parliament chose to act on the European Commission’s recommendation, the DMA could be extended, so that would become a legal obligation for these very largest social media companies in the way it already is for messaging. And for those of you who are not European, don’t worry, I can sort of feel you rolling your eyes that, you know, these Europeans are, you know, they love passing their baroque regulatory regimes, but what relevance does this have to us?
Actually, like the GDPR, many other countries around the world are already putting these kind of principles into their own national competition laws and related laws. And I think also another trend we see is traditional competition law is very rapidly moving into this space. So though, if you want to look it up, I’ll just say one sentence about this. There was a European Court of Justice decision very recently called Android Auto, which basically says dominant firms must enable interoperability for their services full stop. Otherwise, it’s an abuse of dominance if they don’t. And I think that could have enormous effects around the world.
Mallory Knodel: I mean, you bring up this idea of jurisdiction, right? So Europe might have done GDPR, has the DMA, the DSA, etc. But I can’t even really picture how this would work if it only applied to Europeans, right? Because I mean, it seems to me, and this is maybe Melinda, I’m sure you have a response here, but to add on to it, you know, how could a company like only be interoperable in one jurisdiction rather than like, technically speaking, it would seem to me to be if you’re interoperable in one place, you’d be sort of interoperable for everyone. But anyway, that’s just my small minded thoughts on that. But tell us your reaction.
Melinda Claybaugh: I mean, I don’t have a specific answer to the jurisdiction question. But I mean, I think it does lead to a larger question around what is interoperability? What are we talking about here, right? Because you can talk about, it’s one thing to talk about, you know, text-based posts, kind of all, you know, that services are being built on a certain protocol, and it’s easier to build that from scratch, right? So something like threads and being able to federate your posts, that makes sense. Messaging, you know, is analogous to email, we kind of all understand that you should, you know, there are benefits to being able to send messages in different ways. But it’s very complicated when you start to get into encryption and encrypted messaging, you kind of start to run into the privacy and security issues. The issues then moving to a social media interoperability, writ large, right, of everything that happens in social media, is a whole other can of worms. And so I think that beyond the technical challenges, and I’m not a competition lawyer, so with the caveat, but what are we trying to accomplish here and how would this actually operate? Is this what people want? But we can leave that for another day in European discussions.
Mallory Knodel: Yeah, I do think that’s relevant, like what do people want? Because I mean, one answer to interoperability all the time on the open web is RSS. I mean, that’s a pretty basic way to just like get content out there in a way that can be consumed in any way. But I think what ActivityPub does and what other protocols are aiming to do is make it that you can push content out there, you can consume content, but you can also interact with it, comment on it, share it, re-blog it. So there’s a lot of different and more all the time, right? Wonderful thing about sort of interoperable permissionless ecosystems is it really creates new ways of interacting with content that didn’t even previously exist before. I mean, we kind of are stuck a little bit in an old school social media thing where it’s like you can click the heart, you can click the little swirly arrow thing, and you can like click the back arrow to make a comment. I mean, there’s a really, really basic ways. There are probably more, right? So Dilara, I want to round out the panel by asking you about, you know, how governance and responsibilities can be distributed so that we can steward all of this together. We’ve talked about this like very diverse and very complicated ecosystem where you can do all manner of things, but then how do we manage this together as it hopefully continues to proliferate, right? Across both existing platforms out there, right? If you run a social media platform and you’re interested or curious about interoperable platforms or interoperable protocols, we want to talk to you, but then also about brand new folks who come into this space and do things. So tell us your thoughts on that.
Delara Derakhshani: Absolutely. You know, earlier I set out a number of challenges and then I didn’t actually bother to explain any of the solutions that we were working on at DTI to address these problems. So I’m going to do that. But, you know, I mean, the first thing I’ll say is that it’s pretty clear to all of us that these issues are bigger than just a single company or entity. So, you know, federated ecosystems are by design distributed, but that distribution inherently, obviously, comes with fragmentation. We’ve talked about this already. But what DTI is doing is we’re focused on creating a sort of shared governance infrastructure, not to centralize control, but to really provide coordination mechanisms that align responsibilities across diverse players. We’ve done this in two ways. The first is we’ve developed a trust model informed by real world use cases and with the input of a great deal of multistakeholder actors on how to establish trust during data transfers. And so, you know, this issue goes beyond just the Fediverse. We see the parallels in the broader online environment. But as I think someone referenced, you know, the Digital Markets Act, which mandates user-initiated data transfers for those seven designated gatekeepers, which we can all rattle off quickly off the top of our heads, is silent about trust-building mechanisms, and it leaves it up to each gatekeeper. And so to sort of reduce duplication and streamline onboarding across platforms, we’ve launched a trust registry, now in pilot. And it allows developers to complete a verification process that, you know, doesn’t have to be duplicated and leads to harmonization and efficiency. There’s also growing recognition that trust infrastructure must scale down, not just up, and we have to recognize that smaller players don’t always have the same resources. And then I think the only other thing I’d like to point out is that we think it would be a good thing to improve portability, and one way we’re doing that is through an initiative that we’re calling Lola, which will help users migrate safely. We’re creating shared guardrails, and our work is about turning interoperability into a system that users can trust, because otherwise they won’t use it, and where community control and privacy travel together. And I think that’s a good place to stop, at the beginning of a conversation.
Mallory Knodel: Exactly. I like the idea that community control and privacy travel together. I think you’ll hear, and I already have so far, like many of you, a lot of discussion still about content moderation, social media issues, and so on. And I can’t help but always think that right now we have to sort of advocate and lobby, as human rights activists have done for many years, just a few companies, right? If you just get a few of them to get on board with whatever current issue there is, then you can solve the problem. And in the world that we’re talking about and envisioning and hoping comes to pass, you will have thousands. And so how can you possibly imagine governing or dealing with that complex ecosystem? I think it’s exactly what you’re saying, Delara. It’s like we now can create efficiencies, actually, and cross-platform institutions that have the trust that you need, that then can hook in and interoperate across the whole ecosystem. So it isn’t about one platform making a decision. It’s about how have we all made the decision and how can it easily replicate across.
Melinda Claybaugh: Exactly.
Ian Brown: Just for 30 seconds, could our technical support put my second image up on screen? I’m not going to talk through it, I’m just going to point it out if you’re interested. You can see I’m not going to talk through it, but this image, and let me just flick to the third, all I want to do is make you aware they exist in the report, the independent report that Meta commissioned, which is on my website, ianbrown.tech. If you want to look at the detail, interoperability, as XKCD and many other people said, you need to get the details right, number one. Number two, it’s great that private firms like Meta and associations like DTI are doing all this research already to figure out, as Mallory said, in a shared space, and I’m very much looking forward to all your contributions the rest of this workshop, figuring out what can we learn together. I promise you, because I’ve done a lot of work for data protection authorities in Europe and the European Commission as well, regulators don’t want to regulate. They don’t have the resources to regulate at this level of complexity and get it right very often. That’s what regulation is for, to give incentives to private firms to do a good job themselves, and as Chris puts it, co-regulation is great, where you can have the private firms and civil society do the detailed work, and the regulators are only there as backstops to make sure that the public interest as well as the private interest is being genuinely represented. The reason these two slides are too complicated, you don’t have to read them now. If you’re interested, download the report, the end of the report. I’m just going to flick backwards and forwards, because they took me so long to do. I want you to enjoy the complexity of these two slides, because this is the level of complexity you have to go into when regulators do have to step in, and regulators don’t want to do this very often.
Mallory Knodel: It’s very instructive, actually. It’s impossible to read, which is maybe what you’re illustrating. Right, so I, as a moderator, am going to pat myself on the back for giving us 20… We have 51 whole minutes for the Q&A. We are not cutting this short because I really, really do want to hear from the audience. I want your questions. We would love to be able to respond. So you can make your way to the mics if you’re sitting out there, and I’ll just try to figure out a queue. If you’re up here already, your mic is actually on. Maybe I should have told you that. Your mic is on, and you can just, like, raise your hand and lean in and get yourself on the list. Yeah, please. If you could introduce yourself, that would be helpful.
Audience: Edmund Chung from .Asia. Very interesting panel. Thank you for bringing the topic in. I have actually two or three questions, one on the more social political side, and one on the more technical side. The first one, well, first of all, I just find the fireside chat with the celebrity star Joseph very interesting, because he emphasized that your digital you belongs to you, and this is exactly what we want to talk about. So my first question is, especially to Ian, probably, besides privacy as one aspect, what about the ability to choose, you know, a choice for curation of information, and how that relates to our democratic institutions and processes and so on, because that is a big topic in my mind. In your research, does that, you know, does it help? This is my first question. The second question, related to the deletion part, that’s really interesting. I was just thinking about it. Even email doesn’t resolve that perfectly yet. But something like time-based encryption might be useful. But the issue here seems to be the forum or the standards body that the different players and providers are willing to abide by. So, it’s a question of whether or not we have a forum that we can abide by. ActivityPub is, I think, with W3C. Is that going forward? Is W3C the right place? Is IETF? Is it a forum, does it need a forum, to allow interoperability to really happen?
Mallory Knodel: I mean, maybe I can answer in reverse.
Ian Brown: I was going to say you’re more of an expert on question 1.
Mallory Knodel: Yeah, I think it’s a good question.
Ian Brown: I think it’s a good question.
Mallory Knodel: I think we’re still continuing to standardize extensions, which is exactly what you need to build on this as it grows. So, that still happens, and I think you’re right to point it out as an important piece. As far as like how you ensure different platforms are implementing it correctly and doing things like delete, you know, if requested, that’s something that standards bodies have to do. And I think that’s something that’s important to address.
Ian Brown: And on that issue, I would say the IETF, the Internet Engineering Task Force, which I’m much more familiar with than W3C, certainly organizes, I forget what they call them, bake-offs maybe, interoperability testing, making sure that software that is following IETF standards actually interoperates in practice. Say someone took the right approach to sharing their content with other platforms, but then one of those platforms went rogue, you know, started allowing people on that platform to harass them. Well, Meta can block the harassment, you know, crossing over onto Threads, but it can’t enforce against a bad faith actor. Technically, the data still has crossed, you know, it’s on the other side. What could happen then, if necessary, if this was happening on a large scale in particular, Meta and all the individuals affected could actually take legal action in the 160 countries, including the EU member states, that have GDPR-like privacy laws, to say, hey, bad faith actor, you’re clearly ignoring my clearly stated limits on processing of my personal data. It’s often going to be sensitive personal data in GDPR terms, about your political opinions, your health, a range of other very sensitive topics. And then if your data protection regulators are doing their job, or if your courts are doing their job, they can legally go after the bad faith actors.
Delara Derakhshani: I’d love to quickly respond to something as well, if that’s okay. Your digital you belongs to you. I think that’s an incredibly powerful statement, especially as increasingly our lives are online or increasingly personalization of, for example, AI systems is driving our lives. But I do want to just note that this is the very heart of why portability matters and why you should have control of your data. And with regard to your deletion point, for whatever it’s worth, those conversations are constantly happening, both at the state and increasingly at the federal level. And so I’m looking forward to seeing where those conversations go as well.
Audience: Just quickly, you need to attribute to Joseph Gordon-Levitt, not me.
Delara Derakhshani: I’m so upset that I missed him. I didn’t know he was here.
Ian Brown: He was tremendous.
Mallory Knodel: Yeah, very focused on personal data. It's really right up your street, Delara. But it's recorded, no doubt. You can catch it. We have a question here. Anyone else at the table? Okay, we'll come to you next. Go ahead.
Audience: Dominique Zelmercier, W3C. So, just to quickly react on the interop question: the same way the IETF runs interop tests, W3C also has a very thorough interoperability testing process as part of our recommendation process, which could indeed include testing whether an implementation implements deletion; whether the service does is a separate question. I guess a specific question to the panel. My experience from previous privacy interoperability work in W3C is that it's really, really hard to define privacy to a level of granularity that matches the complexity of how people want their digital selves to be presented. And so what we've seen overall is that what gets through the standardization process is very simple signals, typically do or do not. In a world of social media, that feels way too coarse to actually express something users care about. So I don't know how you see disentangling this problem of managing the complexity of the way people understand their social self and the need to bring interoperability in exchanging this social media content, in particular in the context of portability or other types of interoperability.
Mallory Knodel: Yeah, good question. Would you like to respond to it? Okay, after. Yeah, okay. Anyone else on the panel want to respond?
Ian Brown: I’m happy to, yeah.
Mallory Knodel: Go ahead.
Ian Brown: That’s a really great point, an important point. And I think one response to that is to emphasize what Mallory and our other two speakers already said about the importance of shared governance and building these norms together collectively, because that’s going to be much more effective than and Chris Riley. I’m going to start with you, Chris. I think the biggest challenge is that we’re seeing big platforms building 10 different versions. With the best will in the world and with very large resources as big platforms have, if they can work together, that’s going to be much better for their users than if they define them slightly differently because that’s going to confuse users. It’s going to lead to things happening that users don’t want because they might have set a privacy control based on their understanding of one thing, but they don’t want to do it because they don’t want to be a part of it.
Mallory Knodel: I think that’s a big challenge. I keep thinking of this instance layer. In the old social media, you have platform, you have users. But now we have users, instances, platforms, and multiple different iterations thereof. So this idea that the instance level can also be a place where default settings are set, where actions can be taken. I think that’s a big challenge. I think there’s a lot of work to do to make sure that your instance could choose to de-fed rate with your instance if you’re not acting in good faith, if you’re not respecting what my users want, if my users actually just want me to do that. There’s a good report that Samantha Lye and Yale Roth at the Carnegie Endowment for International Peace put out about de-fed ration and the politics of that. What does that really look like? I think that’s a big challenge.
Melinda Claybaugh: I think there’s a lot of work to do to make sure that we can customize some of the experiences that we think people will want or make as similar as possible across the federated universe. At the same time, we want to preserve the uniqueness of each of the communities. I think you made a point about algorithms earlier. Maybe you post something on threads and it goes elsewhere and it’s going to be surfaced differently or maybe not like what we were talking earlier. We take over once each community says no than it will harm you. That’s the beauty of it, in a way. You know that you have different types of community armies that can do that really. It’s harming benefits and that kind of thing. I think most of us would think that that’s not so. on different services. So it’s really this fine balance of what are the core things that we want to make sure are protected and actioned across the Fediverse, but what can we leave to to be variable and unique?
Mallory Knodel: Yeah, that’s great. Question down here. Go ahead and introduce yourself and please ask.
Audience: Hi, my name is Caspian. So what are your visions for a privacy-preserving, interoperable internet in 10 years? And what steps should we take to ensure that it exists? Maybe the ideas in 10 years.
Mallory Knodel: I love the idea. We can think backwards to what 10 years ago was like. Very proto of what we have now. And yeah, what’s next in 10 years? I think it’s a good question. Thank you for asking it. Anybody have a response?
Ian Brown: I do, if the other two don't want to go first. Okay. I was reminded, walking in past the wonderful display at the entrance by the organizers, of how Norway was a real pioneer in bringing the internet to Europe. And I know that in particular because I was looking for a photo of my PhD supervisor, who was part of that process. Actually, he was in London, and he worked with the Norwegians who are in those photos at the entrance to bring the internet over from the United States. I think there were only four sites in the original ARPANET, and it was very soon after that that Norway and my supervisor Peter Kirstein brought it over to Europe. So that's going back to the 70s, 50 years ago. I'm not going to even try to remember 10 years ago, because it all seems a blur, but thinking 10 years forward is a great horizon. I would say I'm very optimistic about what's happening really quickly, so way before 10 years, actually. I think what's happening with several of the projects we've already talked about today, like Free Our Feeds, and I see Robin at the front there, is another great example of stuff that's bringing this into reality much faster than I expected. There are also software tools I'd recommend, just because I use them personally; I have no financial interest, to be clear. One is called OpenVibe, a social media app that lets you look at your Mastodon, Bluesky, Threads and Nostr accounts all in one timeline. You can customize the timeline, and you can adopt different recommendation algorithms, as Bluesky also lets you do. So that point Melinda raised is a really important one. We don't want to standardize everything. That's often a criticism of interoperability: you're standardizing and therefore homogenizing everything. And absolutely we want to avoid that. We want to leave space for different services to compete and develop and do a better job for their users, and that includes the types of communities they focus on, but we want to give users choice. So if a user on a service, whether it's Threads, Bluesky, Mastodon, any of these, wants to say, look, I'm staying where I am, that's fine: a lot of people on Mastodon, for example, are very, very privacy protective for very good reason, and nothing requires them to open up even beyond their own instance unless they choose to. And as Mallory said, defederation is the sort of nuclear option if that's not working properly. So these things are happening really quickly. I've done these kinds of futures exercises before for the European Commission, and things are moving so fast that I'm not sure what they will look like in 10 years. As a super nerd, I would say I hope social media looks like the internet at the IP layer, that it's that configurable and flexible, you know, that it provides a layer in the middle. And then all of the stuff we've talked about, the famous hourglass model, going up to the extra two layers at the top that the IETF t-shirt famously says are financial and political, all of that being coordinated by some of these organizations and other people in the room, including legal interoperability; Luca Belli, my colleague from Fundação Getulio Vargas, is in the front and has done a lot of work on that. You need all this stuff working together. I didn't intend this to be a plea for the IGF to continue, but I know it's been discussed that it potentially may not, and I think this is an example of how the IGF can add a lot of value to these debates.
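To make the OpenVibe-style aggregation concrete, here is a hedged sketch of a client merging several timelines and letting the user swap in a different ranking function. The fetch_mastodon and fetch_bluesky helpers are placeholders with canned data, not real protocol clients:

```python
# Sketch of client-side timeline aggregation across networks. The fetchers
# return canned example posts; a real client would call the Mastodon REST API,
# the AT Protocol, and so on.
from datetime import datetime, timezone

def fetch_mastodon():
    return [{"network": "mastodon", "text": "hello fedi",
             "created_at": datetime(2025, 6, 24, 13, 0, tzinfo=timezone.utc)}]

def fetch_bluesky():
    return [{"network": "bluesky", "text": "hello atproto",
             "created_at": datetime(2025, 6, 24, 13, 5, tzinfo=timezone.utc)}]

def merged_timeline(sources, rank=lambda post: post["created_at"]):
    """Flatten all sources, then order them by a user-chosen ranking function."""
    posts = [post for fetch in sources for post in fetch()]
    return sorted(posts, key=rank, reverse=True)

for post in merged_timeline([fetch_mastodon, fetch_bluesky]):
    print(post["network"], "-", post["text"])
```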
Mallory Knodel: Well, you helped me complete my bingo cards. No, that’s great. And Chris, you had also a question I want to get in. We still have 10 minutes left. So for folks, I see a couple more questions that we should be able to take those as well. Go ahead.
Chris Riley: Ah, the danger of putting the prof in front of a microphone he didn't know was already on. So thank you for that, and fantastic panel, thank you for organizing it as well. I think that often when you look at this question of trying to predict forward 10 years, you also have to look back. Ian and I worked on the Towards a Future Internet project 16 or so years ago. I remember teaching about interoperability to Melinda's colleague, Marcus Reinisch, back at Warwick in the last century, the last millennium. And one of the really important lessons we learned back then, to mention somebody who probably hasn't been mentioned for a few IGFs, is Lawrence Lessig's point about machine-readable code, lawyer-readable code, and human-readable code. I had a great conversation last night; one of the reasons why the IGF should continue is that you can have great conversations with government officials and engineers and lawyers and policymakers about these things. And one of the fascinating conversations was with engineers who had started studying law, who said: wow, the legal elements of this are really quite straightforward by comparison with what we do; the DMA is very straightforward compared to what we do when we're working on standards committees, so we don't find this difficult at all. But the difficulty, of course, is that users simply go for defaults. So one thing I would hope would happen in 10 years' time, going back to Ian's straw poll of the audience, is that people would find some form of Fediverse that they are comfortable using, in a way where the interoperability is invisible to the user. In Regulating Code, Ian and I cited the wonderful Paul Ohm article, The Myth of the Superuser, about the danger of giving people not just too many choices but too many defaults, which they find difficult to actually turn into something privacy-preserving. So I guess my question is for everybody. The last year has been a great experiment in people quiet quitting Mastodon: having noisily quit X, they've quiet quit Mastodon and moved to Bluesky and a couple of other things, and thank you Ian for turning me on to OpenVibe and showing me how I could actually quiet quit in a much more usable way. How do people see the user out there, who doesn't have, you know, machine-readable or lawyer-readable abilities to understand these things? How do we make it easier for them to avoid the same problems which occurred? We're in the middle of a natural experiment, right, where X has become something that almost everybody wants to avoid, except seemingly government official pronouncements, which is a very weird thing. So, making it more usable for the user, so that in 10 years' time users aren't stuck in this position of having to do more research than they ever want to do. And if I could just mention one thing: I'm now based in Melbourne, Australia, and the Australian Competition and Consumer Commission has just issued their final digital platforms report, their 10th over six years, so that has some ideas in it as well, I hope.
Mallory Knodel: Yeah, good question. I mean, for me, I would answer kind of both of these questions with: I don't want to have to sign up for another service ever again, thank you; I just want to be able to follow all the new people that that service has brought to the internet. But in the interest of time, if I could just maybe suggest that we take, I think, only three of these questions, because we have five minutes if they're quick, and then we'll see if Delara, Melinda and Ian have closing remarks, if that works.
Ian Brown: Can you do four, because there are only four people at the moment.
Mallory Knodel: Okay, let’s do, if you can all promise to make it really succinct, like we think we can manage four questions, three responses in five minutes, that math totally works out.
Mallory Knodel: Yeah, go ahead.
Audience: Hello, I'm Winston Xu, and I'm from One Power Foundation in Hong Kong. I want to ask a question: many users of the internet now are children. So is there anything that youth can do for the interoperability and fairness of the internet?
Mallory Knodel: Awesome question. Thank you, Winston. Go ahead.
Audience: I wanted to ask a bit of an open-ended question about the cultures of each platform, and how taking things into a different platform may change the context and remove maybe some of the culture from the first platform. I'm thinking of maybe link sharing and screenshot sharing, and how that can very easily spiral out of control. So how can that be done ethically in an interoperable system?
Mallory Knodel: That’s great. I like that one, too. Thank you for bringing it up. And back over to this side.
Audience: Hello, my name is Gabriel. So, the Fediverse being decentralized, you have to deal with lots of different instances, probably hundreds. How do you ensure trust between them? I read a while ago that it was found that, for example, direct messages through Mastodon could be read if someone was misusing the protocol. So how do you ensure that other instances you interface with can be trusted?
Mallory Knodel: Thank you for bringing up direct messages. That's a good one, especially for a privacy panel. Awesome. And then our last question, please.
Audience: Good afternoon. Thank you. Perhaps as a question, a reflection: now that we talk about interoperability, what can the private sector share? Resilience is about sharing learnings of what went wrong or what went right. What will you share with the public sector? For instance, in public services, enhancing services for portable health, or for the municipalities, for the social welfare of children or families. What can be shared, both the good and the bad? I'm not focusing only on standards, because some are waiting on standards only.
Mallory Knodel: Thanks so much. Let's go in this order. Ian, if you want to start, just respond to any of that that you can, and then we'll work our way down.
Ian Brown: Thank you. Incredibly briefly, and let's keep the conversation going online afterwards. Let me pick out two: the great one from my right, your left, and the last question. The mention of uncontrolled sharing of screenshots is a great way to think about how ActivityPub and AT Proto, Bluesky's protocol, and other protocols like them can help do controlled sharing: opt-in as the starting point, protocol features that make sure users' wishes, as far as good faith actors interpret them technically, are followed, and legal backstops for the bad faith actors that choose not to. Uncontrolled sharing via screenshots is something that's never going to go away, because you can't stop people screenshotting. That was the dream of the digital rights management people in the 90s and the early 2000s, and from a technical perspective, I had a start-up doing it; I promise you, it doesn't work. That's a bad idea. Mallory's way is the way; Mallory and friends' way is the way. And on the very last question, a very simple point, which actually picks up what Chris said as well: one thing governments can do is, for goodness' sake, share all your information on multiple platforms, not only on X. Don't have politicians rail on the one hand about X and then force all your citizens to join X if they want to get information from the government.
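A minimal sketch of the opt-in rule Ian argues for in bridges, assuming a hypothetical registry of accounts that have chosen to be bridged. This is illustrative only, not how Bridgy Fed or any particular bridge is actually implemented:

```python
# Illustrative opt-in check for a cross-network bridge: relay a post only if
# its author has explicitly opted in to being bridged.
opted_in_accounts = {"alice@example.social"}   # hypothetical opt-in registry

def should_bridge(post: dict) -> bool:
    """Good-faith rule: never relay content whose author has not opted in."""
    return post["author"] in opted_in_accounts

posts = [
    {"author": "alice@example.social", "text": "bridge me"},
    {"author": "bob@example.social", "text": "keep me local"},
]
for post in posts:
    action = "relay across the bridge" if should_bridge(post) else "do not relay"
    print(f"{post['author']}: {action}")
```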
Mallory Knodel: Hey, maybe governments should have instances where they’re the source of truth that’s verified.
Melinda Claybaugh: I think what we see is that we're at the very beginning of this journey, and so I would just encourage you: Meta is learning as we go. Just last week we announced a feed in Threads that pulls in, you know, posts from whatever else you're connected to. That was in response to users who wanted it. So keep pushing for what you want in this Fediverse, and that will help drive the conversation.
Delara Derakhshani: Yeah, and I'll finish by saying that a lot of DTI's work is focused on this, and I would really encourage you all to reach out to continue these conversations. There's the issue of trust during transfers: if a user wants to move their account or content from one service to another, how do we know that the destination service will respect expectations, or that the transfer was actually authorized? How do we confirm identity and integrity, and make sure that the content was not lost or maliciously changed? Crucially, you know, these are things that we're working on, and I really hope that you all reach out for further conversations.
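One generic way to picture the integrity piece Delara mentions is a digest check on the transferred archive, so the destination can tell whether content was lost or altered in transit. This sketch is purely illustrative and is not DTI's or any platform's actual transfer mechanism:

```python
# Illustrative integrity check for a portability transfer: the destination
# recomputes a digest of the received archive and compares it with the digest
# advertised by the source.
import hashlib

def sha256_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

archive = b'{"posts": [], "followers": []}'      # hypothetical exported user data
published_digest = sha256_digest(archive)        # advertised by the source service

def verify_transfer(received: bytes, expected_digest: str) -> bool:
    """True only if the received bytes match what the source said it sent."""
    return sha256_digest(received) == expected_digest

print(verify_transfer(archive, published_digest))                 # True: intact
print(verify_transfer(archive + b"tampered", published_digest))   # False: reject
```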
Mallory Knodel: Thanks so much for coming. Thank you for your questions. Thank you to the panelists who spent their time thinking about these issues in depth at the Social Web Foundation and with all of you, right? We are actually really trying to have these conversations together. Like Melinda said, this is only the beginning, so hopefully we can stay engaged. Hopefully we can share learnings and make this space more robust and better. Great.
Delara Derakhshani
Speech speed
153 words per minute
Speech length
1314 words
Speech time
513 seconds
User agency requires knowing what’s being shared, when, with whom, and why – interoperability shouldn’t mean default exposure
Explanation
Derakhshani argues that successful interoperable systems must prioritize user agency by ensuring users have full knowledge and control over their data sharing. She emphasizes that interoperability should not automatically expose user data without explicit consent and understanding.
Evidence
She mentions technical design issues like real-time post visibility across services, activity portability, and social graph portability as examples of areas requiring user awareness and control.
Major discussion point
Privacy and User Control in Interoperable Systems
Topics
Human rights | Legal and regulatory
Agreed with
– Melinda Claybaugh
– Ian Brown
Agreed on
User agency and control over data sharing
Privacy-respecting interoperability demands ongoing collaboration with standards bodies and shared governance
Explanation
Derakhshani contends that achieving privacy-respecting interoperability requires continuous engagement with standards organizations and a commitment to collaborative governance structures. She views this as a cultural and procedural necessity rather than just a technical challenge.
Evidence
She references working in the open, engaging with standard bodies, and maintaining secure dialogue that reaches public IoT implementations as examples of necessary collaborative practices.
Major discussion point
Privacy and User Control in Interoperable Systems
Topics
Infrastructure | Legal and regulatory
Agreed with
– Melinda Claybaugh
– Chris Riley
Agreed on
Need for user education and clear communication
Disagreed with
– Melinda Claybaugh
Disagreed on
Implementation approach for privacy rights in federated systems
Federated ecosystems require coordination mechanisms that align responsibilities across diverse players without centralizing control
Explanation
Derakhshani argues that distributed systems need governance infrastructure that provides coordination while maintaining decentralization. She emphasizes that the goal is not to centralize control but to create mechanisms for alignment across different actors in the ecosystem.
Evidence
She cites DTI’s development of a trust model informed by real-world use cases and input from multistakeholder actors, and mentions the Digital Markets Act’s silence on trust-building mechanisms as an example of the need for such coordination.
Major discussion point
Governance and Trust Infrastructure
Topics
Legal and regulatory | Infrastructure
Agreed with
– Melinda Claybaugh
– Ian Brown
Agreed on
Technical challenges require collaborative solutions
Trust registries and verification processes can reduce duplication and streamline onboarding across platforms
Explanation
Derakhshani proposes that shared trust infrastructure can create efficiencies by allowing developers to complete verification processes once rather than repeatedly for each platform. She argues this approach leads to harmonization and efficiency while scaling to accommodate smaller platforms with limited resources.
Evidence
She mentions DTI’s launch of a trust registry pilot and their initiative called Lola designed to help users migrate safely as concrete examples of trust infrastructure implementation.
Major discussion point
Governance and Trust Infrastructure
Topics
Infrastructure | Legal and regulatory
Migration between platforms involves not just data transfer but joining new communities with different safety considerations
Explanation
Derakhshani argues that user migration in federated systems is more complex than simple data portability, as it involves users entering new community contexts with different norms and safety standards. She emphasizes that for marginalized groups especially, the safety of the destination platform is as important as the ability to transfer content.
Evidence
She references repeated feedback from DTI’s work that marginalized groups particularly value destination safety, and mentions that trust-building tools must address not just privacy but also moderation, consent, and survivability.
Major discussion point
User Experience and Education
Topics
Human rights | Sociocultural
Melinda Claybaugh
Speech speed
179 words per minute
Speech length
1052 words
Speech time
350 seconds
Users should have rights to access, correct, transfer, and delete their data, but implementation in decentralized systems is challenging
Explanation
Claybaugh acknowledges that fundamental data subject rights enshrined in GDPR and similar laws are important and expected by users, but argues that implementing these rights in federated systems presents unique challenges. She points out that while deletion works well within single platforms, ensuring deletion across federated networks is technically difficult.
Evidence
She provides the example that when a user deletes a post on Threads, Meta sends the deletion request along the chain to other platforms like Mastodon, but cannot guarantee the post is actually deleted downstream.
Major discussion point
Regulatory Framework and Compliance
Topics
Human rights | Legal and regulatory
Agreed with
– Delara Derakhshani
– Ian Brown
Agreed on
User agency and control over data sharing
Disagreed with
– Ian Brown
Disagreed on
Scope and complexity of interoperability requirements
Users need clear understanding of what it means to post content that may be shared across multiple services
Explanation
Claybaugh argues that user education is critical as federated social media proliferates, particularly for average users who may not understand the technical implications of posting content that can flow across multiple platforms. She emphasizes the need to meet users where they are and tailor explanations to different user types.
Evidence
She describes Threads’ onboarding process that uses pictures to explain what the Fediverse is and what happens when users delete content, distinguishing between power users, tech users, and average users.
Major discussion point
User Experience and Education
Topics
Human rights | Sociocultural
Agreed with
– Delara Derakhshani
– Chris Riley
Agreed on
Need for user education and clear communication
Different user types (power users vs. average users) require different levels of explanation and control options
Explanation
Claybaugh contends that federated platforms must recognize and accommodate different user sophistication levels, from technical power users to casual users just exploring new online spaces. She argues that user experience design must account for these varying needs and expectations.
Evidence
She references the need to distinguish between power users, tech users, and average users who may be ‘poking around and trying out new things online’ when designing user education and interface elements.
Major discussion point
User Experience and Education
Topics
Sociocultural | Human rights
Existing data protection regimes need fresh implementation approaches for decentralized systems
Explanation
Claybaugh argues that while fundamental privacy rights should remain intact, the implementation of these rights needs to be reconsidered for federated environments. She calls for collaboration between industry, regulators, and users to develop new norms and expectations for how privacy rights work in decentralized contexts.
Evidence
She points to the challenge of ensuring deletion requests propagate through federated networks and the need for industry norms around implementation as examples of where fresh approaches are needed.
Major discussion point
Regulatory Framework and Compliance
Topics
Legal and regulatory | Human rights
Agreed with
– Delara Derakhshani
– Ian Brown
Agreed on
Technical challenges require collaborative solutions
Disagreed with
– Delara Derakhshani
Disagreed on
Implementation approach for privacy rights in federated systems
Ian Brown
Speech speed
189 words per minute
Speech length
3618 words
Speech time
1147 seconds
Technical design must enable user mistakes and experimentation while protecting privacy, such as auto-deletion features
Explanation
Brown argues that privacy protection should enable users to try new things and make mistakes without permanent consequences. He emphasizes that auto-deletion features and similar privacy tools are essential for creating conversational spaces rather than permanent records, particularly important for learning and development.
Evidence
He provides his personal example of setting Mastodon posts to auto-delete after two months, explaining that he wants social media to be conversational rather than a ‘medium of record’ and references concerns about permanent records affecting employment, education, and government interactions.
Major discussion point
Privacy and User Control in Interoperable Systems
Topics
Human rights | Sociocultural
Disagreed with
– Audience
Disagreed on
Balance between user control and system complexity
Bridges between platforms require opt-in approaches to respect user consent and prevent unwanted exposure
Explanation
Brown advocates for opt-in mechanisms when connecting different federated platforms, arguing that users should explicitly choose to allow cross-platform interaction rather than having it enabled by default. He emphasizes this is particularly important to prevent harassment and unwanted exposure to hostile communities.
Evidence
He describes Bridgyfed as an example of proper opt-in bridge implementation and explains why many users on BlueSky and Mastodon would not want people on X to interact with them due to harassment concerns.
Major discussion point
Technical Challenges of Federated Systems
Topics
Human rights | Infrastructure
Agreed with
– Delara Derakhshani
– Melinda Claybaugh
Agreed on
User agency and control over data sharing
Different platforms have varying capabilities for features like auto-deletion, creating inconsistent user experiences
Explanation
Brown identifies a technical challenge where user preferences set on one platform may not be recognized or implemented by other platforms in a federated network. He argues this creates inconsistencies that could undermine user privacy expectations.
Evidence
He provides the specific example that while he sets Mastodon posts to auto-delete after two months, BlueSky doesn’t currently have auto-deletion features, and Bridgyfed doesn’t propagate his deletion preferences across platforms.
Major discussion point
Technical Challenges of Federated Systems
Topics
Infrastructure | Human rights
Agreed with
– Delara Derakhshani
– Melinda Claybaugh
Agreed on
Technical challenges require collaborative solutions
The Digital Markets Act includes messaging interoperability requirements for gatekeepers, with potential extension to social media services under review
Explanation
Brown explains that the DMA currently mandates interoperability for messaging services from the largest tech companies but not for social media, though this could change. He notes that civil society nearly succeeded in including social media interoperability requirements and that the law requires review every three years.
Evidence
He mentions that the DMA applies to seven gatekeeper companies and specifically references Article 7’s messaging interoperability obligations, noting that the European Commission must review extending these to social media services next year.
Major discussion point
Regulatory Framework and Compliance
Topics
Legal and regulatory | Economic
Disagreed with
– Melinda Claybaugh
Disagreed on
Scope and complexity of interoperability requirements
GDPR and similar privacy laws provide legal backstops against bad faith actors who ignore user data preferences
Explanation
Brown argues that existing privacy regulations in 160 countries provide legal mechanisms to address situations where federated platforms or bad actors ignore user privacy preferences. He contends that legal action can be taken when technical solutions fail to protect user rights.
Evidence
He references GDPR Article 22 requirements for organizations to make best efforts to inform other organizations when users withdraw consent, and mentions that legal action can be taken in courts and with data protection regulators against bad faith actors.
Major discussion point
Regulatory Framework and Compliance
Topics
Legal and regulatory | Human rights
Traditional competition law is evolving to require dominant firms to enable interoperability
Explanation
Brown argues that competition law is rapidly moving toward requiring interoperability from dominant companies, beyond specific regulatory frameworks like the DMA. He suggests this trend will have global implications as courts interpret dominance as requiring interoperability enablement.
Evidence
He cites the European Court of Justice decision in Android Auto, which he says ‘basically says dominant firms must enable interoperability for their services full stop’ or face abuse of dominance charges.
Major discussion point
Regulatory Framework and Compliance
Topics
Legal and regulatory | Economic
The vision includes users never having to sign up for new services while still being able to follow people across platforms
Explanation
Brown envisions a future where interoperability eliminates the need for users to create new accounts on different platforms while still enabling them to connect with people and content across the federated ecosystem. He sees this as the ultimate goal of interoperability efforts.
Major discussion point
Future Vision and Implementation
Topics
Infrastructure | Sociocultural
Social media should become as configurable and flexible as the internet at the IP layer
Explanation
Brown advocates for social media systems to achieve the same level of configurability and flexibility as the foundational internet protocols. He envisions social media as providing a middle layer that enables diverse applications and services while maintaining interoperability.
Evidence
He references the ‘famous hourglass model’ and mentions the IETF t-shirt that says ‘financial and political’ to illustrate how different layers of the internet stack serve different functions.
Major discussion point
Future Vision and Implementation
Topics
Infrastructure | Economic
Governments should share information across multiple platforms rather than forcing citizens to use single platforms
Explanation
Brown argues that governments should not require citizens to join specific social media platforms to access government information. He advocates for multi-platform government communication strategies that respect citizen choice and platform diversity.
Evidence
He criticizes the practice of politicians who ‘rail on the one hand about X and then force all your citizens to join X if they want to get information from the government.’
Major discussion point
Future Vision and Implementation
Topics
Legal and regulatory | Human rights
The focus should be on building tools that allow controlled sharing while preventing uncontrolled screenshot-based sharing
Explanation
Brown argues that technical protocols should focus on enabling controlled sharing mechanisms rather than trying to prevent uncontrolled sharing like screenshots, which he considers technically impossible. He advocates for protocol features that ensure good faith actors follow user preferences and legal mechanisms for bad faith actors.
Evidence
He mentions having a startup that attempted to prevent screenshotting and states ‘I promise you, it doesn’t work’ while advocating for opt-in protocol features and legal backstops instead.
Major discussion point
Future Vision and Implementation
Topics
Infrastructure | Human rights
Mallory Knodel
Speech speed
190 words per minute
Speech length
2960 words
Speech time
934 seconds
When users delete posts on one platform, deletion requests may not propagate effectively across all federated instances
Explanation
Knodel highlights a fundamental technical challenge in federated systems where user actions like deletion on one platform may not be properly executed across all connected platforms. This creates potential privacy and user control issues in decentralized environments.
Evidence
She specifically mentions ActivityPub as an example where deleting a post doesn’t necessarily mean it’s deleted everywhere in the federation.
Major discussion point
Technical Challenges of Federated Systems
Topics
Infrastructure | Human rights
Instance-level governance provides a middle layer between platforms and users for setting defaults and managing relationships
Explanation
Knodel argues that the instance layer in federated systems creates new governance opportunities that didn’t exist in traditional social media. She sees instances as entities that can set default privacy settings, make federation decisions, and act on behalf of their users’ interests.
Evidence
She describes how instances can choose to defederate from other instances based on user preferences or bad faith behavior, referencing a report by Samantha Lai and Yoel Roth at the Carnegie Endowment about defederation politics.
Major discussion point
Governance and Trust Infrastructure
Topics
Legal and regulatory | Sociocultural
Defederation serves as a mechanism for instances to protect their users from bad actors
Explanation
Knodel presents defederation as a protective mechanism that allows instance administrators to disconnect from other instances that don’t respect user preferences or engage in bad faith behavior. She frames this as a form of distributed governance and user protection.
Evidence
She references research by Samantha Lai and Yoel Roth at the Carnegie Endowment for International Peace about defederation and its politics as supporting evidence for this governance mechanism.
Major discussion point
Governance and Trust Infrastructure
Topics
Legal and regulatory | Human rights
Audience
Speech speed
127 words per minute
Speech length
794 words
Speech time
373 seconds
Direct messages and encrypted content present particular security vulnerabilities in federated environments
Explanation
An audience member raised concerns about security vulnerabilities in federated systems, specifically noting that direct messages through platforms like Mastodon could potentially be read by those misusing the protocol. This highlights the challenge of maintaining privacy and security across distributed systems with varying levels of trustworthiness.
Evidence
The audience member mentioned reading about direct messages through Mastodon being readable when someone was misusing the protocol.
Major discussion point
Technical Challenges of Federated Systems
Topics
Cybersecurity | Human rights
The complexity of privacy controls often results in overly simple ‘do or do not’ options that don’t match user needs
Explanation
An audience member from W3C argued that standardization processes tend to produce overly simplified privacy controls that don’t capture the complexity of how people want to present their digital selves. They contend that binary privacy options are insufficient for the nuanced ways people understand their social identity online.
Evidence
The speaker referenced their experience from previous privacy interoperability work in W3C, noting that what gets through standardization processes are typically very simple signals that feel ‘way too coarse’ for social media contexts.
Major discussion point
User Experience and Education
Topics
Infrastructure | Human rights
Disagreed with
– Ian Brown
Disagreed on
Balance between user control and system complexity
W3C and IETF provide forums for developing and testing interoperability standards, including verification of proper implementation
Explanation
An audience member clarified that standards organizations like W3C have thorough interoperability testing processes as part of their recommendation development, similar to IETF’s approach. They emphasized that these organizations can test whether implementations properly handle features like deletion, though whether services actually implement these features is a separate question.
Evidence
The speaker mentioned W3C’s ‘very thorough interoperability testing process as part of our recommendation process’ and referenced IETF’s interoperability testing practices.
Major discussion point
Standards and Interoperability Testing
Topics
Infrastructure | Legal and regulatory
Chris Riley
Speech speed
194 words per minute
Speech length
545 words
Speech time
167 seconds
The goal should be making interoperability invisible to users while preserving choice and community uniqueness
Explanation
Riley argues that successful interoperability should be transparent to end users, allowing them to benefit from cross-platform connectivity without having to understand the technical complexity. He emphasizes that users typically accept defaults and shouldn’t be burdened with too many complex choices while still maintaining the diversity that makes different platforms valuable.
Evidence
He references Paul Ohm’s article ‘the myth of the superuser’ about the danger of giving people too many defaults, and mentions the natural experiment of users moving from X to Mastodon to BlueSky as evidence of user behavior patterns.
Major discussion point
Standards and Interoperability Testing
Topics
Sociocultural | Infrastructure
Agreed with
– Delara Derakhshani
– Melinda Claybaugh
Agreed on
Need for user education and clear communication
Agreements
Agreement points
User agency and control over data sharing
Speakers
– Delara Derakhshani
– Melinda Claybaugh
– Ian Brown
Arguments
User agency requires knowing what’s being shared, when, with whom, and why – interoperability shouldn’t mean default exposure
Users should have rights to access, correct, transfer, and delete their data, but implementation in decentralized systems is challenging
Bridges between platforms require opt-in approaches to respect user consent and prevent unwanted exposure
Summary
All speakers agree that users must have meaningful control over their data and how it’s shared across federated systems, with opt-in mechanisms being preferred over default exposure
Topics
Human rights | Infrastructure
Need for user education and clear communication
Speakers
– Delara Derakhshani
– Melinda Claybaugh
– Chris Riley
Arguments
Privacy-respecting interoperability demands ongoing collaboration with standards bodies and shared governance
Users need clear understanding of what it means to post content that may be shared across multiple services
The goal should be making interoperability invisible to users while preserving choice and community uniqueness
Summary
Speakers consensus that users need better education about federated systems while the complexity should be hidden through good design and clear communication
Topics
Sociocultural | Human rights
Technical challenges require collaborative solutions
Speakers
– Delara Derakhshani
– Melinda Claybaugh
– Ian Brown
Arguments
Federated ecosystems require coordination mechanisms that align responsibilities across diverse players without centralizing control
Existing data protection regimes need fresh implementation approaches for decentralized systems
Different platforms have varying capabilities for features like auto-deletion, creating inconsistent user experiences
Summary
All speakers acknowledge that technical challenges in federated systems require new collaborative approaches and cannot be solved by individual platforms alone
Topics
Infrastructure | Legal and regulatory
Similar viewpoints
Both speakers see the need for institutional mechanisms (trust registries and legal frameworks) to create accountability and efficiency in federated systems
Speakers
– Delara Derakhshani
– Ian Brown
Arguments
Trust registries and verification processes can reduce duplication and streamline onboarding across platforms
GDPR and similar privacy laws provide legal backstops against bad faith actors who ignore user data preferences
Topics
Legal and regulatory | Infrastructure
Both speakers emphasize the importance of designing systems that accommodate different user sophistication levels and allow for experimentation without permanent consequences
Speakers
– Melinda Claybaugh
– Ian Brown
Arguments
Different user types (power users vs. average users) require different levels of explanation and control options
Technical design must enable user mistakes and experimentation while protecting privacy, such as auto-deletion features
Topics
Human rights | Sociocultural
Both speakers recognize that federated systems involve complex community dynamics and governance structures beyond simple technical interoperability
Speakers
– Delara Derakhshani
– Mallory Knodel
Arguments
Migration between platforms involves not just data transfer but joining new communities with different safety considerations
Instance-level governance provides a middle layer between platforms and users for setting defaults and managing relationships
Topics
Sociocultural | Legal and regulatory
Unexpected consensus
Legal frameworks as enablers rather than barriers
Speakers
– Ian Brown
– Melinda Claybaugh
– Delara Derakhshani
Arguments
GDPR and similar privacy laws provide legal backstops against bad faith actors who ignore user data preferences
Existing data protection regimes need fresh implementation approaches for decentralized systems
Trust registries and verification processes can reduce duplication and streamline onboarding across platforms
Explanation
Despite representing different perspectives (academic, industry, and advocacy), all speakers view existing and emerging legal frameworks as supportive of interoperability goals rather than obstacles, which is unexpected given typical industry-regulation tensions
Topics
Legal and regulatory | Human rights
Complexity should be hidden from users
Speakers
– Chris Riley
– Melinda Claybaugh
– Ian Brown
Arguments
The goal should be making interoperability invisible to users while preserving choice and community uniqueness
Different user types (power users vs. average users) require different levels of explanation and control options
The vision includes users never having to sign up for new services while still being able to follow people across platforms
Explanation
There’s unexpected consensus that despite the technical complexity of federated systems, the goal should be to make interoperability transparent to users rather than educating them about technical details, which contrasts with typical tech community emphasis on user understanding
Topics
Sociocultural | Infrastructure
Overall assessment
Summary
Strong consensus exists around user agency, the need for collaborative governance, and making complex systems user-friendly. Speakers agree on fundamental principles while acknowledging implementation challenges.
Consensus level
High level of consensus on principles with constructive disagreement on implementation details. This suggests a mature field where stakeholders share common goals but are working through practical challenges, which bodes well for collaborative solutions in federated social media development.
Differences
Different viewpoints
Implementation approach for privacy rights in federated systems
Speakers
– Melinda Claybaugh
– Delara Derakhshani
Arguments
Existing data protection regimes need fresh implementation approaches for decentralized systems
Privacy-respecting interoperability demands ongoing collaboration with standards bodies and shared governance
Summary
Claybaugh focuses on adapting existing legal frameworks like GDPR for federated environments, while Derakhshani emphasizes building new collaborative governance structures and standards-based approaches from the ground up.
Topics
Legal and regulatory | Human rights
Scope and complexity of interoperability requirements
Speakers
– Ian Brown
– Melinda Claybaugh
Arguments
The Digital Markets Act includes messaging interoperability requirements for gatekeepers, with potential extension to social media services under review
Users should have rights to access, correct, transfer, and delete their data, but implementation in decentralized systems is challenging
Summary
Brown advocates for expanding regulatory interoperability requirements to social media platforms, while Claybaugh emphasizes the technical challenges and complexity of implementing such requirements, particularly questioning what interoperability should encompass beyond basic messaging.
Topics
Legal and regulatory | Infrastructure
Balance between user control and system complexity
Speakers
– Audience
– Ian Brown
Arguments
The complexity of privacy controls often results in overly simple ‘do or do not’ options that don’t match user needs
Technical design must enable user mistakes and experimentation while protecting privacy, such as auto-deletion features
Summary
The audience member argues that current privacy controls are too simplistic for complex social interactions, while Brown advocates for features like auto-deletion that prioritize user experimentation over granular control.
Topics
Human rights | Infrastructure
Unexpected differences
Role of government communication in federated systems
Speakers
– Ian Brown
Arguments
Governments should share information across multiple platforms rather than forcing citizens to use single platforms
Explanation
This was an unexpected policy recommendation that emerged during the discussion, with Brown advocating for multi-platform government communication strategies. No other speakers directly addressed this issue, making it a unique position that wasn’t debated but represents a significant policy implication.
Topics
Legal and regulatory | Human rights
Technical impossibility of preventing screenshot sharing
Speakers
– Ian Brown
Arguments
The focus should be on building tools that allow controlled sharing while preventing uncontrolled screenshot-based sharing
Explanation
Brown’s assertion that preventing screenshot sharing is technically impossible and that efforts should focus on controlled sharing mechanisms was unexpected and not challenged by other speakers, despite its significant implications for privacy protection strategies.
Topics
Infrastructure | Human rights
Overall assessment
Summary
The main areas of disagreement centered on implementation approaches for privacy rights in federated systems, the appropriate scope of regulatory interoperability requirements, and the balance between user control complexity and system usability.
Disagreement level
The level of disagreement was moderate and constructive, with speakers generally sharing similar goals but differing on methods and priorities. The disagreements reflect different professional perspectives (policy, technical, regulatory) rather than fundamental philosophical differences, suggesting that collaborative solutions are achievable through continued dialogue and experimentation.
Partial agreements
Partial agreements
Similar viewpoints
Both speakers see the need for institutional mechanisms (trust registries and legal frameworks) to create accountability and efficiency in federated systems
Speakers
– Delara Derakhshani
– Ian Brown
Arguments
Trust registries and verification processes can reduce duplication and streamline onboarding across platforms
GDPR and similar privacy laws provide legal backstops against bad faith actors who ignore user data preferences
Topics
Legal and regulatory | Infrastructure
Both speakers emphasize the importance of designing systems that accommodate different user sophistication levels and allow for experimentation without permanent consequences
Speakers
– Melinda Claybaugh
– Ian Brown
Arguments
Different user types (power users vs. average users) require different levels of explanation and control options
Technical design must enable user mistakes and experimentation while protecting privacy, such as auto-deletion features
Topics
Human rights | Sociocultural
Both speakers recognize that federated systems involve complex community dynamics and governance structures beyond simple technical interoperability
Speakers
– Delara Derakhshani
– Mallory Knodel
Arguments
Migration between platforms involves not just data transfer but joining new communities with different safety considerations
Instance-level governance provides a middle layer between platforms and users for setting defaults and managing relationships
Topics
Sociocultural | Legal and regulatory
Takeaways
Key takeaways
Privacy-preserving interoperability requires user agency – users must know what’s being shared, when, with whom, and why, with interoperability not meaning default exposure
Technical challenges exist in federated systems, particularly around data deletion propagation across instances and maintaining consistent user experiences
Existing regulatory frameworks like GDPR provide legal backstops, but need fresh implementation approaches for decentralized systems
The Digital Markets Act’s messaging interoperability requirements may be extended to social media services in upcoming reviews
Shared governance infrastructure is needed to coordinate responsibilities across diverse players without centralizing control
User education is critical as different user types require different levels of explanation and control options
Standards bodies (W3C, IETF) provide essential forums for developing and testing interoperability standards
The future vision involves making interoperability invisible to users while preserving choice and community uniqueness
Resolutions and action items
Continue conversations through the Social Web Foundation and Data Transfer Initiative for ongoing collaboration
Develop trust registries and verification processes to reduce duplication across platforms
Create shared guardrails through initiatives like DTI’s ‘Lola’ project to help users migrate safely
Governments should share information across multiple platforms rather than forcing citizens to use single platforms
Keep pushing for desired features in the Fediverse to drive platform development
Reach out to panelists and organizations for further conversations on trust during transfers and implementation
Unresolved issues
How to effectively propagate deletion requests across all federated instances
Managing the complexity of privacy controls that currently result in overly simple options
Ensuring trust between hundreds of different instances in decentralized systems
Balancing standardization with preserving unique community cultures across platforms
Addressing security vulnerabilities in direct messages and encrypted content in federated environments
Determining what level of interoperability users actually want versus what technologists envision
How to handle cultural context changes when content moves between platforms with different norms
Suggested compromises
Use opt-in approaches for bridges between platforms to respect user consent while enabling interoperability
Implement instance-level governance as a middle layer between platforms and users for managing relationships and defaults
Develop co-regulation approaches where private firms and civil society do detailed work with regulators as backstops
Create shared technical standards while allowing platforms to compete on community focus and user experience
Balance core privacy protections that should be universal across the Fediverse with features that can remain unique to individual communities
Use defederation as a nuclear option mechanism for instances to protect users while maintaining overall system openness
Thought provoking comments
Users should know what’s being shared, when, with whom. And why, and interoperability shouldn’t necessarily mean default exposure… migration is not just about exporting and importing data. It’s about joining new communities… for many users, particularly marginalized groups, the safety of the destination matters just as much as the ability to bring the content.
Speaker
Delara Derakhshani
Reason
This comment reframes interoperability from a purely technical challenge to a human-centered one, highlighting that data portability is fundamentally about community safety and user agency rather than just technical capability. It introduces the crucial insight that marginalized users face unique risks in federated systems.
Impact
This shifted the discussion from technical implementation details to user experience and safety considerations. It established the framework for later discussions about trust, governance, and the need for shared standards that prioritize user protection over technical convenience.
If say Mallory has… I delete my Mastodon posts after two months. I use a bridge… BlueSky currently does not have an option to auto delete all my posts… nor does Bridgyfed currently pick up my preference from Mastodon to auto-delete everything after two months and then apply that on the other side of the bridge.
Speaker
Ian Brown
Reason
This concrete example brilliantly illustrates the complexity of privacy preservation across federated systems. It demonstrates how user privacy preferences don’t automatically translate across platforms, revealing a fundamental gap between user expectations and technical reality.
Impact
This technical example grounded the abstract discussion in reality and led to deeper exploration of GDPR compliance challenges. It prompted Melinda’s response about the collision between new social media and existing legal frameworks, fundamentally shifting the conversation toward regulatory compliance.
I think there’s a real challenge as this new social media proliferates… they may not understand this network. They may not understand really what it means to be posting something on threads and then have it go to other services… we have to really meet people where they are and understand kind of who’s a power user, who’s a tech user, and then who’s just your average user.
Speaker
Melinda Claybaugh
Reason
This comment identifies a critical user experience challenge that could determine the success or failure of federated systems. It acknowledges that most users don’t have the technical literacy to understand the implications of interoperability, which has profound privacy and safety implications.
Impact
This observation redirected the conversation from technical solutions to user education and interface design. It influenced later discussions about defaults, user agency, and the need for systems that work invisibly for non-technical users while preserving privacy.
My experience from previous privacy interoperability work in W3C is that it’s really, really hard to define privacy to a level of granularity that matches the complexity of how people want their digital selves to be presented… what gets through the standardization process is very simple signals like do or do not typically. In a world of social media, that feels way too coarse.
Speaker
Dominique Zelmercier (audience member)
Reason
This comment exposes a fundamental tension between the complexity of human social behavior and the binary nature of technical standards. It challenges the panel to consider whether current approaches to privacy in federated systems are adequate for real human needs.
Impact
This question forced the panel to confront the limitations of technical solutions and led to discussions about shared governance, the role of instances as intermediaries, and the need for more sophisticated approaches to privacy that go beyond simple on/off switches.
The danger of giving people not too many choices, but too many defaults which they find difficult to actually make into a privacy-preserving element… how do we make it easier for them to avoid the same problems which occurred… where X has become something that almost everybody wants to avoid.
Speaker
Chris Riley
Reason
This comment introduces the paradox of choice in privacy settings and connects it to the real-world exodus from Twitter/X. It challenges the assumption that more user control automatically leads to better privacy outcomes, referencing the ‘myth of the superuser.’
Impact
This observation tied together multiple threads of the discussion – user education, defaults, and the practical challenges of federated systems. It prompted reflection on how to make interoperability ‘invisible’ to users while maintaining privacy, influencing the final discussions about usability and adoption.
Overall assessment
These key comments fundamentally shaped the discussion by moving it beyond technical implementation details to address the human and social dimensions of federated systems. Delara’s focus on marginalized users and community safety established a human rights framework that influenced the entire conversation. Ian’s concrete example of auto-deletion across platforms grounded abstract privacy concerns in technical reality, while Melinda’s emphasis on user education highlighted the gap between technical capability and user understanding. The audience questions, particularly about the granularity of privacy controls, challenged the panel to confront the limitations of current approaches. Together, these comments transformed what could have been a purely technical discussion into a nuanced exploration of how to build federated systems that serve real human needs while preserving privacy and safety. The discussion evolved from ‘how do we build interoperable systems?’ to ‘how do we build interoperable systems that people can actually use safely and effectively?’
Follow-up questions
How can we better educate users about what interoperability means and what happens to their data when they post across federated services?
Speaker
Melinda Claybaugh
Explanation
This is crucial because average users may not understand the network effects of posting on federated platforms and what happens when they delete content across multiple services
How can we develop technical solutions for ensuring deletion requests are honored across the entire federated network?
Speaker
Ian Brown and Mallory Knodel
Explanation
This addresses a fundamental privacy challenge: deleting a post on one platform doesn’t guarantee deletion across all federated instances that received the content (a delete fan-out sketch follows)
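One commonly discussed building block is delivery tracking: the origin server remembers which remote inboxes received a post so it can later fan a Delete activity out to all of them. The sketch below is hypothetical and best-effort by design; remote servers still have to choose to honor the request.
```python
# Hypothetical sketch of delete fan-out: remember every remote inbox that
# received a post, then send a Delete activity to each of them when the post
# is removed. Real servers do something along these lines, but the details
# differ and delivery remains best-effort.

from collections import defaultdict

delivered_to: dict[str, set[str]] = defaultdict(set)  # post id -> remote inbox URLs


def record_delivery(post_id: str, inbox_url: str) -> None:
    """Remember that a copy of this post was delivered to a remote inbox."""
    delivered_to[post_id].add(inbox_url)


def fan_out_delete(post_id: str, actor_id: str, send) -> None:
    """Send a Delete activity to every inbox that received the original post."""
    activity = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": actor_id,
        "object": {"type": "Tombstone", "id": post_id},
    }
    for inbox in delivered_to.pop(post_id, set()):
        send(inbox, activity)  # 'send' would POST the activity with an HTTP signature
```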
What governance mechanisms can ensure trust and coordination across thousands of diverse federated instances?
Speaker
Delara Derakhshani
Explanation
As the ecosystem scales from a few large platforms to potentially thousands of instances, new coordination mechanisms are needed to maintain trust and shared standards
How can we balance standardization for interoperability with preserving the unique characteristics of different communities and platforms?
Speaker
Melinda Claybaugh and Dominique Zelmercier
Explanation
There’s tension between creating common standards that work across platforms and allowing communities to maintain their distinct cultures and features
How can privacy controls be made granular enough to match the complexity of how people want to present their digital selves?
Speaker
Dominique Zelmercier
Explanation
Current standardization processes tend to produce simple ‘do or do not’ signals, which may be too coarse for complex social media privacy preferences (a more granular policy sketch follows)
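To illustrate the kind of granularity being asked for, the sketch below models a per-post policy as structured data (audience scope, retention period, bridging consent) rather than a single on/off flag. No such vocabulary is currently standardized for the Fediverse; the field names are hypothetical.
```python
# Hypothetical sketch of a richer per-post privacy policy than a binary
# do/do-not signal: audience scope, retention, and bridging consent expressed
# as data a receiving server could evaluate. The field names only illustrate
# the granularity being asked for; nothing like this is standardized today.

from dataclasses import dataclass
from datetime import timedelta
from enum import Enum


class Audience(Enum):
    PUBLIC = "public"
    FOLLOWERS = "followers"
    MENTIONED = "mentioned"


@dataclass
class PostPolicy:
    audience: Audience = Audience.FOLLOWERS
    retention: timedelta | None = None  # e.g. timedelta(days=60) for auto-delete
    allow_bridging: bool = False        # may the post cross to other protocols?
    allow_quoting: bool = True          # may other servers surface it in quote posts?


# A post meant to expire after two months and never leave its home network:
policy = PostPolicy(retention=timedelta(days=60), allow_bridging=False)
```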
What can be learned from Meta’s user research on WhatsApp interoperability that could benefit the broader ecosystem?
Speaker
Ian Brown
Explanation
Meta has done significant user research on privacy and interoperability for WhatsApp that could inform best practices across the industry
How can we make federated social media as invisible and user-friendly as email interoperability?
Speaker
Chris Riley
Explanation
The goal is to make interoperability seamless for users who don’t want to become technical experts to use these systems effectively
How can we address the security vulnerabilities in federated systems, such as the ability to read direct messages across instances?
Speaker
Gabriel (audience member)
Explanation
Trust and security between instances are crucial, especially given reports of protocol misuse allowing unauthorized access to private communications (an access-check sketch follows)
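For context, an ActivityPub “direct message” is simply a Note addressed to specific actors, so its privacy depends on every federating server enforcing an access check like the hypothetical one sketched below; the reported vulnerabilities arise when a server skips or mis-implements such a check.
```python
# Hypothetical access check a server should apply before serving a non-public
# object. An ActivityPub "DM" is a Note addressed only to specific actors,
# so privacy depends on every federating server enforcing a check like this.

PUBLIC = "https://www.w3.org/ns/activitystreams#Public"


def may_view(obj: dict, requester: str) -> bool:
    """Allow access only to the author and explicitly addressed recipients."""
    recipients = set(obj.get("to", [])) | set(obj.get("cc", []))
    if PUBLIC in recipients:
        return True
    return requester == obj.get("attributedTo") or requester in recipients


dm = {
    "type": "Note",
    "attributedTo": "https://a.example/users/alice",
    "to": ["https://b.example/users/bob"],
    "cc": [],
}
assert may_view(dm, "https://b.example/users/bob") is True
assert may_view(dm, "https://c.example/users/eve") is False
```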
How can context and cultural norms be preserved when content moves between platforms with different communities?
Speaker
Audience member
Explanation
Content sharing across platforms can remove important cultural context and lead to misunderstandings or harassment
What role can young people play in shaping interoperability and internet fairness?
Speaker
Winston Xu
Explanation
Understanding how to engage youth in these technical and policy discussions is important for the future of these systems
What lessons from private sector interoperability work can be applied to public sector services like healthcare and social welfare?
Speaker
Audience member
Explanation
There may be valuable learnings that can improve public services through better data portability and interoperability
How will the European Commission’s review of the Digital Markets Act in the next year affect social media interoperability requirements?
Speaker
Ian Brown
Explanation
The DMA review could extend interoperability requirements from messaging to social media platforms, with global implications
Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.