The International Observatory on Information and Democracy | IGF 2023 Town Hall #128

12 Oct 2023 06:00h - 07:00h UTC

Event report

Speakers and Moderators

Speakers:
  • Courtney Radsch, Director of the Center for Journalism and Liberty at the Open Markets Institute, Fellow at UCLA Institute for Technology, Law and Policy and Fellow at the Center for Democracy and Technology, US
  • Ansgar Koene, Global AI Ethics and Regulatory Leader at EY, Belgium
  • Nnenna Nwakanma, Digital Policy, Advocacy and Cooperation Strategist and former Web Chief Advocate at the Web Foundation
Moderators:
  • Michael L. Bąk, Executive Director of the Forum on Information and Democracy, Civil Society, Europe

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Michael L. Bąk

Michael L. Bąk suggests that democratic governments need to better equip themselves to handle the technology-based challenges to their institutions and values. He highlights the shrinking public space for dialogue due to technological disruptions and argues that self-regulation by tech companies has proven inadequate.

Bąk emphasises the necessity for a common understanding of the impact that technology has on institutions and values. The Forum on Information and Democracy plays a significant role in this regard. It has an observatory that provides policymakers with a systematic understanding of the situation, aiding their decision-making process.

In the upcoming year, the Forum on Information and Democracy will focus on artificial intelligence, media in the digital age, and data governance, with misinformation and disinformation as cross-cutting themes. The organization plans to convene working groups and conduct research and analysis to address these issues.

Bąk believes that the Forum’s structure, mandated by governments yet led by civil society, allows it to address the challenges posed by technology effectively. It operates through multi-stakeholder engagements and develops policy recommendations, engaging directly with governments while maintaining its independence as a civil society-led organization.

In summary, Michael L. Bąk argues for democratic governments to enhance their capabilities in handling technology-related challenges to institutions and values. The Forum on Information and Democracy plays a vital role in promoting a common understanding of technology’s impact and addressing key issues. Its unique structures position it effectively to navigate these challenges through multi-stakeholder engagements and policy recommendations.

Jeanette Hofmann

The discussion highlights the need for further research on the impact of disinformation on individuals, particularly in the context of government intervention and regulation. Currently, the focus primarily revolves around the production and circulation of disinformation, with little known about its actual influence on people. Therefore, there is a call for more extensive studies to better understand how disinformation affects individuals.

High-quality journalism is considered an important defence against misinformation. It is noted that countries with a healthy media environment tend to have less disinformation. However, traditional business models for journalism are struggling, partly due to changing attitudes towards news consumption. Nevertheless, there is a strong correlation between the prevalence of disinformation and the state of the media landscape in a region.

Furthermore, high-quality journalism plays a crucial role in democracy. It is regarded as a pillar of democracy and not just a means to combat disinformation, but also to maintain democratic societies. This underscores the significance of supporting and strengthening journalism for the overall health of a democracy.

In addition to examining the production and circulation of disinformation, there is a need to focus on understanding its impact on people’s minds and voting behaviors. The current focus largely neglects this aspect, highlighting the importance of conducting research in this area. Insights into how disinformation affects individuals can provide guidance for designing effective strategies to mitigate harm and protect democratic processes.

The observatory’s work is regarded as valuable in providing context and in understanding manipulation and propaganda on social networks and platforms. This work can help fill existing knowledge gaps and shed light on the dynamics of disinformation in online spaces.

The discussion also emphasizes the importance of acknowledging prior research on topics such as manipulation and propaganda, building upon existing knowledge. By doing so, a more comprehensive understanding of these issues can be achieved, incorporating insights gained from research conducted as early as the 1970s.

Additionally, there is a call for comparative digital research, encouraging studies that compare different regions and contexts. This approach can provide valuable insights into the similarities and differences in the impact and spread of disinformation across various regions. However, a concern is raised regarding the lack of data from countries in Asia, Latin America, and Africa, indicating a gap in our understanding of the global dynamics of disinformation.

In conclusion, the discussion emphasizes the need for expanded research on the impact of disinformation, the importance of high-quality journalism as a defense against misinformation, the significance of understanding the impact of disinformation on individuals, the role of observatories in examining manipulation and propaganda in online spaces, and the need for comparative digital research. By addressing these areas, a more comprehensive and informed understanding of disinformation can be achieved, facilitating improved strategies to address its consequences and safeguard democratic principles.

Jhalak Kakkar

In this analysis, the speakers delve into the intricacies of technology policy and internet regulation, highlighting the need for different approaches tailored to the cultural and governance contexts of different regions. They express concern that much of the technological policy thinking and academic research originates from the West and does not directly translate to the context of the global majority, including countries like India.

One of the main arguments put forward is the necessity of exploring new and innovative approaches to social media platforms. The speakers identify a false dichotomy that exists, whereby social media platforms are either accepted as they currently exist or completely rejected. They advocate for a more nuanced consideration, emphasizing the potential for alternative strategies that better align with societal needs and values.

The importance of collaborative reports from diverse bodies is emphasized in the context of re-evaluating the current approach to internet regulation. The speakers mention that India is in the process of drafting new legislation to replace a 22-year-old piece of internet regulation. They argue that a collaborative report representing global expertise, including governments, civil society, and academic organizations, is crucial for a comprehensive and well-informed approach.

The analysis also addresses the impact of artificial intelligence (AI) on societies and political campaigns. Specifically, the advent of generative AI, deep fakes, and cheap fakes is a cause for concern, as these technologies have the potential to heighten disinformation and misinformation. The speakers highlight the implications of AI for societal discourse, particularly in the context of political campaigns, where these technologies can be used to manipulate information and deceive the public.

Another critical issue raised in the analysis is the collection of personal and anonymized data by platforms. The concept of surveillance capitalism, where platforms amass extensive amounts of data and utilise it for various purposes, is deemed detrimental. The speakers express concerns about the ability of data to be used to manipulate societies and impact democratic processes. They stress the significance of addressing data governance as a pressing matter in the context of platforms collecting vast amounts of personal information.

In concluding the analysis, the speakers provide valuable insights into the complexities surrounding technology policy and internet regulation. They highlight the need for approaches that consider cultural and governance contexts, rather than basing strategies solely on Western thinking. The exploration of new approaches to social media platforms, the importance of collaborative reports, and the implications of AI and data collection by platforms are all crucial considerations. Ultimately, the analysis sheds light on the challenges and opportunities awaiting policymakers as technology continues to shape societies worldwide.

Ansgar Koene

Ansgar Koene, EY’s Global AI Ethics and Regulatory Leader, focuses on the ethical use of online data and the impact of recommender systems. He emphasises the need to understand online data from the user’s perspective and acknowledges the influential role of recommender systems in shaping the online space. He advocates for responsible and ethical use of these technologies.

Koene holds positions as a trustee at the Five Rights Foundation and as a Data and AI Ethics Advisor at Afro Leadership. He works on examining different sources of online data to guide policymakers and companies in differentiating anecdotes from well-supported evidence. This guidance will enable informed decision-making processes. Koene proposes the establishment of an observatory to collect and analyse global data on online interactions, providing valuable insights for policymaking and corporate governance.

In addition, Koene stresses the importance of amplifying the voices of young people and those outside the core economies of the US and Europe to create a fairer information ecosystem. He collaborates with the Five Rights Foundation on the “Internet in Their Own Voice” project, aiming to understand the views and needs of young people in shaping the online space. Koene believes that these groups are often overlooked, leading to decisions being made without their input. By amplifying their voices, a more equitable and inclusive information ecosystem can be achieved.

Koene highlights the significance of evidence-based policymaking and the need for clear methodologies to track progress. The observatory’s meta-studies will establish a baseline understanding of different methodologies, facilitating evidence-based policymaking.

The analysis also addresses challenges posed by emerging technologies and disinformation campaigns. Generative AI presents new challenges, while social media platforms continue to be a concern. Disinformation campaigns driven by particular interest groups remain an ongoing issue. Proactive measures are necessary to mitigate these negative impacts and promote a safe and trustworthy online environment.

In summary, Ansgar Koene’s work encompasses the ethical use of online data, the impact of recommender systems, amplifying marginalized voices, evidence-based policymaking, and addressing challenges from emerging technologies and disinformation campaigns. Koene’s insights serve as a call to action for regulators, policymakers, and industry leaders to actively shape a responsible and inclusive digital landscape.

Courtney Radsch

The analysis explores the need for a comprehensive understanding of global evidence, discussing various aspects related to this topic. Firstly, it highlights the presence of information in sources such as NGO reports, books, and international organizations’ reports. However, it points out that the majority of published research comes from the Global North, potentially resulting in a lack of representation from under-represented regions and causing disparities in regional, cultural, and language understanding.

Furthermore, the analysis acknowledges the influence of funding bodies on research, shaping and limiting its scope. It emphasizes the importance of globally inclusive research, advocating for more attention to be given to under-represented regions and taking into account different languages, cultures, and political environments.

Regarding research methodology, the analysis notes a tendency to prioritize big data. While acknowledging its usefulness, it cautions against potential blind spots that may arise as a result. It argues for an approach that incorporates both qualitative and quantitative methods to gain a more comprehensive understanding.

In addition, the analysis emphasizes the need for structural considerations when examining information and media ecosystems. It suggests that historical and structural conditions and biases are often replicated, necessitating further studies on the political economy and infrastructural aspects of information flow.

A significant concern raised in the analysis is the dominance of big tech monopolies in discussions and policies. The majority of studies focus on entities such as Facebook, Google, Twitter, and WhatsApp. These monopolistic entities not only have economic dominance but also impact policies. The influence of big tech firms in shaping the research agenda through funding, access to data, and lobbying is scrutinized, raising questions about research objectivity and independence.

The analysis also advocates for the inclusion of private sector data and research findings, as they contribute to a wider scope of evidence. Private firms hold many domestic reports on data and AI, making their insights valuable in achieving a more comprehensive understanding.

Another important point emphasized is the significance of studying information flow in its entirety, including media ecosystems. While there is a predominant focus on social media in current studies, neglecting broader media ecosystems can lead to an incomplete picture. The analysis highlights the need to examine mainstream and alternative media alongside social media to gain a comprehensive understanding. It also underscores the importance of studying state-dominated or captured systems and the role of conservative talk radio in shaping information flow.

In the context of AI, large language models, and data, the analysis acknowledges the complexity of the issue, which is constantly evolving. It suggests that studying or affecting one aspect will have implications elsewhere. Additionally, access to data affects our understanding, which subsequently impacts other aspects.

To sum up, the analysis provides valuable insights into the need for a comprehensive understanding of global evidence. It emphasizes globally inclusive research, the incorporation of qualitative and quantitative methods, and structural considerations in examining information and media ecosystems. It raises concerns about big tech monopolies and advocates for the inclusion of private sector data and research findings. The analysis also highlights the significance of studying information flow holistically and addresses the complexity surrounding AI, large language models, and data.

Deborah Allen Rogers

In a recent discussion on research funding, one speaker presented a compelling argument challenging traditional funding models, highlighting their inflexibility: they often prevent researchers from changing course mid-project. The speaker emphasised the need for a flexibility clause in research funding, which would enable researchers to adapt to new discoveries and overcome challenges that arise during their projects. This argument shed light on the limitations of traditional funding models and opened a broader conversation about the need for research funding to evolve and become more adaptable.

In another aspect of the discussion, a different speaker focused on the importance of redefining expertise in the digital age. They expressed frustration over the fact that policymakers lacking digital expertise often shape policies in the digital realm. The speaker highlighted that younger individuals, who have grown up in the digital age, may possess more digital expertise than policymakers who may be less familiar with rapidly evolving technological advancements. This observation underscored the crucial need for policymaking to be informed by individuals with relevant digital expertise, in order to ensure that policies are effective and well-suited to the digital landscape. The argument put forth by this speaker sparked a thoughtful reflection on the role of expertise and the significance of incorporating it into policymaking processes.

Lastly, a speaker raised a critique of the traditional research paradigm, specifically noting the excessive focus on past studies and minor variations in research outcomes. Drawing from their personal experience, the speaker expressed dissatisfaction with an educational system that predominantly emphasises historical research and fails to encourage a forward-thinking and design-oriented approach. This critique invited a larger conversation about the need to move away from a historical focus in research and explore new avenues that prioritize innovation and problem-solving.

Overall, this discussion highlighted several noteworthy points in relation to research funding, expertise in the digital age, and the direction of research. It shed light on the limitations of traditional funding models, compelling the consideration of a more flexible approach. Furthermore, it underscored the importance of digital expertise in shaping effective policies and the necessity of shifting away from a historical research focus towards a more forward-thinking and design-driven approach. These insights provide valuable perspectives for further exploration and potential improvements in the field of research.

Nnenna Nwakanma

Upon analysing the provided information, several key points and arguments become apparent. Firstly, it is acknowledged that information consumption is widespread and occurs in various forms across different cultures. This is likened to the consumption of bread, which varies in shape, size, and form across different societies. The positive sentiment towards this notion suggests that information is fundamental to human existence.

Democracy is the next topic explored, with an emphasis on its diverse nature. The analysis highlights that democracy can take on different characteristics depending on an individual’s circumstances, similar to how cotton can be heavy, cold, or colourful. The positive sentiment expressed towards this comparison implies that democracy can be customised and adopted in different ways to suit different needs and contexts.

Furthermore, the importance of recognising the cultural nuances and varying approaches to information and democracy is underscored. It is argued that a one-size-fits-all approach is inadequate, and understanding the complexities across continents, countries, and socio-political-economic circumstances is crucial for a comprehensive analysis. This positive stance suggests that nuanced perspectives should be considered to address inequalities and foster peace, justice, and strong institutions.

The analysis also highlights the significance of responding to the needs of governments and promoting dialogue. It is posited that catering to the requirements of governments is vital for the value and relevance of initiatives. This sentiment emphasises the importance of aligning development policies with the needs of various stakeholders, especially governments, to drive effective change.

Another key point raised is the notion that information about individuals never truly disappears, even after death. This neutral sentiment reflects the enduring impact that personal information can have and reinforces the need for data privacy and management.

The importance of understanding before regulating is expressed in the analysis. Rushing into regulation without a comprehensive understanding of the subject matter is cautioned against, as it can lead to adverse outcomes. The negative sentiment towards premature regulation highlights the potential dangers of making decisions based on panic or hype. It is argued that evidence-based decision-making is essential to ensure effective and well-informed regulation.

Additionally, Nnenna Nwakanma’s perspective on regulation is explored further. She emphasises the significance of regulating based on principles rather than specific products or companies. This positive sentiment suggests the need for a broader regulatory framework that focuses on underlying principles and values. Furthermore, Nnenna Nwakanma advocates for promoting dialogue and fostering collaboration to inform regulatory discussions, as evidenced by her experiences with software regulation and her endorsement of platforms like the Internet Governance Forum (IGF).

The analysis also highlights Nnenna Nwakanma’s positive view of the shift of power from governments and global northern media to private platforms. She appreciates the democratization of media through these platforms, having gained insights into power dynamics during a visit to the Meta campus in Menlo Park. This viewpoint implies a belief that a more balanced distribution of power benefits society and reduces inequalities.

Moreover, Nnenna Nwakanma’s philosophy of raising new leaders and prioritising humility in leadership is underscored. Her commitment to training individuals and enabling them to take the lead is highlighted as a positive sentiment. This aligns with the goal of achieving peace, justice, and strong institutions as outlined by SDG 16.

In conclusion, the extended summary provides a detailed analysis of the main points highlighted in the given data. The arguments made by various speakers shed light on the universal nature of information consumption, the diverse forms of democracy, the need for nuanced approaches, the importance of responding to government needs, the persistence of personal information, the significance of understanding before regulation, and the perspectives of Nnenna Nwakanma on regulating based on principles, the shift of power in media, and leadership development. The nuanced analysis offers valuable insights into these topics and serves as a foundation for further exploration and dialogue.

Session transcript

Michael L. Bąk:
Yes, you should. You guys, please come join the table. So we were sitting next to each other before. Yes. Nice to see you again. All right. So we do have a town hall scenario. We have the hall. We’re a very tight-knit community. So please feel free to sit as close as you’d like. I think we’ll start, since we’re at 3 or 4 and some people need to leave, to get some trains. Welcome to everyone who’s here and those who are online joining us. The weather is beautiful in Kyoto. And we have all held out to the very end, one of the last sessions on the last day, of a very interesting week. Let me just, again, say thank you for sticking it out until the end. My name is Michael. I’m the new executive director for the Forum on Information and Democracy. I came into this role after a 25-year career in international development, working on democracy promotion, peace building, human rights, and then five years working in policy for a tech company. So I’ve really been able to see which one. I’ll tell you secretly. No. I worked for Meta for five years. It may not seem, looking at me, but my career has been one from the periphery looking in. My work has largely been in Southeast Asia. It’s where I’ve spent most of my life. It’s also been fascinating to see the development of technology and its applications for areas that are important to me, like democracy and human rights, evolve over that time period. And so now I’m really happy to join the forum. It’s a Paris-based organization. I’m in Bangkok still for the time being. But it’s enough about me. I’ll share a few words for those who don’t know much about the forum. I’ll share some thoughts about that. And the announcements that we’re going to make today about the work that some of the members of our steering committee for our observatory will be embarking on. As I mentioned, when I worked in democracy promotion early on, I really embraced the use of new technologies. 
Social media, for example, seemed to be the holy grail of democratizing people’s access to information and their ability to use that to improve their lives. And I think in the interim, in the years that have passed, we’ve seen that our democratic governments aren’t always as equipped as they need to be to address the harms that have arisen and the impacts that are being had on our institutions and our shared values. We’ve seen public space for real dialogue shrinking rather than expanding due to disruptions that technology is creating in our lives. It’s not enough to rely on companies to regulate themselves. We tried that. We were promised that that’s the best way not to stifle innovation, and it didn’t work. And so that’s why many of us have spent the last few days here in Kyoto, because we don’t want to accept the false choices that have been presented to us, or the argument somehow that working at scale is an excuse to not always have to do the right thing. And democracies have noticed, too. In 2019, a dozen or so democratic governments came together under the International Partnership for Information and Democracy. It’s an international process outlining some principles that democratic governments strive to implement to ensure that technology serves democracy and information integrity and not solely private interests. That’s now grown to 51 states who have signed onto the partnership. Brazil was the last country to join in August, so just a few weeks ago. The partnership mandated the creation of an entity called the Forum on Information and Democracy. And so as an organization, we stand kind of in between states who give us a mandate, yet as a civil society-led entity, we are independent of states. So our board members, all 11, represent civil society organizations. Thank you. 
The work that we accomplish is done through multi-stakeholder engagements with experts around the world, an ever-expanding group of researchers, academics, and practitioners, some of whom we have today who I’ll introduce. And I think that the governance structure that we have is quite innovative. You know, to have a direct relationship with government where we can engage and act on recommendations around policy that we have. while working with multiple stakeholders. Our organization is focused on three key areas around evidence, policy, and collaboration. On the policy work, we conduct research and develop policy recommendations that can be acted upon by states and by civil society and by companies. That is done through working with regional and national experts from around the world. The collaboration element is our emphasis on developing value creation within our network of civic partners, academic partners, research institutes around the world to contribute from the bottom up, from the sides up, into the outputs that we’re creating so that these are not just generated by northern thinkers, but it’s informed by thinkers and practitioners from all over the world. And the last area is evidence. That’s the first one that I said, but I skipped, and that’s because that’s what today is about. Evidence is about collecting and pulling together our common understanding of what we’re facing. Our evidence work is embodied by the observatory, for which some of our steering committee members have joined us in person. We’ve all found that an element that’s much lacking in this space is a common understanding and appreciation of the impact that this technology is having on our institutions and our values in a way that’s systematic and that policymakers and others who are making important decisions can turn to and look to to inform the decisions that they have to make. The observatory is working on a regular process. of meta-analyses bringing this information together. 
The observatory’s architecture was developed by Professor Shoshana Zuboff, who we all know from her book and her work, as well as Angel Gurria, former Secretary General of the OECD. They spent about a year working out a governance structure and a process to make this a reality. So it’s a real honor to be able to meet our steering committee members in person. It was part of a global call for people who are experts in their field, who bring a wealth of knowledge and experience to apply it to this idea of creating this common understanding. And so after looking at more than 100 submissions of candidates, we settled on 19 people, of which we have three in person with us today and two online. And their work over this coming cycle is to oversee the production of the first output from the observatory, the first meta-analysis. Now there’s a lot to cover. There are a lot of topics and issues, and we can’t do it all at once. And so the team met recently to talk about prioritization and where the group should focus their efforts over the coming year. And those have been narrowed down to a very popular topic at this conference this week, artificial intelligence, but also about media in the digital age and data governance, as well as a cross-cutting theme of misinformation and disinformation, which is quite important. And then just lastly, going forward, this group of 19 people supported by a scientific director and rapporteurs will be soliciting information, inputs, research, conducting working group discussions and so forth to gather information to fold into this meta-analysis. So with that, I would like to introduce each of our steering committee members that we have and maybe in just two minutes share what your ambition is for the observatory, how it relates to your region of where you’re at before we go into some other questions and discussion. And I’m going to pick on Courtney first. 
Courtney Radsch is the director of the Center for Journalism and Liberty at the Open Markets Institute. And I think you may have other affiliations. Feel free to mention those too. And I’ll give you a couple of minutes to share.

Courtney Radsch:
Great. Thank you so much, Michael. And I’m excited to meet the other steering committee members as well. So my name is Courtney Radsch. As you said, I’m the director of the Center for Journalism and Liberty, but I have a background for the past 20 years as a journalist, a scholar, and a human rights advocate. And so I bring a wealth of experience that has really focused on the global majority or the global south and understanding how technology and policies that are often developed in the U.S. or the EU increasingly have an impact in shaping the viability of information ecosystems, human rights, and the political economy around the world. My interest in being part of this initiative is actually because I’m also involved with the International Panel on the Information Environment. And I see these as very complementary efforts to understand what evidence exists to help inform policymaking, as well as, and I think importantly, what evidence does not exist and how that should shape our approach to policy. And so what I would hope for this initiative, as we seek to harness the evidence, is that we ask: what do we mean by evidence? 
And I spent the past couple of years as a fellow at the UCLA Institute for Technology, Law and Policy doing a lot of research about healthy information ecosystems, again, like technology policy, etc., and one of the things that stood out is there is a lot of information embedded in NGO reports, in books that are not peer-reviewed and therefore, you know, not part of the domain of the IPIE, that are published in reports by international organizations, etc., that include empirical research on the ground, qualitative and quantitative research, and yet they all exist in their silos. There’s so little, I think, conversation between them, and so we actually, I think, we know more than we think we do, but it is embedded in all these, you know, kind of individualized efforts. And so my ambition here is to help us figure out how do we learn from the evidence that has been collected, that has been developed, especially through the on-the-ground experience around the world, and particularly how that differs by region, by country, by language, you know, by different stakeholders, etc., because the vast majority of published research, especially in peer-reviewed journals, is from the Global North, and especially when you get into information environment issues, like the new, you know, new issues of disinformation and misinformation, which actually have a long history in propaganda and media studies and information science, how, you know, so much of that is English-focused. It’s Global North. A lot of it is either US or Europe. And so we don’t know what the evidence actually tells us about what is happening globally if we think about, you know, the hundreds of languages, cultures, politics, etc. So I’m really hoping that this initiative, together with the IPIE, will create a comprehensive understanding of what we know about these topics and where are the holes. And so I see you nodding, so I’m gonna also wrap up here. 
But I think, you know, as we get into the discussion, thinking about where there is an over-emphasis of research, maybe where we need more, you know, more attention because there has been an under-emphasis, and that we have to consider, as well, how research is funded and who funds it. So if you think, you know, for example, tech companies, when they’re funding research, they’re funding certain types of research. And so we also need to think about how what we know is shaped by who is funding us to ask certain questions and what questions are not being asked.

Michael L. Bąk:
And I’m also nodding because I agree with everything that you’re saying, too. So thank you very much, Courtney. I think we also have online Jhalak Kakkar, who’s the Executive Director at the Centre for Communication Governance at the National Law University Delhi in India. Jhalak, are you with us online?

Jhalak Kakkar:
Hi, Michael.

Michael L. Bąk:
Welcome. Wish you were here in Kyoto, but it’s wonderful to have you online.

Jhalak Kakkar:
Yes, I wish I was there, too. It would have been really amazing. So, Michael, over the last decade, I’ve been working closely on technology policy issues within the Indian context. And the lens I really come from is how the information environment is developing within the global majority. It’s something we actively explore at the Centre for Communication Governance, an academic research institution at the National Law University Delhi in India. I think Courtney alluded to this, but if I can take it further: a lot of the policy thinking and academic research being relied on is emanating from the West. And much of that does not directly translate, and cannot simply be transposed, into global majority contexts, which are themselves very heterogeneous and different. You cannot homogenize them into one monolith of an environment. So my aspiration for the work we’re going to do as part of this marvelous project is to bring out the fact that we need different approaches in different contexts; there may not be a one-size-fits-all approach. There will be different cultural, governance, and regulatory capacity contexts, and we need to take that into account. The second thing, drawing on what you alluded to earlier, is that there’s a palpable sense of a false dichotomy: as if the only options we have are to take social media platforms as they are, with the challenges we see today, and at most band-aid over some of the harms, or not to have them at all. And my aspiration for the time that we spend together over the next year, working on the first report, is really to understand that perhaps there are many shades in between, perhaps there are new and novel approaches.
And many of those have already been talked about in the literature, but perhaps it’s about spotlighting those particular approaches. The last idea I want to leave the audience with in these opening remarks is that, for instance, in India we are in the process of drafting new legislation to replace a 22-year-old piece of legislation that regulates the internet and online platforms. And of course, tech companies have a lot of money and power, a lot of access and relationships, and they have tremendous opportunity to shape the thinking that is emerging. At the same time, there’s a lot of tension between these companies and governments, and governments across the globe, we are seeing, are experimenting with many different approaches. I think we have a unique opportunity, as countries around the world rethink their approaches to internet and platform regulation, to come out with a report that has legitimacy, because it arises from something that governments are also associated with, and because it has the backing, the work, and the minds of many civil society and academic organizations across the globe coming together. It is not a one-off report of one institution, but really a global, overarching coming together of individuals with expertise working on these issues. And I think it will create a resource that we can take to many governments around the world, one with the sort of heft to make them sit up and take notice. So I think we have a very, very important task ahead of us in the year ahead, to live up to those aspirations, because we can really impact the way this information environment globally moves forward. Thank you.

Michael L. Bąk:
Thank you. I absolutely agree with you that it’s an important resource, I think a living resource, much like the IPCC has been for climate change. And also your note about spotlighting research: on the one hand spotlighting it, and on the other shining light on who’s funding research and where that research is coming from. So perhaps next I could ask Jeanette Hofmann, who is the research director at the Alexander von Humboldt Institute for Internet and Society in Germany, to add any additional thoughts on yourself and your ambition for being part of the steering committee.

Jeanette Hofmann:
Yeah, thank you very much, Michael. My background is in political science, and I’ve done research on internet governance since the early 1990s. My recent research focus has been on democracy and digitalization. What motivates me to contribute to this body are two things. First, a focus on disinformation. I’ve noticed that most of the attention right now is on the production and circulation of disinformation, but we know very, very little about its impact on people. At the same time, governments really use these growing worries and concerns about disinformation as legitimization for intervening in the public sphere and starting to regulate. So on the one hand, I think we really need more data and more research on disinformation and how it works, whether it in fact affects people’s minds and voting behavior. But there’s a second reason, and that has to do with high-quality journalism. As we all know, this is an essential pillar of democracy. I think it’s also one of the most essential means against disinformation. We can see already now that disinformation is less of a problem in countries with a healthy media environment. Yesterday we had a workshop where a woman from Switzerland said: not an issue. I talked to a woman from the Irish government, and she said: not an issue in our country. I would say the same about Germany. So there really is a tight link between the media landscape and disinformation. But at the same time, we can see that the traditional business models of high-quality journalism are crumbling, not only because of platforms, but also because the young generation of users is developing new attitudes towards news consumption. So we need to think hard about what that means for the future, not only for combating disinformation, but also as a condition for democracy.

Michael L. Bąk:
Absolutely agree. And just listening to the news this morning, I think it was the Washington Post that is trying to buy out 250 staff, another example of the impacts we’re seeing every day. The other day I heard someone saying that newspapers are folding in Canada at a really rapid pace. Let’s shift continents to Côte d’Ivoire, where we have Nnenna Nwakanma, who is a Digital Policy, Advocacy, and Cooperation Strategist. Nnenna, are you with us online?

Nnenna Nwakanma:
Hello, everyone, great to join you online. My name is Nnenna; I come from the internet. I think that’s all the introduction I need to do of myself. I’ve worked with a lot of you on different issues, open data, open source, open government, the open web, over the past 25 years. Even before the IGF, we’ve been around since the pre-WSIS days, as we used to call them. I have been working from the advocacy point of view, mostly within civil society, and for the past 10 years as the Chief Web Advocate of the World Wide Web Foundation. Two things: on one hand is information, and on the other hand is democracy. I like to use visualizations, I like to use illustrations. Information is like bread. Everywhere you go in the world, there’s a sort of bread that is eaten. It could be flat, it could be long, it could be with sugar; it comes in different shapes, sizes and flavors, but every people has a kind of bread they eat. So information is consumed everywhere, in different shapes, sizes and forms. We may not even have the same basic ingredients. We know it’s flour, basically, that you use to make bread, but flour can be made from so many different types of cereal. And that, I think, is the same with information. Granted, we all feed on it, one way or the other. You cannot not be informed unless you are dead. Even when you are dead, information about you is still going to go out there. So that is the nature of information. Democracy, in Africa, we say is like cotton, the white fiber that you use to make clothing. Now, you make clothing according to your circumstances, according to your weather. Sometimes you have heavy clothing, sometimes lighter clothing. Some are better for cold, some are better for the heat. Some are very colorful, some are not. So, with those two illustrations, I would like to bring our minds to the work of the Forum, and especially to the report that is coming. And that brings me to my interest.
My interest is in nuances. I think Jhalak has mentioned a bit about it: there is no one-size-fits-all. My desire, my vision, is that the report and, of course, the Forum will be a real observatory, going beyond the major headlines to look at the nuances that exist across continents, across countries, across social, political, and economic circumstances, and to tease those out. So, in one word, it will be nuance for me. And I’m going to be having a close look as we go along to ensure that we are bringing out the nuances across countries. The other thing might be needs, because nuance alone is not enough. You don’t build bridges in the desert, because they don’t have water; they don’t have a water problem. I think Jeanette was saying that we need to respond to needs. You can’t come to me in the middle of the desert and tell me you have a program for bridges. No, unless you build a river first. Let’s respond to the needs of our people. Governments have needs, and once those needs are catered for, our existence makes itself valuable. Once again: yes, we are all feeding on information in different sizes, forms, and flavors. Democracy: yes, we are all constructing our view of it according to our circumstances. It could be heavy, it could be cold, it could be colorful, but everyone builds their democracy in their own way. There is no one-size-fits-all, even in clothing; that’s why we have different sizes, colors, and shades. Of course, needs are different, and we need to respond to them. Once again, thank you for having me, and thanks to all who have kept on. I mean, we’ve been waking up every day at 1 a.m. to be here in Kyoto. I’m in West Africa today, where it’s summer all around. Please come see us. Thank you.

Michael L. Bąk:
Thank you, Nnenna. I do hope to see you in Côte d’Ivoire. Thank you very much for that, and I’m always going to remember the analogy of building bridges in the desert; I will use that and credit you. And lastly, of our 19 steering committee members for the observatory, we have with us EY’s Global AI Ethics and Regulatory Leader, Ansgar Koene, who is based in Brussels. Please share more.

Ansgar Koene:
Thank you very much. Yes, indeed, I’m the Global AI Ethics and Regulatory Leader at EY. But I am also a trustee at the 5Rights Foundation, which works towards the rights of young people online, and a Data and AI Ethics Advisor at AfroLeadership, a Cameroon-based pan-African NGO. About eight years ago, my journey moved from being an academic doing research on computational neuroscience into the space of questions around data, data ethics, the internet, recommender systems, and such. It started with computational social science: there’s all this great data online, and we were trying to use it to understand human behavior. But pretty much before we even started that project, the research translated into: what are the ethics of using online data for purposes different from what the person had in mind when they published that data online? It was pretty quickly through that project that we started looking into the role of recommender systems, how they are shaping the online space for people, and what it is that they’re actually even seeing in that space. That was also a project together with the 5Rights Foundation, because when it comes to young people especially, but also to many outside the core economies of the US and Europe, we have a lot of people speaking about them without actually having spoken to them and understood their voice. So we worked with the 5Rights Foundation on a report called Internet in Their Own Voice, listening to young people about what it is that they actually want from this space.
So it’s really from that point of view that I think what the observatory is going to be doing is very important: looking at the various sources of data being collected and the various research happening around the world on how data is flowing, what people are seeing online, how people are engaging with it, and questions around misinformation and disinformation. The aim is to make sure that policymakers especially, but also companies setting up their governance frameworks around the information ecosystem, understand where something is anecdote versus where it is strongly supported evidence, which sources of evidence are the ones they really need to act on, and how they should act in order to achieve outcomes that will truly be beneficial. Thank you.

Michael L. Bąk:
Thanks for that, and appreciate the youth angle, which is often not as present as it should be, considering how we make up our panels and the time it takes to develop expertise. So thank you for that. Now, we have an ambitious agenda for the observatory: to serve this IPCC-like function, to ensure that policymakers, as Ansgar mentioned, have access to the latest understanding, the state-of-the-art research that exists, and that they understand the sources of this knowledge, because people or companies may have interests in producing certain kinds of research. So the next few questions I want to ask are around the extent to which the work of the observatory can move the policy needle in a way that protects our democracies, our shared values, and the integrity of our information ecosystem. Perhaps I start with Courtney, who may have to leave to catch a train. In your view, what are the most important cross-cutting issues and methodological considerations that the observatory and this team should be keeping in mind and pursuing during this first cycle?

Courtney Radsch:
Thanks for that. So, I think we need to make sure that we’re looking at qualitative, quantitative, and mixed-methods research. I’m a qualitative researcher, in political science and international relations with a focus on communications, and I’ve found there is a tendency to privilege big data. People really like to use big data, which can give us some useful insights, but it can also leave gaping blind spots, especially if we think about issues of data sovereignty, of inclusion and access, and how that ends up replicating historical and structural conditions and biases, as well as the data access and connectivity issues. So I think we need to make sure that we’re thinking multi-methodologically within the same questions: how do we know what we think we know, and what is the evidence base for that? Then, I think we also need to think about the methodological or epistemological paradigm that research is based in. Because, again, when I did this analysis of what makes up a healthy information ecosystem, which was done for a group of donors called the Transparency and Accountability Initiative, and which I think the OECD is using to analyze its disinformation and misinformation programming, there was a tendency to focus on issues like media and information literacy or psychological effects, how individuals respond to disinformation, for example, to the exclusion of structural analyses or structural investigations. And especially, there’s a lack of research into how those things are linked. So I think we have to interrogate where we do have a lot of evidence, but also what that means for what we don’t have evidence about.
And I say that because if you look at, for example, a lot of the funding, as well as simply the access to data and the type of platforms that are studied: in the meta-analysis that the IPIE did of some of the disinformation peer-reviewed literature, there is just this preponderance of investigations into a handful of platforms, specifically those that have open APIs, which links right back to data. You study things that you have access to. It’s a lot easier to study what’s happening on Twitter, because it has an open API, than what’s happening on Telegram. It’s also easier than going into a community to understand qualitatively how people are responding, or the links between the political economy and individualized responses, because that is labor-intensive: it requires linguistic expertise, it requires money. And so I think we want to be very careful about that, because, again, when I was doing this analysis, there were a lot of studies in this psychological, media-effects paradigm. And part of that is because it’s also to the benefit of the big tech monopolies: the vast majority of studies are about Facebook, Google, Twitter, to a lesser extent WhatsApp, to a lesser extent Telegram. But there are many other factors, and we don’t have very many studies looking at the infrastructure and political economy of the internet, especially now as we go into the AI era. So I really think we have to use a cross-disciplinary, multi-methodology perspective to interrogate these issues. My concern is, for example, that we can study all day how people respond to disinformation on social media, but it doesn’t matter if those social media companies remain monopolistic entities with the power to dominate economically, which translates into political domination, which translates into policy, and that we’re going to see this replicated in the AI era that we’re in now.
And you can see that, for example, here at the IGF: which companies are up on stage at the AI high-level panels or at the AI main sessions, who is in the room with policymakers. In the United States, for example, Congress is holding a series of closed-door sessions, and it’s a handful of large big tech monopolies dominating those conversations. So we really need to pull in evidence from a much wider array, and I would love to see this group also look at what the private sector has developed. There are all these consulting firms, and lots of domestic private sector reports into these various issues around data and AI. So what can we learn from a much wider scope of research? And I think, as Jeanette alluded to, disinformation is not just about its effects, and not just about its production; you have to look at the entire life cycle, and at the supply of quality information, which has to do with journalism absolutely, but also with how information flows through ecosystems. Again, there’s a dominance of studies into social media without considering the broader media ecosystem, the political economy of mainstream media or powerful alternative media: in the U.S., the role of conservative talk radio; in Turkey, the role of state-dominated media; or in Egypt, which is where I did my doctoral research, the role of state-affiliated media. And how is that going to impact data and access to information, and the production and reception of disinformation, in countries where you have a state-dominated or captured media system?
So the challenge is that these are all incredibly complex issues, and the challenge for us is to get beyond where groups have already focused and take a new and fresh lens, really trying to get outside of the boundaries, particularly the boundaries set by tech firms that are helping shape the research agenda through funding, through access to data, through consultations and lobbying in Brussels and Washington, and to make sure that we are also hearing from people on the ground in countries around the world, and in languages other than English.

Michael L. Bąk:
Thanks. You know, Courtney, as you were speaking, it made me think of a competing analogy to the bridge in the desert, about the complexity of the issue. I’ve spent much of my life in Thailand, and we have this analogy of an elephant. They’re very big. If you’re blindfolded and you touch the elephant’s tail, you have a very specific impression of what an elephant is; you might think it’s a tiny creature. If you’re touching the elephant’s nose, you might think more of a reptile kind of thing. If you’re only touching its body, you get this massive beast. So the idea is that through the work of the observatory, we can get to a place where policymakers can see the whole animal, all the elements that go into it, and also what’s missing.

Courtney Radsch:
And I think the added complexity of this issue is that it is constantly in motion. This is not a static elephant. This is a dynamic system. So when you study one thing or when you affect one thing, it will have implications elsewhere. So again, as I said, the access to data will affect what you think you know about things, and then that will affect other aspects. And the AI, I think if you look at AI and you break that down again, the large language models work way better in some languages than others because of data. So it’s like multidimensionally complex, which is another challenge, I think, for us.

Michael L. Bąk:
Absolutely. So given that complexity and all of the issues, many of which Courtney and other speakers have already mentioned, I want to turn to Jhalak. Maybe you can reflect a little on the priority themes of AI, media in the digital age, and data governance, and why those are important priorities to start with in this first cycle, and maybe why they are important to you, particularly from the perspective of South Asia and the global South.

Jhalak Kakkar:
Thanks, Michael. We did have a lot of discussion as the steering group about what the priority areas for the next year should be. The key theme that drives the work we’re doing here is the impact of platforms on our democratic societies. And if we look at the way platforms impact our societies, I’m probably preaching to the choir here in saying that platforms are our new public squares; they’ve privatized public discussion spaces. They have serious implications for the media environment in our societies, and hence it becomes very, very important to look at the media aspect as one theme. AI has such significant implications, and especially over the last few months there is broader appreciation of the way AI can impact our societies, with generative AI really getting the spotlight. During the last US election, we saw cheap fakes and deepfakes. We have several key elections coming up globally: India, the UK, many other countries across the global North and the global majority. And this is really going to be a playground for political parties and supporters to unlock the power of AI for political campaigns. Some of it can be good, bringing more voters into the fray and reaching the marginalized and vulnerable, but there are ways these systems and technologies can be weaponized that are detrimental to democratic interests.

And then the third theme the group has chosen to focus on is data governance. What platforms are able to do, and the whole idea of surveillance capitalism, is based on the notion that large masses of data are being collected on individuals. These data are being used, both as personal data and in anonymized, de-identified forms, in ways that can manipulate and have very real-world implications for the way our societies function and the way our democratic processes play out. And of course, the cross-cutting theme in all of this is disinformation and misinformation, which flows through these public-private squares and is heightened by the use of AI technologies. In the context of so many key general elections around the globe, I think these are great themes to be picking up and analyzing for our first report.

Michael L. Bąk:
Thank you very much, Jhalak, for that. Your point about elections next year is a really good example: many democracies are preparing for threats against democracy, and there could be the temptation to rush into regulation very quickly. In that vein, I want to turn to Nnenna in Côte d’Ivoire to share her perspective on this idea of regulating the online space in the context of evidence scarcity, the lack of really sound evidence, and the risks of moving towards quick regulation, and on how the work of this group could help policymakers and advocates working in this space. Nnenna?

Nnenna Nwakanma:
Yeah, so maybe I think you’re being diplomatically correct in saying quick regulation. You are in Kyoto; I’m sitting in my house. It’s panic regulation. That’s the correct word. I’m sitting in my house; anybody can come and fight me here, nothing’s going to happen to me. But here it is: it’s hype, most often. Over the past two years, look at what has happened on the AI landscape. Even I personally have had to go back to school to read AI law, because it’s everywhere, everywhere. Everybody’s crazy and hyping about AI. And that, for me, is actually a red flag, because, as you know, the UN is going to set up an independent AI agency and is setting up a high-level advisory board to work on it. In the past year, I think I’ve read about 50 AI regulatory frameworks, and I’ve co-written one for Africa myself, the Africa AI Blueprint. Everywhere, it’s coming up, it’s coming up. My session yesterday was on data governance as well, and I still advise governments on regulating digital rights and media. So here is what I think: understand before we regulate. How does this thing work? What are the implications? And that is where evidence comes
So I think it’s a power play here and being someone in Africa, I am smiling about it because governments are no longer the biggest stakeholders in the media, global northern media, the biggest media. Now we have media in the hands of private platforms. My last trip was to the Meta Campus in Menlo Park in California and so what, it’s still panic, it’s still panic. We want to regulate because we want to keep our spares of influence, we are regulating because we are afraid, we are regulating because of hype, we are regulating because of electoral deadlines and I think that all of that does not all go well and I think that the work that we’re doing needs to like calm everybody down. What is the evidence seeing? What is he not saying? What should we be afraid of? What should we not be afraid of? So for me, it’s important to understand. Evidence is key. I’ve lived long enough, Michael, and I think you’ve been around, to know that most often we regulate products. We are rushing into, no, this is fake news. Shut down this platform. Then we’ll all be OK. That is wrong. I think that we should be building on principles, and we should be building on fair processes. And that is really very important. We’ve seen big companies. I mean, 20, 25 years ago, it was software. Software was the big deal. Everybody wanted to use open software and proprietary software. And we’re like, no. Don’t stop attacking Microsoft. Let us look at principles of openness and accountability. Let us look at inclusivity. Let us look at these key principles. So let’s not regulate products and companies. Let’s regulate processes. Let’s regulate principles. And let’s be forward-looking. So in dialogue, let me close with that. Dialogue is also very important, and that’s what brings us to the value of IGF and the forum, by the way. That’s where we have this dialogue between private, public, and civil society sectors. Dialogue, I think, is very important. We should learn that. And finally, let’s breathe. 
Let’s just breathe. Let’s just breathe. Panic regulation, we’ve seen that. It takes us nowhere. And let’s breathe. Let’s take a breath, and let’s look at what the evidence is. And that’s why the work we do is here. Thank you.

Michael L. Bąk:
Thank you very much, Nnenna, in the spirit of calming down, finding out what we don’t know, and filling in those gaps. And in the spirit of multi-stakeholderism, Ansgar, being a representative of the private sector, maybe you can share some of your thoughts on the importance of gathering evidence to create better policies that will ultimately benefit us all.

Ansgar Koene:
Sure. So, as Nnenna was alluding to, we are in a moment in which there is a huge rush to come up with the right kinds of regulations, commitments, and guidelines; we’re not even sure yet whether things should be mandatory or not. Different cultures approach this from different points of view. But one thing that is clear is that it doesn’t really matter which of these approaches we choose, and different countries will choose different ones: we also need the tools to check whether they are being implemented correctly, and that means we need methodologies. We need to understand which methodology is actually going to work to assess compliance, be it compliance with regulations, say the Digital Services Act, or compliance with commitments and guidelines. For all of these we need clarity on what the good methodologies are for tracking the ability to implement these kinds of commitments and obligations. That is one of the things a meta-study approach to the existing research is really going to help us understand. By looking at the various methodologies that the research community, including academia, journalists, and work done within companies, has attempted and used, we can see which of these produces the kind of evidence that is most reproducible and most applicable for this kind of assurance process: assessing whether we are achieving the desired outcomes of the new policies being developed to deal with the new challenges from generative AI, the existing challenges from social media platforms, and the old challenges around disinformation that may be driven by particular interest groups, which can be governments, companies, or other groups as well.
But really, we need an understanding of how to assess where things stand so that we can also provide the appropriate kind of recommendations. The kind of meta-study that the observatory is going to be doing will be key to establishing that baseline. Thank you.

Michael L. Bąk:
And I realize that we’re already a minute over, although I don’t think there’s any session after ours because I think we’re one of the last. But I’d like to just give Jeanette a moment to share a thought or two, and then if anyone has any quick questions. But perhaps, Jeanette, if you wanted to share thoughts on how you think the work of the observatory in a year from now, the kind of impact or gap that it’ll fill, and as a tool for moving us further into the future from this long perspective you have of work in the field. Yeah.

Jeanette Hofmann:
Thank you, Michael. I think there is not so much to add after what has been said. Perhaps two things I’d like to emphasize. One is that this focus on platforms and social networks cuts us off from research that was done on very similar questions in the 70s, 80s, and 90s. It’s not a new topic to discuss manipulation and propaganda, right? We were discussing these issues also in the 70s, including the question of how easy it is to manipulate people. Do they actually believe what they listen to or read? That translates into today’s question: when you forward disinformation, do you necessarily believe what you see, or are you rather signalling belonging? Say you forward a message that the US election has been stolen. Do you do this out of loyalty to Trump, or do you believe your message? These questions are very old, and by focusing on platforms, on the one hand we develop new skills, say computational social sciences and the like, but at the same time we sort of forget all the work that has been done on these questions before. So that’s one issue. We should not, as Nnenna pointed out, chase a new pig through the village every year, but really ground our work in previous research and its contentious outcomes. And the second thing is that what is really lacking in digital research is comparative work. I hope that this meta-study will look more systematically at, or also encourage, comparative work, so that we also get data from the many countries in Asia, Latin America, and Africa where not much has been done to gather data. Thank you.

Michael L. Bąk:
Thanks for that. And I have just learned that there’s a question from someone who’s watching us online, so maybe I take that one and then one more, because this is actually the first time I’ve seen a question online during the conference. Please let us know who you are and ask your question. Let me put my glasses on; I can see a hand. Yeah, it’s Deborah. Would you like to ask your question? We need to unmute Deborah. Yeah.

Deborah Allen Rogers:
Thank you. Thank you very much. Oh my god, this has been one of my favorite, favorite panels. And Nnenna, I was thinking so hard to try to come up with a story about the idea of expertise, the bread, the clothes, et cetera. So let me just say who I am real quick, and then I’m going to go to a question on funding. You can see my name, Deborah Allen Rogers. I have a nonprofit in The Hague called Find Out Why, and I think some of you might know who we are now. I’m also from New York City; I was a clothing designer in the 80s and 90s. So when you brought up cotton and made the analogy with clothing, that it’s not one size fits all, that made my heart sing, because we do know that. I don’t know everybody’s age here, but I can tell you I’m in the above-55 crowd. OK, so here’s the reason I’m going to go hard on the funding models. One of the things that I do, besides my nonprofit, on the advisory side, is to really challenge the way we fund research. Why? I got a degree in 2019 in International Studies at Fordham in New York, and the thing that amazed me was how much time we spend in a curriculum studying the past. So I wanted to acknowledge, I’m sorry I can’t remember your name now, but the woman from Germany: I agree with you, we have to look at the past, at the archive. I would look at my notes, but I want to try to get this point across and not take too much time. We would sit in class, me with a lot of the very young, and then me as not very young at all, and we would be reading the old reports and celebrating these minuscule differences. This idea in political science of reading this research and celebrating a 0.003 difference in something is something I find very problematic from a design standpoint. So I want to ask this question: all of us here who have had to deal in the world of funding know that it’s a political system.
And once we get the funding, it is highly political. And if we make some changes in the middle of the funding, we risk losing the funding, right? If we’re on our research path and we realize this isn’t working, we’ve just hit a wall and we need to change it, but the funds are allocated for this one particular path. So my question to everybody here, and I would love for everyone to join in this idea, is about redesigning the funding models to put in a flexibility clause: if, when I’m doing my research, I find I’m hitting a brick wall, I don’t have to proceed on that research path, and I get to keep the funding so that I can go on the new path that I’ve just discovered must be taken. So that’s that. And then, Michael, you made a point about taking time to develop expertise, and this was the thing I was trying to figure out: how could we quantify or frame expertise? Because I work with 15-year-olds that have a lot more expertise in the digital realm than I ever could, and I’m sitting with members of the European Parliament and the U.S. Congress, listening to people who do not have expertise in the digital realm set policy or tell us how the world works, et cetera. And for any of us who are a little bit counterculture, we’ve had the world explained to us many times over the decades, and being a New Yorker, this is something I find very frustrating when I’ve already lived through, you know, 9/11, the AIDS pandemic, the global transition in manufacturing and supply chains. There’s expertise in the room, even if I come from design; I may humbly say that of myself, and sometimes I do try to get a seat. So I think about the redefinition of expertise. I want to hear, Nnenna, I hope you can come up with a fabulous illustration, because five-year-olds have more digital expertise than we do now. It’s phenomenal; we all know it, whoever has seen it. They don’t have business model expertise, my last comment, but do we?
Because a lot of us have grown up in and fed the models that are ruining many, many industries. And we celebrate minuscule differences and archive really fabulous studies because they didn’t make the one tiny little difference needed to get the funding. Okay, so let’s redesign the funding models together. That’s what I really wanted to say. Thank you very much.

Michael L. Bąk:
Thank you, Deborah. We do have to wrap up, so I think Nnenna is set up to answer this very quickly, within a minute. And the organizers are typing that we have to stop.

Nnenna Nwakanma:
Oh, my. So Deborah, it is refreshing hearing from you. Here is what I’ll say. One of the reasons I chose not to go to Kyoto is that there is a time to come and a time to go, and it is only intellectual humility to bow before new expertise. I think that my space in the digital IGF space can be taken by those I’ve trained. I’ve been training people for the past 15 years, and I think that the highest level of leadership is when we raise other leaders to take over from us. It is perfectly okay for other people I have trained to lead IGF processes in my place. It is perfectly okay to tell a 15-year-old, I think you know this better than I do, and let them lead. Great leaders are those who first of all raise other leaders and are humble enough to bow before new leaders. Thank you.

Michael L. Bąk:
Thank you, Nnenna. That is a perfect note to end on. Thank you so much, and thank you, Deborah, for the question. I apologize that we don’t have more time, but feel free to come up after. Thank you, everybody. I appreciate you joining us at the end of a great conference week. Bye-bye. Thank you.

Ansgar Koene

Speech speed: 153 words per minute
Speech length: 899 words
Speech time: 353 secs

Courtney Radsch

Speech speed: 154 words per minute
Speech length: 1913 words
Speech time: 745 secs

Deborah Allen Rogers

Speech speed: 211 words per minute
Speech length: 890 words
Speech time: 253 secs

Jeanette Hofmann

Speech speed: 135 words per minute
Speech length: 661 words
Speech time: 295 secs

Jhala Kakkar

Speech speed: 156 words per minute
Speech length: 1220 words
Speech time: 470 secs

Michael L. Bąk

Speech speed: 156 words per minute
Speech length: 2881 words
Speech time: 1107 secs

Nnenna Nwakanma

Speech speed: 152 words per minute
Speech length: 1762 words
Speech time: 694 secs