Balancing act: advocacy with big tech in restrictive regimes | IGF 2023

9 Oct 2023 08:00h - 09:30h UTC

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Elonnai Hickok

The Global Network Initiative (GNI) is a multi-stakeholder platform that aims to promote responsible decision making in the ICT sector. It focuses on how companies respond to government mandates for content removal and access to user information. GNI advocates for strategies such as strategic litigation, petitions, investigations, rankings, and human rights impact assessments to engage with big tech.

One concerning trend highlighted by GNI is the increasing number of government mandates that impose requirements such as quick content removal, local offices, proactive monitoring, data localisation, and a broad scope of companies falling under these rules. GNI’s policy advocacy responds to these developments in countries like Turkey, Vietnam, and Pakistan.

GNI recognises the importance of companies adopting and adapting their policies and processes to meet the challenges of the rapidly evolving digital ecosystem. To ensure accountability, GNI assesses the implementation of its principles in the policies and processes of its member companies. The organisation uses an “improvement over time” standard to measure their progress.

Operating in different jurisdictions is complex, and GNI strives to understand the difficulties faced by companies in various contexts. It helps companies navigate these spaces while striving to protect human rights.

There is a power dynamic between civil society organisations and tech companies, where the latter often hold most of the power and information. GNI has developed a range of tools to assist civil society in engaging effectively with the tech sector. It is running an Action Coalition for Meaningful Transparency to support civil society’s advocacy efforts. Engaging with different types of tech company calls for a focus on different aspects, such as privacy for telecom companies and human rights-respecting community guidelines for social media companies.

To engage with tech companies effectively, civil society needs to understand the ecosystem approach and ask specific, pointed questions. It is crucial to consider the legal environment in which a company operates and engage with relevant departments, such as the legal department, trust and safety, and policy divisions.

Elonnai Hickok, Public Policy Director at GNI, highlights the restrictive provisions in regulations being introduced by countries. However, specific examples or additional information regarding these regulations were not provided.

GNI encourages Russian civil society to engage in multi-stakeholder initiatives in order to share their experiences and engage with companies. This will help foster dialogue and create a better understanding of the ground realities.

In conclusion, the GNI plays a crucial role in bridging the gap between big tech companies and civil society organisations. It promotes responsible decision making in the ICT sector, advocates for human rights, and provides tools and initiatives to assist civil society in engaging effectively with the tech sector. By coordinating their efforts and working collaboratively, civil society stakeholders can have a greater impact in shaping the digital ecosystem and protecting human rights.

Cagatay Pekyorur

META has had a human rights corporate policy in place since March 2021. This policy is guided by the United Nations (UN) Guiding Principles and by META’s Global Network Initiative (GNI) commitments. META demonstrates its commitment to transparency and accountability by publishing annual human rights reports. These reports highlight two significant risks identified by META. The first risk is overbroad government take-down requests, where governments ask for the removal of content beyond what is necessary or justified. The second risk is overbroad or unnecessary government demands for user data, where governments request access to user data that exceeds what is proportionate or required.

To address these risks, META has developed a comprehensive process to assess government take-down requests. This process involves evaluating the legality, legitimacy, necessity, proportionality, and potential external risks associated with the request. META considers factors such as legal review, international human rights standards, and the potential risks to users’ safety and offline harm. By carefully assessing these factors, META aims to ensure that its responses to government take-down requests align with human rights principles and protect the interests of its users.
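The report describes this assessment as a layered sequence of checks. Purely as an illustrative aid, the sketch below models how such a layered review could look in code. This is not META's actual implementation; every name here (TakedownRequest, Decision, assess) is hypothetical, and the boolean inputs are a deliberate simplification of judgments that are in reality made by legal and human rights teams.

```python
# A minimal, hypothetical sketch of the layered review described above.
# Not META's implementation; all names and the decision logic are illustrative.
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    REMOVE_GLOBALLY = auto()   # content violated the platform's own standards
    REJECT = auto()            # request failed legal or human rights review
    RESTRICT_LOCALLY = auto()  # comply only within the requesting jurisdiction

@dataclass
class TakedownRequest:
    violates_community_standards: bool
    valid_under_local_law: bool
    meets_rights_standards: bool        # legality, legitimacy, necessity, proportionality
    external_risk_outweighs_harm: bool  # e.g. platform-wide throttling, staff safety

def assess(request: TakedownRequest) -> Decision:
    # 1. Review against the platform's own community standards first.
    if request.violates_community_standards:
        return Decision.REMOVE_GLOBALLY
    # 2. Independent legal review against local law; the request may be rejected here.
    if not request.valid_under_local_law:
        return Decision.REJECT
    # 3. Human rights assessment against international standards.
    if request.meets_rights_standards:
        return Decision.RESTRICT_LOCALLY
    # 4. Even a rights-failing request may be honoured locally if refusing it
    #    creates greater external risk (assumption drawn from the report's
    #    description of weighing blocking and throttling threats).
    if request.external_risk_outweighs_harm:
        return Decision.RESTRICT_LOCALLY
    return Decision.REJECT

# Example: a legally valid request that fails the rights test but carries a
# credible throttling threat ends in local restriction, not rejection.
req = TakedownRequest(
    violates_community_standards=False,
    valid_under_local_law=True,
    meets_rights_standards=False,
    external_risk_outweighs_harm=True,
)
assert assess(req) is Decision.RESTRICT_LOCALLY
```

The real process, as described in the session, weighs many more factors (regulatory actions, criminal proceedings, employee safety, offline harm) than a single boolean can capture; the sketch only fixes the order of the review stages.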

There have been concerns over amendments made to Turkish legislation, which have expanded the scope of the law and increased the severity of sanctions for non-compliance with government requests. Companies that fail to comply with even a single take-down request, or that do not share user data, may now face significant consequences, such as blockage or throttling of their services, whereas previously non-compliance resulted only in monetary fines. These amendments have raised concerns about censorship, privacy, and freedom of expression.

Civil society plays a vital role in holding companies accountable and influencing public opinion against government actions that may infringe on human rights. They act as advocates for people’s rights and serve as intermediaries between government policies and users’ fundamental rights. Dialogues and collaborations between civil society, platforms, and companies are essential for understanding the regulatory framework and establishing expectations.

Rather than being in opposition, collaboration between companies and civil societies is encouraged to work together. This partnership can promote more effective engagement and address challenges collectively. Cagatay Pekyorur, representing META, expresses willingness to facilitate increased engagement between civil society and their team.

META places significant importance on managing misinformation and disinformation. They have dedicated teams working diligently to identify and address networks that spread false information. META also shares their findings with the general public to promote transparency and raise awareness about these issues.

META promptly takes action in instances of incitement of violence appearing on their platform. They emphasize the importance of freedom of opinion and expression, as well as the right to security. META supports the need to take action when such instances occur to maintain a safe and responsible online environment.

When a government’s demand is not overbroad and aligns with international human rights principles, META complies with requests for user data sharing or content removal as required by local law. This demonstrates their commitment to navigating the complex balance between legal obligations and protecting user rights.

Legislative developments, particularly in the form of cybercrime laws, can present challenges for companies like META. These laws may empower governments to censor content or force platforms to share user data. Such legislation may also target specific groups, such as LGBTQ individuals, further raising concerns about human rights violations.

Addressing challenges posed by an authoritarian regime requires collective effort. Civil society, digital platforms, and companies should collaborate instead of dictating each other’s responsibilities. This collective approach enables a more comprehensive response to protect human rights and challenge abuses of power.

In conclusion, META has implemented a human rights corporate policy guided by the UN Guiding Principles and GNI commitments. They have established a rigorous process to assess government take-down requests and mitigate risks. Concerns have been raised about amendments to Turkish legislation that expand sanctions for non-compliance. Civil society plays a crucial role in holding companies accountable and collaborating with platforms. META is committed to managing misinformation and taking action against incitement of violence. They comply with government demands aligned with international human rights principles. Legislative developments pose challenges, and addressing them requires collective effort.

Sarah Clarke

The analysis delves into Sarah Clarke’s discussion on the significance of advocating for digital rights in hostile environments, wherein big tech companies face pressure from authoritarian regimes. Clarke underscores recent instances in Turkey and Vietnam as examples, where popular platforms like Twitter, Meta (formerly known as Facebook), and YouTube found themselves entangled in a predicament due to government-imposed content blocking. These incidents shed light on the complex challenges faced by these tech companies as they grapple with the dilemma of either complying with restrictive orders or risking potential sanctions.

In Turkey and Vietnam, the bans on platforms and content throttling imposed by the respective governments have put companies such as Twitter, Meta, and YouTube in a difficult position. By complying with the orders, these companies risk being complicit in the suppression of freedom of expression and information. Conversely, defying the restrictive measures could result in severe penalties and sanctions imposed by the authoritarian regimes. Clarke underscores the gravity of this predicament, emphasising the arduous task faced by these tech giants in navigating such scenarios.

Furthermore, the analysis highlights Clarke’s endeavours to encourage discussion and seek solutions to address the challenges faced by digital rights advocates in restrictive regimes. As part of this effort, Clarke has organised a panel discussion involving stakeholders from different regions. The primary objective of the panel is to foster actionable strategies that can effectively tackle the issues at hand. The analysis reveals that the panel discussion includes interactive elements such as workshops and breakout sessions, which promote collaboration and enable fruitful exchanges of ideas and experiences. Through these interactions, Clarke aims to create a platform where participants can explore innovative ways of resolving the complex challenges associated with advocating for digital rights in hostile environments.

The sentiment towards the importance of digital rights advocacy in hostile environments is predominantly negative, as captured by the gravity and complexity of the situation. The analysis suggests that the challenges faced by big tech companies and the potential suppression of freedom of expression reflect the severity of the issue. However, the overall sentiment regarding Clarke’s efforts to encourage discussion and find solutions is positive. The panel discussion she has organised serves as a platform for stakeholders to come together, share knowledge, and collaborate towards addressing the challenges faced by digital rights advocates. This signifies the recognition of the need for collective action and partnerships to effectively tackle these issues.

In conclusion, Sarah Clarke’s discussion on advocating for digital rights in hostile environments sheds light on the complex challenges faced by big tech companies pressured by authoritarian regimes. The instances in Turkey and Vietnam illustrate the difficult decisions these companies must make, as compliance entails potential complicity in suppressing freedom of expression, whereas defiance may result in severe penalties. Clarke’s efforts to encourage discussion and seek solutions through the panel discussion with various stakeholders highlight the importance of collaborative efforts and actionable strategies in addressing the challenges faced by digital rights advocates. Although the sentiment towards the issue is negative due to its gravity and complexity, the sentiment regarding Clarke’s initiatives is positive, indicating the recognition of the significance of these discussions in finding solutions.

Kivilcim Ceren Buken

The analysis focuses on the relationship between online platforms and civil society, exploring the challenges related to human rights, content moderation, and business interests. One argument presented is that platforms are more responsive when requests are presented as potential business opportunities. For instance, an example is provided where Twitter hired Portuguese speakers after realizing the potential business loss during a sports event blockade in Brazil. Similarly, a change in Facebook’s attitude towards Tor when it was blocked in Iran and users accessed Facebook via Tor demonstrates this point.

Another argument highlights the negative consequences of AI-driven content moderation, particularly the removal of violent content that leads to the loss of valuable evidence. The analysis provides an example of an individual tracking bombings in Syria through YouTube videos, which are often removed due to violent content. As a solution, it is suggested to store the removed content for potential use in court cases.

Advocating from a business perspective is considered effective with online platforms, rather than solely focusing on human rights. This approach is supported by examples from Brazil and Iran, where platforms’ success in responding to business opportunities is discussed.

The analysis stresses the importance of collaboration between civil society and online platforms in addressing the various challenges faced. It acknowledges that the challenges encountered by civil society in relation to platforms are not a new phenomenon. Content moderation issues and conflicting business interests are recognized as significant challenges that need to be addressed. Therefore, it argues for civil society and platforms to work together in finding collaborative solutions.

However, it points out that platforms should not bear the burden of addressing challenges related to civil society independently. Collective action is seen as essential, emphasizing the need for collaboration and partnerships between platforms and civil society.

The analysis provides valuable insights regarding the relationship between online platforms and civil society. It showcases the importance of advocating from a business perspective to effectively engage platforms and highlights the potential loss of valuable evidence through AI-driven content moderation. The call for collaboration between civil society and platforms and the significance of collective action in addressing challenges are noteworthy observations. Overall, the analysis underscores the need for a balanced and cooperative approach to ensure the rights and interests of all stakeholders are respected in the online platform ecosystem.

Katia Mierzejewska

Civil society has long been advocating for improved human rights policies from tech companies, expressing their concerns over the years. However, there has been minimal progress on the part of tech companies, leading to frustration and weariness within civil society. This negative sentiment arises from the perceived lack of change and progress in the realm of human rights policies.

One of the main arguments put forth by civil society is that the human rights policies established by tech companies often feel like a PR tool. These policies, according to civil society, fail to effectively address the issues at hand and make a tangible impact. Consequently, civil society is calling for more action and substantive change from tech companies.

Another crucial argument highlights the need for a shift in the business models employed by tech companies. The current focus on data mining and using data for profit is seen as impeding the effectiveness of human rights policies. Civil society advocates for a shift towards business models that prioritize human rights and data protection over profit-making.

Moreover, there is growing concern about how tech platforms can address government-led propaganda and disinformation campaigns. The question arises as to how platforms can effectively react to orchestrated smear campaigns or disinformation potentially backed by governments. This observation further emphasizes the necessity for tech companies to actively address and mitigate the impact of such campaigns.

In conclusion, there is a clear consensus within civil society that tech companies need to demonstrate more significant progress and take a leading role in addressing human rights policies. The current perception is that these policies are inadequate and merely serve as a public relations tool. Additionally, a shift in business models, prioritizing human rights and data protection over data mining and profit-making, is deemed necessary. Lastly, there are concerns about how tech platforms can effectively respond to government-led propaganda and disinformation campaigns. Overall, the sentiment towards tech companies’ handling of human rights issues is negative.

Audience

In discussions surrounding tech companies like META, several key issues have arisen. One major problem is the instability of teams within these companies. In the case of META, the rapid turnover of staff can pose difficulties for civil society organisations striving to engage with them, hindering effective collaboration between tech companies and civil society.

Another significant issue is the language and context barriers in content moderation. The diverse nature of online content necessitates proper moderation to combat disinformation, hate speech, and other harmful material. However, these language and context differences often impede accurate identification and handling of such content.

The situation in Russia regarding big platforms like Meta and Google has escalated beyond previous explanations. Meta has been officially designated an extremist organisation in Russia, resulting in substantial blocks on access to Facebook, Instagram, and Twitter within the country. Despite these restrictions, civil society in Russia persists, with people continuing to use Meta’s platforms, such as Facebook and WhatsApp.

One argument suggests that social media platforms should comply with country-specific laws and regulations. This perspective emphasizes the responsibility of these platforms to abide by the rules of the countries in which they operate. Achieving transparency in negotiations between social media platforms and governments is seen as vital, as the current processes are viewed as lacking in transparency.

Additionally, there is a pressing demand for tech companies to take a more active role in addressing content moderation and transparency issues. The existing mechanisms for engaging with civil society activists are deemed inadequate, and there is a sense of disappointment among civil society regarding the lack of progress made over the years.

To advance the discussion, it is proposed that the focus should shift from repressive regimes to the power and influence of big tech companies. Exclusive framing, which overlooks non-repressive countries, risks narrowing the scope of the discussion and engagement process.

Concerns have been raised about Meta prioritizing revenue over human rights. While these concerns are valid, it is important to acknowledge Meta’s efforts in disrupting a network run by the Ugandan government, showing a level of commitment to human rights. Furthermore, Meta has expressed a willingness to collaborate with civil society organisations to improve and learn. The presence of a senior public policy person within Meta serves as evidence of this intention.

In conclusion, the discussions surrounding tech companies like META underscore several significant challenges and debates. These include team instability, language and context barriers, restrictions on platforms in certain countries, the role and responsibility of social media platforms, the need for improved engagement and transparency, the reframing of discussions around repressive regimes, and concerns about Meta’s focus on revenue over human rights. Despite these challenges, there is hope for progress as Meta demonstrates a willingness to collaborate with civil society and work towards improvement.

Suay Ergin-Boulougouris

The internet regulation landscape in Turkey has become a cause for concern as it poses a threat to freedom of speech. Laws in Turkey have been weaponised against free speech, with social media platforms being increasingly censored over the years. Online platforms have become the primary source of news, making this form of censorship particularly impactful.

Non-compliance with content takedown orders in Turkey now results in heavy penalties for companies. These penalties include fines, bandwidth throttling, and advertising bans. This strict regulatory environment puts additional pressure on companies to comply with content removal requests, which can further restrict freedom of speech.

To address this issue, it is crucial for Big Tech companies to develop contingency strategies that protect access to their platforms during sensitive periods. In the past, YouTube, Twitter, and Facebook have resorted to content restriction in order to avoid being blocked during election periods. It is also important for these companies to actively engage with local NGOs and to conduct human rights due diligence before taking any compliance steps.

Civil society also plays a vital role in advocating for digital rights issues and engaging with governments. Coordinating with the government can help effectively communicate the importance of human rights cases. Promoting public literacy on digital rights is another crucial aspect for civil society. Furthermore, international attention and efforts on digital rights can greatly influence platform decisions.

It is argued that authoritarian actions of governments should be subject to international accountability. The lack of rule of law, independent judiciary, and stifled civil society allow authoritarian states to act with impunity. Therefore, it is suggested that digital rights should be a provision in trade agreements to hold governments accountable.

Suay Ergin-Boulougouris believes that solutions to current challenges already exist within the calls from civil society. Addressing the concerns raised by civil society can help alleviate the dire situation. Despite the restrictive regimes, Ergin-Boulougouris contemplates the possibility of civil society and big tech working together. However, it is emphasised that big tech also needs to be held accountable, and civil society should play a role in monitoring their practices.

In conclusion, Turkey’s internet regulation landscape poses a significant threat to freedom of speech. The heavy penalties imposed on companies for non-compliance with content takedown orders further restrict freedom of expression. It is essential for Big Tech companies to develop strategies to protect access during sensitive periods and engage with local NGOs. Civil society must coordinate and advocate for digital rights issues and engage with governments. Furthermore, there is a need for international accountability for authoritarian actions of governments, with digital rights provisions being incorporated into trade agreements. By addressing these issues and working together, civil society and big tech can play a crucial role in safeguarding freedom of speech.

Trinh Huu Long

The Vietnamese government has introduced legislation to exert control over the internet, forcing foreign services to store user data locally. This move has raised concerns about internet freedom and user privacy. Big tech companies, including Facebook and Google, have largely complied with government requests, compromising the rights of internet users.

One of the main concerns is the government’s high success rate in content removal and user data requests. There is a lack of independent checks on government power, allowing for potential abuse. This raises questions about government influence and the protection of civil liberties.

Civil society organizations are working with tech companies to address these concerns and advocate for human rights. They argue that big tech companies should be proactive in upholding digital and human rights, rather than bowing to authoritarian pressure.

Transparency from tech companies regarding requests for content removal is crucial. The public should have access to information about these requests to assess government censorship and its impact on freedom of expression.

Additionally, there is a need for a fair and efficient appeals process for users affected by content removal decisions. The current process is inadequate, with appeals often being ignored.

In conclusion, big tech companies need to be more accountable and proactive in upholding human rights and protecting internet freedom. Collaborative efforts between civil society organizations and tech companies have shown positive results. However, there is still work to be done to safeguard user rights in the face of increasing government control and censorship. It is crucial for big tech companies to take action and play their part in protecting user rights.

Session transcript

Sarah Clarke:
Good evening, everyone, and thank you very much for coming to this very late-in-the-day event. You’re probably all struggling with jet lag and lots of digital rights topics, but we’re very grateful that you’re joining us for Article 19’s panel on the balancing act that is advocacy with big tech in restrictive regimes. My name is Sarah Clarke, and I’m the director of Article 19 for Europe and Central Asia, and we’re delighted to have some really brilliant participants from across all stakeholder groups to discuss emerging and pressing issues relating to how to conduct advocacy with big tech in authoritarian and restrictive regimes. To my right, we have Cagatay Pekyorur, the human rights policy manager for the Middle East, Africa, and Turkey for META. Online, we have Trinh Huu Long, who is the co-founder and co-director of Legal Initiatives for Vietnam. Also to my right is Suay Ergin-Boulougouris, who’s the program officer for Article 19 Europe and Central Asia, and then we have Elonnai Hickok, the managing director of the Global Network Initiative. As you know, the digital rights landscape is transforming into a battlefield due to the escalation of government-imposed restrictions. Authoritarian regimes are exerting immense control over online content, pushing big tech companies into a dilemma: whether to comply with restrictive orders or face potential throttling and severe sanctions. This becomes particularly problematic in contexts where online platforms serve as the only medium for dissent. Civil society’s push for big tech companies to resist undue government pressure can lead to unintended consequences, such as throttling or even total inaccessibility of these platforms, jeopardising the users and communities that civil society seeks to protect. The gravity and complexity of this issue has been underlined by recent cases which this panel will explore. The pre-election context in Turkey saw big tech platforms such as Twitter, Meta and YouTube caught in a predicament as the government mandated content blocking. Faced with the threat of bandwidth throttling on the eve of the election in Turkey, these platforms opted for content censorship, and in Vietnam, Facebook succumbed to content blocking after being throttled. So today’s panel is going to focus on fostering actionable solutions and strategies to advocate for digital rights amidst hostile environments. Just to give you a little breakdown of how today will work: it is a workshop with an interactive element, so we’re going to begin with an introduction and setting the stage, with interventions from our panelists. We are then going to have a breakout session, and we’re going to move around the room a little bit, with two breakout groups to discuss what you’ve been facing yourselves in terms of advocating for digital rights in restrictive regimes, and then we’ll have reporting back and some final comments from the panel. So to begin, we’re going to start with Suay. Suay, could you elaborate on how the recent changes in Turkey’s internet legislation are influencing both civil society and international tech companies, and why it is important for us to be aware of these changes globally?

Suay Ergin-Boulougouris:
Thank you. Thank you, Sarah. And thank you all for joining us in this critical discussion. It’s wonderful to see everyone here today, and I look forward to a fruitful discussion. As Sarah stated, our workshop today is inspired by the recent censorship on social media platforms on the eve of the elections in Turkey, which exemplifies how big tech can be cornered in restrictive regimes. And I’m very happy to be here today to talk about Turkey’s experience. Far from being just another country wrestling with these issues, Turkey serves as a cautionary tale. Its legislative landscape showcases how laws can be weaponised against free speech, influencing not just Turkey, but setting an alarming precedent for other nations as well. I won’t delve into all the legal complexities, but I will focus on the issue of censorship on social media platforms. As you know, internet regulation in Turkey has become more restrictive over the years, and this can be seen as a response to the increasing importance of online platforms in the country: mainstream media have come more and more under the control of pro-government groups, now at 90%, so that online platforms have become the main source of news and an essential tool for independent media to be able to do their job. Spreading disinformation is now an offence that carries up to a three-year prison sentence in Turkey, and while the operating environment for platforms was already hostile before the latest amendments in 2022, companies now face heavy sanctions, including fines, up to 90% bandwidth throttling, and advertising bans for non-compliance with a single content takedown order. The threats of throttling are not empty ones. It is a chilling reality that can and has impeded both free expression and access to vital information in Turkey. Twitter, for example, was throttled in the aftermath of the devastating earthquakes in February, compromising rescue efforts, as people were mainly using this platform to coordinate rescue operations and to ask for help, sometimes even from under the rubble. Yet hours after meeting with the government, and after a huge public backlash, restrictions were lifted. So these threats are real, immediate, and carry dire consequences: laws have been tightened, fines have been raised, and the threat of throttling is a genuine one. Before the 14 May elections, YouTube, Twitter, and Facebook restricted access to certain content to avoid being blocked on the day of the election; this was stated by both Twitter and Meta in their public statements. So the dilemma here is stark. Should platforms comply and censor, or risk being inaccessible on such an important day? It’s a terrible choice to have to make, and it raises the question of how to counterbalance such governmental overreach. Thanks, Suay. And what strategies do you recommend for fighting back, and how can civil society balance its advocacy?
So some suggestions for big tech would be: first, they need to develop contingency plans to protect access during sensitive periods like elections. Second, as we have always called on big tech to do, they should conduct human rights due diligence before taking any compliance steps, as well as when operating in restrictive environments. Third, big tech needs to engage actively with local NGOs and invite them for consultations. Fourth, they need to coordinate with each other. And finally, the value of transparency cannot be overstated. It can be a game-changer in settings where informal pressures and secret communications are the norm. This means full disclosure of government requests and compliance actions. For example, Twitter’s publication of government communications on the censorship ahead of the elections was a step in the right direction. And for civil society, we need to continue to advocate and coordinate on digital rights issues, because pressure can really influence platform decisions. We in civil society, even in restrictive regimes, also try to coordinate or communicate with governments to convey our asks and to make the human rights case. And what else can be done? Public literacy on digital rights, and bringing international attention to these critical issues. Forums like this one offer the opportunity to align strategies and coordinate our efforts. But even if all these strategies are implemented, can we say that access to the Internet will be ensured and everyone will enjoy their right to free expression online? No, because these are band-aids on symptoms, and we can brainstorm countless solutions for the symptoms, but any progress we make will be temporary and incomplete. To genuinely solve the issue, we need to address the core problems: the lack of rule of law, the lack of an independent judiciary, a stifled civil society, and the absence of international accountability, which allow authoritarian states to act with impunity. So for states, your diplomatic efforts must extend to digital rights, ensuring governments are held to international human rights standards. Digital rights provisions should be included in trade agreements. And last but not least, financial and logistical support for NGOs can go a long way in creating a robust civil society capable of fighting against repression. Thank you.

Sarah Clarke:
Thank you, Suay. Turning then to Çağatay. You’re the Human Rights Policy Manager for the Middle East, Africa, and Turkey for META. In light of Suay’s insights, can you share META’s perspective and experiences in Turkey, in particular during the recent election period?

Cagatay Pekyorur:
Of course, I’ll try my best. First of all, I would like to start by thanking you for inviting me to this panel; genuinely, I’m so happy to be here and to contribute to this very valuable debate that you are leading. And thank you so much for putting everything out there so bluntly. I cannot agree more with literally everything that you said, including how international fora should react to this challenge. I want to start by explaining how we at META approach instances when we need to find a balance between government requests and human rights. We have had our human rights corporate policy since March 2021, which clearly states that META is bound by the United Nations Guiding Principles; it also talks about our GNI commitments, which go back well before 2021, but the policy still underlines them. I think that plays an important role in specific instances, because, as I’m sure many people know, but I want to repeat it, the GNI principles are very clear. They say that ICT companies like us should comply with all applicable laws, including local law, but also respect internationally recognised human rights. And if the two contradict each other, and in many cases we unfortunately do see this, we should avoid, minimise or otherwise address the adverse impact of government demands, laws or regulations, and seek ways to honour the principles of internationally recognised human rights to the greatest extent possible. Even though it’s one big statement, you can see where it directs companies. As part of our corporate human rights policy, we started to publish human rights reports; we just published the second one, for the year 2022. It covers the comprehensive human rights salient risk assessment that we made, again, in 2022. If you dig into it, you can see that it identifies only two instances where we are faced with a government’s direct demands that may impact the human rights of our users. The first is overbroad government take-down requests. The second is overbroad or unnecessary government demands for user data. The first one is very obvious: it relates to freedom of opinion and expression. The second is about the right to privacy, but when we think about the way that governments use this data, it is again related to the voice of our users, because they want to get user data, where it’s an overbroad or unnecessary demand, in some cases to silence political opposition. So I think both are connected with the things that you mentioned for Turkey. And I want to give you some brief information about how we deal with these requests when we receive them from governments, especially take-down requests. You may find more detailed information in our Transparency Center. You can click on any country; at the bottom, there is more information about how we approach this globally. On that page there are many sections, including one called the life cycle of TDRs, which has all the details that I won’t be able to get into today. But very briefly: when we receive a request from a government, or from an institution that has the power to make this type of request, we first review it against our Community Standards. If the content is against our Community Standards, we remove it. If not, we continue the process and conduct a legal review. That legal review is independent.
It’s not only about the procedural legality of the decision; it also assesses whether the request is legal under local law or local legislation. At this stage, we may actually reject the request. If we haven’t rejected it, we move to the human rights assessment that we conduct. There, we have another legality assessment, which also considers international human rights standards. Then we check legitimacy, necessity, and proportionality; these are all part of international human rights standards when we think about freedom of opinion and expression. But we also need to consider external risks, and I think that’s the most important part of this debate. External risks may include several things: our salient human rights risks, but also the risk of blocking or throttling of META’s technologies, penalties that we may face, regulatory actions, criminal proceedings, our employees’ security and safety, and also offline harm that may threaten our users. So we need to take all of these things into consideration before deciding whether we are going to comply or not. And as Suay explained, the Turkish legislation has been in place since 2007 and has been amended several times. With each amendment, the scope of the legislation expanded and the sanctions became more and more severe. At the stage where we are now, if companies don’t comply with one takedown request, we may face 90 per cent blocking or throttling, even for a single piece of content. A similar sanction applies for not sharing user data. In the previous version of the law, the same non-compliance carried just a monetary fine. So these changes in the legislation, as you may imagine, are affecting the way we calculate the external risks as well. That’s a brief introduction, but I can be more specific about the Turkish issue if you want me to. Yes, we would love that.

Sarah Clarke:
And we’d also love to know what you think the opportunities are for civil society to influence measures and policies in authoritarian regimes.

Cagatay Pekyorur:
That’s a very good question. I’m going to look at this as, I’ll call it, an ecosystem; maybe it’s the wrong wording, I’m sorry. But in the most basic structure that we are in, there is the government, there is civil society, and in authoritarian regimes, most of the time, citizens have only social media to express their political opinions. Obviously it’s not just social media in many countries, but in the most authoritarian regimes it is. And in that structure, in my opinion, companies are almost in an intermediary position between the government’s policies and our users’ fundamental rights; that’s what we are trying to intermediate, trying to find a way in between. And of course, in that structure, what civil society does is protect these people’s rights. I think there are many things that you can do here. The most obvious one, and I’m really going to state the obvious, is that you can work towards making us more accountable. But there are many other ways. The second one I can mention is that you can work towards creating a stronger public opinion against these actions of the government, to delegitimise them. Governments do think about whether their actions are seen as legitimate in their country, so they take this into consideration. The third thing that I can suggest, as you mentioned, is having an open dialogue with platforms, explaining what you expect from us and what your red lines are, so that we understand. Because, as we both mentioned, the regulatory framework is getting stricter, and we may not be able to act as we did before, because the sanctions increase the external risk. But it’s still so valuable for us to understand what is most important for civil society. My last suggestion is that, in this structure I mentioned, we don’t have to be face-to-face all the time; we can also be side-by-side, so I think there’s room for a lot more collaboration in many different areas around this. Follow-up, one final question there is: could you share an instance where META has had to comply with or resist a government’s request in Turkey, and what were the factors that you considered? Of course. Let me explain with an example. One day before the election, we received a take-down request from the Turkish authorities, and we complied with it. We actually did live transparency reporting on our action. In it, we explained the number of pieces of content that we were geo-blocking in compliance, mentioned who posted them, and provided a brief explanation of what the content was about: corruption allegations against the government. As for the reasoning that led us to this decision, which I can share with you, we also mentioned it in our transparency reporting. Under the law, if we don’t comply within four hours, they may throttle us, as I mentioned, by 90%. But this is the risk on paper; this is not how we evaluate the risk. We actually try to evaluate the real risk, whether it is really going to happen or not. And the background story here: ahead of the Turkish elections, we had been engaging with civil society for a long time, and because of what happened during the earthquake, the throttling of Twitter, and the amendments to the internet law, there were concerns around throttling during the election period.
And we were also hearing from civil society that our products are most valuable on election day specifically, for the civil society groups working on election integrity-related issues. So we understood that this was a top priority. This content, even though it was related to democratic institutions, was not directly related to civil society’s efforts on election integrity. And it was one day before the election, as I mentioned. So we asked ourselves what was more important there, and we concluded that the throttling risk was real, because it had happened very recently. We also thought that civil society would need us the next day. But because the topic had an importance for democratic institutions, as I mentioned, we chose to do live transparency reporting rather than waiting for our biannual reporting.

Sarah Clarke:
Fascinating, and I remember those days really well. And I remember the feeling the day before the election in Turkey: if we had had a contested election result on a throttled web, just how difficult that would have been. So I think we all remember how fraught that time was. We’re now going to turn to Trinh Huu Long, who’s the co-founder and co-director of Legal Initiatives for Vietnam, and who is joining us online. Thank you, may I ask, can you hear me clearly? Yes, we can hear you. Welcome. That’s great. Welcome, and we’re delighted you can join us. And also a big welcome to everyone who’s listening and watching online. So in Vietnam, you have a different set of challenges from Turkey when it comes to internet freedom. Long, I wonder, could you shed light on the current state of digital rights in Vietnam and how they compare to the broader Southeast Asian context?

Trinh Huu Long:
Yeah, thank you very much for having me. I have a presentation. May I share my screen? Yeah, I think so. Great. I’m finding my screen. Please bear with me. Super. Yeah, we can see it now. Thank you very much for having me. I regret I’m not able to be there with everyone. But talking about big tech and human rights in Vietnam, I have been following this issue for a long time. If you read Freedom House’s Freedom on the Net report, I’m the author of the Vietnam chapter. That is a little bit of my background. Talking about Vietnam and the Internet, I would like to show you these two books. The one on the left is called The Generation F, and by F, the authors mean Facebook. On the right is a book called From Facebook Down to the Streets. These two books were published in 2011 and 2016 to describe an entirely new social movement in Vietnam initiated on Facebook. For at least five years, from 2011 to 2016, Facebook and also Google played a very central role in social mobilization in Vietnam, and they were very good allies, partners, and friends of activists, dissidents, and the social movements in Vietnam. But then, ten years later, up until 2023, this happened: Facebook now keeps a list of Vietnamese communist officials who are immune to criticism on its platform, meaning that we cannot criticize a lot of Vietnamese Communist Party members on Facebook. This was revealed just a few months earlier by the Washington Post. So how did it go from being a friend of the movement to being such a problematic platform for all of us in Vietnam? To understand internet freedom, we need to put it in the political context: Vietnam is an authoritarian country. It is a closed society, with one party ruling the country for over 70 years already. On all kinds of press freedom and general freedom indexes, Vietnam is ranked somewhere near the bottom. In terms of internet freedom, we are rated not free, with a score of only 22 out of 100. On press freedom, the most recent RSF report ranked Vietnam just above two countries: North Korea and China. The way the authoritarian regime works is to control the flow of information, right? The internet broke that control a long time ago, and that was why we were able to have a lot of protest movements, social movements, and election campaigns over the past more than ten years. And this worries the government. Also, Vietnam is a big digital economy that a lot of big tech companies are trying to get into and profit from. We have a billion-dollar market, and it’s the fastest-growing market in Southeast Asia. A very big market like that is dominated by foreign services, Facebook, Google, Netflix, with some very rare exceptions of domestic services such as Zalo. And since the internet has been such a threat, the government has introduced a lot of pieces of legislation to control it. We are not different from the global trends: governments have been following two strategies, criminalization and data localization, using the concepts of internet sovereignty and data sovereignty. The Vietnamese government has been putting a lot of people in prison using the penal code. And I want to shed light on the cybersecurity law of 2018, which, for the first time in the history of Vietnam, forced foreign services to store user data locally and to open local offices or branches in Vietnam.
And foreign services have to open local office of branches in Vietnam. And then The government has been aggressively introducing a lot of laws and regulations to enforce the cybersecurity law. The tendency here is that the laws are all vague and broadly defined, right? So the government can interpret the law and regulations any way they want. And even the fact that there is absolutely no independent oversight in Vietnam, there’s no independent court, no national assembly, the Congress is not independent from the government to really keep the government accountable. The law and regulations can be interpreted in any way. And then another tendency is that the law has been mainly targeting foreign services such as Facebook and Google. The reason is that these services have been dominating the market of Vietnam, right? So they have the biggest influence. Domestic services have been very weak, so they have not been able to have a big market share. Also, domestic companies are absolutely under control of the government. So the main targets of these laws and regulations are Facebook and Google, right? We have, since 2016 or 2017, something happened that the government put a lot of effort in forcing Facebook and Google to comply with their request, and for a request for content removal and user data. And Facebook and Google have been… complying with most of the government requests, with the rates up to 95%, and the government considered this a very successful strategy to force foreign companies to obey with local law. And you can see that among 95% of requests from Vietnam approved by Google, it’s about government criticism. Not only so, Netflix has been removing some, not a lot, but some dramas and movies from their platform under the pressure of the government. So the government, how did they do that? They target advertisers in Vietnam. So they go after the Vietnamese companies who advertise on Facebook and Google, forcing these companies to review their advertisements on Facebook and Google. And under this pressure, of course, Facebook and Google are losing revenue, and they have to do something to get back their customers in Vietnam. And also, the government go after the hardware manufacturers, for example, Samsung or LG, forcing them to remove the YouTube and Netflix buttons from the remote control. And dramatically, Facebook agreed to cancel posts from Vietnam after being mostly blocked in Vietnam for four months back in 2020. This was dramatic. And since then, we have seen a lot more political content on Facebook being removed or blocked. And as I said, they have a list of Vietnamese government officials, a moon from criticism. And then Facebook and Google, they are having servers in Vietnam, but mostly public server. And they have complied with some of the Vietnamese government’s request for user data. These companies have great tolerance of Vietnamese Internet trolls, mostly run by governments, such as the army, the police, and the propaganda departments of Vietnam. And we have seen these three forces manipulating the Internet environment in Vietnam in a massive scale. And they have been very successful in redirecting online conversations from sensitive issues over the past few years. But we have some good, some successful cases of advocacy. So let me show you two cases when we are successful advocates for some meaningful changes. So back in 2018, we had the new cybersecurity law. 
And back then, we petitioned Facebook, asking whether they would store user data in Vietnam and whether they would comply with content removal requests from the Vietnamese government. Then, about a year later, this happened: Facebook and other tech companies actively lobbied the Vietnamese government to remove the data localization requirements from the cybersecurity law. And guess what? Last year, the government issued a decree that effectively removed the hard requirement to store data locally, meaning that foreign services are no longer required to store data locally. But if they don’t comply with the government’s requests on content removal and user data, they will be ordered to store data locally. So it went from a hard requirement to a soft requirement. Case number two was a very successful advocacy effort and investigation by a group of friends of mine: the activist Mai Khoi, who is also a singer, and a group of other advocates. They investigated a government-funded cyber troop on Facebook called E47, a group targeting dissidents. When the investigation was published in The Intercept in December 2020, Facebook quickly took action and removed these accounts from its platform. So these are two very successful cases, and we think that we can find common ground and common interests with foreign technology companies to make the Internet fairer and better. To be honest, I think big tech companies don’t need more lectures or advice telling them to be more friendly to human rights standards. They all know these values. They all know the cost of upholding human rights values in the face of authoritarian regimes around the world. They all know that they can make a lot more money if they comply with these authoritarian and repressive laws and regulations. They know everything. So I think it’s time for them to take action. For a long time, end users and civil society have been sacrificing their digital rights, and Facebook, Google, and other technology companies have been benefiting from that. It’s time to change. And if these companies are serious about upholding human rights, they must do something different. They cannot just remain the same and keep asking civil society organizations to have a dialogue with them. What’s the point of having a dialogue without eventually taking any meaningful action? So these are my recommendations for technology companies, and I look forward to more questions and discussion from the room. Thank you very much for your time.

Sarah Clarke:
Thank you, Long. That was a fascinating presentation. Sorry, checking the sound. There we are. Thank you, Long. It was, I think, particularly interesting to see just how many similarities there are between what’s happening in reality in Vietnam and in Turkey. And a lot of your solutions and recommendations really mirror Suay’s earlier ones in terms of what we’re asking from the tech companies. So I’m going to ask you to stay with us as we continue on. And we’re going to go now to Elonnai Hickok, who’s the managing director of the Global Network Initiative, of which Article 19 is a member. And we’re delighted to be working with you. So Elonnai, as the managing director of GNI, you have a bird’s eye view of the global digital rights landscape. Are there common patterns or challenges that you’re observing across countries when it comes to advocacy with big tech?

Elonnai Hickok:
So I think first, to start, I would pick up a little bit on what the previous speaker was saying. And, oh, can you hear me? There we go. And say, you know, first, there are multiple strategies for engaging with big tech, and we’ve heard some of them on the panel today. There’s strategic litigation, there are petitions, there are investigations, there are rankings, and there are organizations that help companies do human rights impact assessments in specific contexts. And maybe just to start out with a little bit of an explanation of GNI: we are a multi-stakeholder platform working towards responsible decision making in the ICT sector, with a focus on how companies are responding to government mandates for removal of content and access to user information. Core to the work that we do is an accountability mechanism. We bring together platforms, companies, civil society, academics, and investors, and all members have to commit to the GNI principles with respect to freedom of expression and privacy. Our company members go through an assessment process that looks at how they are implementing those principles in their policies and in their processes. And so this can take many different forms. An assessor comes in and produces an assessment report that looks at these policies and processes, as well as case studies: how are they actually implementing them in practice? This assessment report is then discussed by our multi-stakeholder board, and the measurement that is used is improvement over time. And I think that improvement-over-time standard is really important, because we’re in a really rapidly evolving digital ecosystem. And so at GNI, we are looking at how companies are adapting and adopting their processes and policies to meet the challenges, many of which we’ve heard about today. And I think in addition to this accountability mechanism, which is not perfect, but it is a multi-stakeholder accountability mechanism that we do try to implement, we also do work around policy, so consensus-based policy advocacy, where we are responding to developments like those in Turkey, in Vietnam, and in Pakistan. And it’s not just authoritarian regimes where we see concerning trends coming out; in democratic contexts as well, we see a number of concerning trends, such as requirements for tight timeframes for removal of content, requirements for local offices, so that personnel are on the ground, proactive monitoring requirements, data localization, and a broad scope of companies being brought under the ambit of different mandates and licensing regimes. And so all of these we respond to through our policy advocacy. And then we also do a lot of learning, trying to understand the real difficulties that go into operating in these contexts. It’s not simple; it’s not black and white, where you can push back immediately. I think there are a lot of gray areas, and so we have conversations about how companies can navigate these spaces to protect human rights while they’re operating in these jurisdictions.

Sarah Clarke:
Thanks, Elonnai. And have you seen any unique approaches to overcoming the challenges that previous panelists have discussed? And do you have any insights on how advocates in different regions can learn from each other?

Elonnai Hickok:
I mean, like I said, I think there are a number of different approaches that different civil society organizations are taking. From my perspective, there is a need to coordinate amongst civil society organizations to make sure that our asks to companies, and our asks to governments, are not so fragmented that they’re not impactful, while still maintaining our unique positions in the ecosystem. I think there is also a need for capacity building so that civil society organizations can engage with tech companies. There is a very clear power dynamic between civil society and tech companies: often it is the tech companies that have the information, since, obviously, they’re the ones implementing and working in these markets. So for civil society to navigate that and actually be impactful, a lot of capacity building needs to happen. GNI has worked to develop different tools to help civil society engage with the tech sector. For example, we collaborated with BSR to develop a tool called Mapping Human Rights Due Diligence Across the Stack. The understanding there is that, for a rights-respecting ecosystem, we need to be looking at human rights due diligence considerations across companies in the tech stack, asking the right questions, but also understanding the role that different companies play. For a social media company, when you look at freedom of expression, you might be asking questions about their community guidelines: are they in line with international human rights standards? But if you’re looking at a telecom company, you’re going to be really focusing on the policies and processes they have in place to respond to government requests for access to user data or network shutdowns. If it’s a cloud service provider, it’s a very different question. So I think it’s really important to take an ecosystem approach to our understanding of advocacy efforts and the specific questions that we’re asking of companies. We’ve also developed a tool with GPD that provides basic guidance to civil society on engaging with tech companies, starting from understanding the company. There are a lot of different arms within a company that might be important to engage with, like the legal department, trust and safety, and policy; sometimes policy is known as the PR arm of a company, so how do you get past that? Then, do you understand the legal environment the company is operating in, so you can understand what constraints they might be working under and tailor your ask to that? And are you coming with a very specific, constructive ask? And then another thing we try to do: we are collaborating with BrainBox to run something called the Action Coalition for Meaningful Transparency, which is trying to help coordinate the space around transparency. It brings together companies, civil society, and academics, and maps and coordinates the space. So I think there are a lot of different ways to help coordinate civil society input and advocacy with companies, but also on these regulations that are coming out with restrictive provisions.

Sarah Clarke:
We’re going to move now to a slightly interactive part of the event, which was one of the asks from the organisers. I know you are all at the end of a long day, so we’re going to have a quick trial of breaking into two groups. But we’re going to start with some Q&A first. Does anyone in the audience have questions before we break out from the panel? I know we have a couple of questions online. Yes. Could we get a mic? Yes. Thanks very much. Yeah, come on up. Thank you.

Audience:
Hi, everyone. Oh, yeah. It’s on. Hi. Thank you so much for your presentations. My question is about META specifically. I think one of the main difficulties that civil society organisations have in engaging with big tech companies is the people working there and how long they’re present in the space. We have worked through so many different layoffs inside of META, where different groups and specific people working with different human rights delegations are not there anymore. And we saw that a lot with the recent Brazilian election: in 2022, Brazil went through elections, and there was a team working on disinformation, but by the time the 8th of January attacks happened, they weren’t there anymore; it was a different group of people. My question is also, to a certain extent, about context. We’ve had multiple occasions as well where people working on a certain issue, especially around content moderation, weren’t fluent in the language they were working in and lacked certain types of context. So as civil society organisations, and as civil society members, how can we best engage, prove our value, add value, and contribute to a better online ecosystem with big tech when there are those barriers, and those barriers seem not to be a priority inside the organizations? Thank you.

Sarah Clarke:
Thanks, that one’s directly for you, Cagatay. Just before Cagatay answers, are there any other questions? There’s one online, so we’ll start with that one, then we’ll go to you, Alexander, and then we’ll take another one online.

Cagatay Pekyorur:
Okay, for the first question that you raised, I think the only thing that I can say is that I can share my email address with you after this, and I will be happy to try to find the right person that you can engage with. I understand that the person you were engaging with is not with us anymore after the layoffs, but I’m sure we can find people who can continue the process that was there. And about the content moderation question, and how you can actually engage in the most efficient way: I would recommend you to check our Trusted Partner Program, if you have heard of it, or if you haven’t. Are you smiling? If you are part of the Trusted Partner Program, then clearly you are in the best place to be effective on that. But again, if there are specific issues, maybe we can discuss them after this panel and I can try to support. In some cases there might be a need for additional training, or for a direct conversation with the team that is responsible for the program, to increase the engagement between civil society and us. I would be happy to facilitate that.

Sarah Clarke:
Thanks, Cagatay.

Audience:
Good evening, Alexander Savnin from civil society in the Russian Federation. Maybe we are not in the Asian part, but the situation with big platforms like Meta and Google has gone far further in Russia than you explained. Meta is now designated a terrorist organization; Facebook is completely blocked, Instagram is completely blocked, Twitter is completely blocked, and so on. But civil society still exists in the Russian Federation. There are still people living there, people still using Facebook, and even WhatsApp. And even while being designated part of a terrorist organization in Russia, Meta shows signs of cooperation with the government. So my question may be more broad: what do we do if we have fallen into a much more problematic situation than was explained in the Turkish and Vietnamese examples? Maybe not only for Meta, but for a policy person.

Sarah Clarke:
Yeah. Sorry, was the question what civil society should do? Russian civil society, which is already blocked. And, well, maybe a question for the stakeholders other than government.

Elonnai Hickok:
I mean, I think it would be important, for example, to engage in multi-stakeholder initiatives where you can share your experience and then engage with companies to talk about what is happening on the ground. For example, we have a working group on armed conflict and responsible company decision making in times of armed conflict, and there we try to discuss how companies can navigate operating in times of crisis and in conflict zones. Bringing the perspective that Russian civil society might have would, I think, be very valuable. So that might be one approach.

Sarah Clarke:
Thanks, Elonnai. And a question now for Long online. So Long, from a Global South perspective, which is increasingly moving towards the securitization of internet regulation, and in the face of overbroad and vague legislation such as that in Vietnam, what does the policy of platform compliance look like? And, this is also for you, Cagatay, what considerations are taken into account in cases such as takedown orders? I think you’ve talked about that in the context of Turkey, but maybe Long first, if you’re still there.

Trinh Huu Long:
Yes, thank you for the question. I think the other experts have laid out all kinds of recommendations and policies for big tech companies, and I’m not in a good position to repeat all of these good recommendations. I just have one request for big tech companies in terms of content removal: please be more transparent about what kind of requests you are receiving from authoritarian governments, and publicize these requests so the public can see them. Google has done a little bit by publicizing some requests from the Vietnamese government, and we know clearly that these requests are to take down criticism of the government, using all kinds of vague and broadly defined laws and regulations. But those are just a few; the Vietnamese government has sent tens of thousands of requests to Facebook and Google alone. So we need to know what they are, and users need a fair platform, a fair procedure, to make an appeal. For now, platforms are playing it safe. They don’t want to be seen as committing any illegal acts in Vietnam, so they have all kinds of robots and AI playing very safe, and if they suspect anything, they will take it down immediately until users make an appeal. But an appeal process takes a long time, right? And many appeal requests are ignored by platforms. So we think that being more transparent is the answer that would solve a lot of things. And I want to see all kinds of government requests from Vietnam. Of course, there are legitimate reasons not to publicize some of them, all right? But most of them should be publicized. That’s my take. Thank you.

Sarah Clarke:
We have another question in the room, if you’d like to come to the mic. And if you could say who your question’s for, that would be really helpful.

Audience:
Thank you very much. My name is Vadim, and I’m also from Russian Federation civil society, you won’t believe it. And this is, well, not really a question, but a small clarification to what Alexander said. If we go one year back, talking about this situation with Meta, Facebook, and Instagram in Russia, we should recall when it started. It started from the point when Meta allowed hate speech against Russians on its platforms in Russia, and that was the reason for the actions of the government. And, well, it is understandable, because no one wants it if a person on the platform calls on other users to, well, kill other people. So the question Alexander asked was what civil society should do. It’s not a question for civil society; it’s a question for the platforms. What should platforms do? The platforms should follow the rules, the regulations, and the laws of the country where they operate. Thank you.

Sarah Clarke:
Thank you. We have another question online, another one for you, Cagatay, I’m afraid. This time it’s about what you do when one state weaponizes the internet against other states, and where you have digital interference, incitement to violence, and disinformation campaigns from abroad. What can be done in situations like that to avoid internet fragmentation?

Cagatay Pekyorur:
Okay. I will try to give my best answer, but I’ll be frank: I’m taking these questions a little bit out of the scope of this panel, because in my view we are focused here on authoritarian governments and how we are managing their requests. But I think it’s very important for us to be aware of the fact that there are a lot of organizations talking about how we are managing misinformation and disinformation, and it doesn’t sound like they are talking from the perspective of when these are done by authoritarian governments. I did mention our salient human rights risks, and I did focus on two rights there: freedom of opinion and expression, and liberty and security of person, which relate to the issues mentioned there. Incitement to violence is something that we care about deeply and take action on immediately in many different cases, for example. And we are doing it from the perspective that we don’t want people to be victims of disinformation, and we do want to be able to protect people’s lives and security. When it comes to these disinformation operations that may come from abroad: actually, when we talk about the authoritarian regime context, we also see this type of coordinated disinformation effort from within the country, not only from the government but also from local sources. And we do publish reports on this. You can see many instances in several countries where governments are behind some of these networks. And regardless of the source, whether it’s local or coming from outside the country, we share the details of our findings with the general public. We have teams specifically working on finding these networks, who then share our findings with the general public to create some transparency on this, and obviously, when they find a network, they take action. Let me see if I missed part of the question. When it comes to collaborating with competent authorities: when I was talking about overbroad demands of governments, I was also trying to imply that if the demand is not overbroad, we comply with it. When it’s proportionate, when it’s within the lines of international human rights principles, we do comply and provide the user data, or take down or geo-block the content. So in cases where a crime might be involved, if the local law requires us to share the user data, and if the request is also aligned with international principles of human rights, we do actually provide it. So I think we do it. Thank you.

Sarah Clarke:
Thanks, Cagatay. So we’re going to have a little bit of time at the end for questions, but first we’re going to have a quick breakout group, if you still have a little bit of energy left. We’re just going to divide the room in half: people on this side can come up with our colleague Kivilcim here, and people on this side will go down to the back with my colleague Joanna, who’s at the back. We’ve just got two questions; they’re coming up now. What we really want to hear about is, first: what are the most pressing challenges that you’ve encountered or observed when advocating for digital rights in your country or region? And second, how can we fight back: what international platforms or mechanisms can be leveraged to expose and challenge internet restrictions, and how can you work together with tech companies? We’re just going to put these questions up here, but really we want to hear from you about what you’re experiencing in your countries in relation to challenges in your digital rights advocacy and your work with tech companies, and then strategies to fight back. So anyone who’s interested in participating can come up here on this side with Kivilcim, and on that side at the back with Joanna. So, Kivilcim, just one minute. Yeah, great. Okay, so we’re going to have a little feedback and then any final questions, from Joanna and Kivilcim and online. Great. Super. Okay. Is she ready now? Okay. Okay, guys, so we’re going to have our first feedback from our colleague Joanna. Thank you. Yeah, it works. Okay, guys, just if we can quieten down. We’re thrilled you’re all very engaged, but let’s have a little bit of quiet. So in our discussion, it turned out that even if we are from different regions, we have similar challenges as civil society. So what was mentioned was, for example, that… Okay, guys, we really do need some quiet. So we’re just going to listen to Joanna.

Audience:
One example was that negotiations between social media platforms and governments are mostly hidden. There is no transparency, and we don’t know what they are talking about. What we need is that no such hidden meetings with governments should take place: it all needs to be transparent, and civil society needs to be informed about the results of such discussions. Another thing is that the country-level people at social media platforms, those who talk to civil society, those who often have contacts with civil society and activists in very repressive environments, are often the same people who then engage with the government. This leads to concerns among some activists about speaking freely with these representatives of social media platforms, and big tech companies more broadly, because sometimes they seem to be quite close to people from the government. And there are issues with engagement with civil society that have stayed the same for many years: when companies talk to civil society, it’s rather ad hoc, it’s one-sided, and it’s extractive, and there is no transparency about what these tech companies do with the information they receive, which can be really risky, especially when we are talking about repressive regimes. We also talked about trusted partner issues, and that there’s been a report published, based on talking to 24 trusted partners, who shared their frustrations with that process, especially on response times, or even no responses at all. So the conclusion of our discussion was that these mechanisms of engagement with civil society that are in place right now don’t work the way we would like them to, and there is no collective process to make sure they are designed properly. And on the next question, one idea we discussed was that the discussion on repressive regimes perhaps should be framed a bit differently, as a discussion about the power of big tech companies, because if we frame it as limited to repressive regimes, we risk that countries that are not repressive will stay out of the discussion and will not engage. And in general, we need more active engagement from tech companies to resolve issues such as content moderation and lack of transparency, because there is also a feeling in civil society of being a bit tired of the lack of progress over the years: a feeling that we keep having similar meetings and similar discussions, but there is little progress on the side of the tech companies. Thank you.

Sarah Clarke:
Thank you so much, Joanna. We’re just going to go online now to Kasia. Yes, hello. Hello there. Good evening. Hello from Europe. Hello, can you see me? Yes. Can you hear me? We can hear you and see you.

Katia Mierzejewska:
Perfect. Okay, great. Just a short introduction: I’m Kasia Mierzejewska. I’m part of the Europe and Central Asia team at Article 19, and I’m helping today with online moderation. We had a very similar discussion online with participants who are joining remotely, so I can make a nice segue from what Joanna already said. During our conversation, it was said repeatedly that a big responsibility, and a big share of the gravity of this whole discussion, is placed on civil society, whereas civil society has been advocating and setting out its main points of advocacy for a long time. So now it should be the tech companies who genuinely engage and address what civil society has been saying this whole time. It was also mentioned that sometimes it feels like the human rights policies established by tech companies are a PR tool, and that big tech should really do something, take the reins in this discussion and lead it, because civil society has said enough. And as Joanna said before, as a conclusion from one of the on-site discussions, there has been this sense of weariness and frustration: how many more times should civil society keep repeating its cause, which has been put out there very transparently, with a very direct message, conveyed over years, whereas the progress on the side of the companies, of the private sector, has been little over these last years. Two more things I would add. One is about the business model: when a business model is heavily based on data mining, and on using data for profit, including when it comes to advertisement, that keeps impeding the human rights policies that have been established. So there is a need to maybe start looking the other way and move away from this approach, and think about a different business model that would ensure that core human rights needs and values, and the protection of data, are among the priorities for the tech companies. And there was one more point mentioned, which also formulates a sort of question to the representative of Meta here: governments could use the platforms to coordinate harmful smear campaigns or disinformation, so how should platforms react in such situations? Because there is one thing about content moderation and taking down content from individual internet users, but what should the policies and mechanisms be to ensure that propaganda and disinformation orchestrated and led by governments are addressed as well? We’re asking here for any thoughts or feedback from the platforms. So that’s to summarize our discussion online. Thank you.

Sarah Clarke:
Thank you, Katia. And thanks to everyone who participated online, and for those really cogent points. We’ll have Cagatay maybe respond to some of those. And then, just finally, Kivilcim is going to report back. Kivilcim is our program assistant at Article 19, on this group here.

Kivilcim Ceren Buken:
Oh, hello. So in our group, the conversation was very, very interesting, actually. The first point that came up was that the participants in our group have observed that platforms seem to respond much better when requests are presented to them as a business opportunity. For example, there was an example about Brazil. There was a time when Twitter was blocked in Brazil while a big sports event was happening. Because Twitter was blocked during this big sports event, Twitter actually lost a lot of money, and that is when they realized that they needed to hire some Portuguese speakers. That was the first time they hired Portuguese speakers, is what one of our participants said. Another example came from a participant who works for Tor. They said Facebook used to hate Tor, because Facebook’s business model is based on watching what people do on their platforms, but then when Iran blocked Facebook and people were using Tor to access Facebook, Facebook started thinking better of Tor. So what was being said was that instead of saying human rights, do this for human rights, we could actually say: this is good for your business, you could make money out of this. That is a way our participants found works well with the platforms. It’s a bit sad, but I have said what everyone said, with no self-censorship. The second issue that came up was from a participant in the group who used to work on reporting where bombings happened in Syria. What they did was look at videos on YouTube, because when there are bombings in Syria, people record those bombings and put them on YouTube, and that is really good evidence of where there was a bombing, who did it, and what happened. But what happens is that the AI just removes these videos because of the violence, and really good evidence is being lost forever. So what our participants said was: of course they can remove it, because it’s bad for their users, they should remove it, but can they keep it somewhere, at least for a court case, for when it’s needed? So these were the two big issues that came up in our group.

Sarah Clarke:
Thanks, Kivilcim. So I wonder if panelists would like to respond to any of the points that were raised. I know we’re well over time, but Cagatay, would you?

Cagatay Pekyorur:
Instead of responding to them one by one, I actually want to share some general reflections. None of the things raised in these comments are new to me. And I think, and this is my personal view, the challenge that we are trying to discuss here is slightly different from the general challenges that civil society may see in relation to platforms. Content moderation, business interests, how to better engage with civil society: these are definitely part of what we are trying to solve. But I think for this one specifically, civil society and platforms should think together, rather than saying, this is something that you need to deal with, you have to deal with it by yourself.

Because I don’t think any of us alone has enough power to bring about meaningful change when we are faced with an authoritarian regime. And the problem is not that small. We see a big wave of legislative developments which will help these governments censor content or which will force platforms to share user data. Most of the time they come in the form of cybercrime legislation, sometimes targeting specific groups, like LGBTQ groups, where we see a lot of legislation, and sometimes more general. And I think if we are concerned about the human rights risks that may arise from these legislations, we should focus specifically on that topic, and specifically on how we can resolve it together.

Suay Ergin-Boulougouris:
OK. Not as a response to your contribution, but I actually wanted to echo what Katia said, or reported back from the online group. As we came up with this workshop idea after the censorship during the elections in Turkey, I was thinking as I was preparing for the workshop: but what is the solution? And we don’t need to reinvent the wheel. As it has been discussed in the groups as well, I think if the calls that civil society has been making for a while now were addressed, at least we wouldn’t be dealing with such big challenges, maybe. I’m not sure, because I still think the core of the issue is the lack of rule of law, and so on. But I agree, especially with you, that in restrictive regimes civil society and big tech may work together, even though you are basically caught between a rock and a hard place, like in Turkey, for example. But it should not impede, I think, civil society’s obligation to continue monitoring and holding accountable, not you personally, but big tech practices in these countries.

Sarah Clarke:
Thanks, Suay. Elonnai, any final remarks from you?

Elonnai Hickok:
Just very quickly, after listening to the discussions from the breakout rooms, I would come back to the point I made earlier about the importance of coordination and finding ways to achieve it. I don’t know if it’s around coordinating on topic, on approach, on region, on asks, or on best practices, but how can we as civil society start to be more coordinated? I think that’s important not just to be effective with our asks of companies and regulators, but also just to stay on top of this space, because it’s evolving incredibly rapidly, and I don’t think it works for us to be working in silos and trying to respond in an ad hoc manner.

Sarah Clarke:
Thank you, Elonnai. And Long, I know that we’ve been told that our time is absolutely up. Oh, Kojo, would you? Very brief. We’re going to be cut off.

Audience:
Sorry. I’ll try and keep it super brief. Hi. Evening, everyone. My name is Kojo Bwachi. I’m the Vice President of Public Policy for Africa, the Middle East, and Turkey with META. I’m slightly surprised and a little bit disappointed by some of what I’ve heard, not because it might be deemed inaccurate. I’m disappointed by one thing I heard, that your contact with us has suffered after the layoffs, such that you don’t have people to speak to when you turn around. Sorry? How are you surprised by that? You weren’t there. I didn’t say surprised, I said disappointed. I think if you lay off 21,000 people there are going to be issues, so it’s not a surprise, but I am disappointed. I’m also slightly disappointed by the consistent talk about Meta prioritizing money over everything else, which I’ve heard not only in this room but in a number of other rooms. Part of my disappointment stems from the fact that it’s not a characteristic or action of a company that I recognize, and I’ve worked here for seven years; that’s not to dispute what you have said. I just wanted to ask if anyone knows a country in my region that is blocked at the moment, where the internet is blocked. And I say this because I know no one from that country is in this room at this point in time. Does anyone know? No. In Uganda, the internet, or at least Meta, remains blocked, because we divulged the fact that we disrupted a network run by, at least, the digital office within the Ugandan government. That’s a country where we’ve made telling investments, not only in the provision of services but also in the establishment of backbone open access infrastructure. It’s a country that was and remains extremely important to us. So whereas in all things we may have had some failings, and I would like to think we want to work with you to correct those, I do want to say that, at least in the region I have responsibility for, and I hope in much of the world where Meta works, this idea that revenue is a priority over all the human rights we try to protect is something I’d like to push back on slightly. I also realize it’s hard to push back in this room, so I’m open to a conversation with anybody after this meeting, or in continuation. I also think the fact that we have 20 people here, including our most senior public policy person, Nick Clegg, is some evidence of our willingness to engage and be beaten about a little bit, but also to learn from you. And this idea of continuing to work to improve and get better, and to work with civil society, is something we hold very, very dear. So I don’t want to take too much time, I know I should be short, but I’m open to a conversation to learn and hopefully improve and do better where we need to. Thanks so much.

Sarah Clarke:
Thanks, Kojo. Long, I don’t know if you’re still there, but if you have any final remarks before we finish; I know we’re over time.

Trinh Huu Long:
Yeah, thank you. I am very grateful. I am confident that civil society organizations from Vietnam will keep working and collaborating with the tech companies and other partners to address all the fundamental issues we have talked about. And we will keep finding common ground and common interests between civil society and the private sector, and work something out. But also, we have done everything we could. We have said everything we have to say. It’s time for big tech companies to really do something about it. It is your turn now. We have sacrificed too much over the past 10, 15 years. Thank you.

Sarah Clarke:
Thank you very much, Long. So, just to quickly conclude: thanks to you, the audience, for staying so long and for engaging so passionately and brilliantly in the discussion. And a huge thanks to our panelists, to Cagatay, to Suay, to Long, and to Elonnai, to Katia and our colleagues online, and to the wonderful tech and support team for going almost half an hour over. We really appreciate it. Have a great rest of your IGF, and please come and talk to us about all the things we’ve started talking about today.

Speakers’ statistics (speech speed; speech length; speech time)

Katia Mierzejewska: 148 words per minute; 608 words; 246 secs
Audience: 147 words per minute; 1666 words; 680 secs
Cagatay Pekyorur: 155 words per minute; 2813 words; 1091 secs
Elonnai Hickok: 156 words per minute; 1418 words; 544 secs
Kivilcim Ceren Buken: 153 words per minute; 548 words; 216 secs
Sarah Clarke: 146 words per minute; 2273 words; 936 secs
Suay Ergin-Boulougouris: 151 words per minute; 1243 words; 495 secs
Trinh Huu Long: 127 words per minute; 2205 words; 1045 secs