Balancing act: advocacy with big tech in restrictive regimes | IGF 2023

9 Oct 2023 08:00h - 09:30h UTC

Event report

Speakers and Moderators

Speakers:
  • Suay Ergin-Boulougouris, Civil Society, Western European and Others Group (WEOG)
  • Elonnai Hickok, Civil Society, Western European and Others Group (WEOG)
  • Trinh Huu Long, Civil Society, Asia-Pacific Group
  • Cagatay Pekyorur, Private Sector, Western European and Others Group (WEOG)
Moderators:
  • Sarah Clarke, Civil Society, Western European and Others Group (WEOG)

Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.

Session report

Elonnai Hickok

The Global Network Initiative (GNI) is a multi-stakeholder platform that aims to promote responsible decision making in the ICT sector. It focuses on how companies respond to government mandates for content removal and access to user information. GNI advocates for strategies such as strategic litigation, petitions, investigations, rankings, and human rights impact assessments to engage with big tech.

One concerning trend highlighted by GNI is the increasing number of government mandates imposing requirements such as rapid content removal, local offices, proactive monitoring, data localisation, and a broad scope of covered companies. GNI’s policy advocacy responds to these developments in countries such as Turkey, Vietnam, and Pakistan.

GNI recognises the importance of companies adopting and adapting their policies and processes to meet the challenges of the rapidly evolving digital ecosystem. To ensure accountability, GNI assesses the implementation of its principles in the policies and processes of its member companies. The organisation uses an “improvement over time” standard to measure their progress.

Operating in different jurisdictions is complex, and GNI strives to understand the difficulties faced by companies in various contexts. It helps companies navigate these spaces while striving to protect human rights.

There is a power dynamic between civil society organisations and tech companies, with the latter often holding most of the power and information. GNI has developed a range of tools to help civil society engage effectively with the tech sector, and it runs an Action Coalition for Meaningful Transparency to support civil society’s advocacy efforts. Engaging different tech companies calls for different points of focus, such as privacy for telecom companies and human rights-respecting community guidelines for social media companies.

To engage with tech companies effectively, civil society needs to understand the ecosystem approach and ask specific, pointed questions. It is crucial to consider the legal environment in which a company operates and engage with relevant departments, such as the legal department, trust and safety, and policy divisions.

Elonnai Hickok, Public Policy Director at GNI, highlights the restrictive provisions in regulations being introduced by countries. However, specific examples or additional information regarding these regulations were not provided.

GNI encourages Russian civil society to engage in multi-stakeholder initiatives in order to share their experiences and engage with companies. This will help foster dialogue and create a better understanding of the ground realities.

In conclusion, the GNI plays a crucial role in bridging the gap between big tech companies and civil society organisations. It promotes responsible decision making in the ICT sector, advocates for human rights, and provides tools and initiatives to assist civil society in engaging effectively with the tech sector. By coordinating their efforts and working collaboratively, civil society stakeholders can have a greater impact in shaping the digital ecosystem and protecting human rights.

Cagatay Pekyorur

Meta has implemented a human rights corporate policy since March 2021. This policy is guided by United Nations (UN) principles and GNI commitments. Meta demonstrates its commitment to transparency and accountability by publishing annual human rights reports. These reports highlight two significant risks identified by Meta. The first is overbroad government take-down requests, where governments ask for the removal of content beyond what is necessary or justified. The second is overbroad or unnecessary government demands for user data, where governments request access to user data exceeding what is proportionate or required.

To address these risks, Meta has developed a comprehensive process for assessing government take-down requests. This process evaluates the legality, legitimacy, necessity, and proportionality of each request, along with any potential external risks. Meta considers factors such as legal review, international human rights standards, and potential risks to users’ safety and offline harm. By carefully assessing these factors, Meta aims to ensure that its responses to government take-down requests align with human rights principles and protect the interests of its users.

There have been concerns over amendments made to Turkish legislation, which have expanded the scope and severity of sanctions against non-compliance with government requests. Companies that fail to comply with a single take-down request or do not share user data may face significant consequences, such as blockage or throttling of their services. Previously, non-compliance resulted in monetary fines. These amendments have raised concerns about censorship, privacy, and freedom of expression.

Civil society plays a vital role in holding companies accountable and influencing public opinion against government actions that may infringe on human rights. They act as advocates for people’s rights and serve as intermediaries between government policies and users’ fundamental rights. Dialogues and collaborations between civil society, platforms, and companies are essential for understanding the regulatory framework and establishing expectations.

Rather than standing in opposition, companies and civil society are encouraged to work together. This partnership can promote more effective engagement and address challenges collectively. Cagatay Pekyorur, representing Meta, expresses willingness to facilitate increased engagement between civil society and his team.

Meta places significant importance on managing misinformation and disinformation. It has dedicated teams working to identify and address networks that spread false information, and it shares its findings with the general public to promote transparency and raise awareness about these issues.

Meta acts promptly on instances of incitement to violence appearing on its platform. It emphasises the importance of freedom of opinion and expression, as well as the right to security, and supports taking action when such instances occur to maintain a safe and responsible online environment.

When a government’s demand is not overbroad and aligns with international human rights principles, Meta complies with requests for user data sharing or content removal as required by local law. This demonstrates its commitment to navigating the complex balance between legal obligations and protecting user rights.

Legislative developments, particularly cybercrime laws, can present challenges for companies like Meta. These laws may empower governments to censor content or force platforms to share user data. Such legislation may also target specific groups, such as LGBTQ individuals, raising further concerns about human rights violations.

Addressing challenges posed by an authoritarian regime requires collective effort. Civil society, digital platforms, and companies should collaborate instead of dictating each other’s responsibilities. This collective approach enables a more comprehensive response to protect human rights and challenge abuses of power.

In conclusion, Meta has implemented a human rights corporate policy guided by UN principles and GNI commitments, and has established a rigorous process to assess government take-down requests and mitigate risks. Concerns have been raised about amendments to Turkish legislation that expand sanctions for non-compliance. Civil society plays a crucial role in holding companies accountable and collaborating with platforms. Meta is committed to managing misinformation and taking action against incitement to violence, and it complies with government demands that align with international human rights principles. Legislative developments pose challenges, and addressing them requires collective effort.

Sarah Clarke

The analysis delves into Sarah Clarke’s discussion on the significance of advocating for digital rights in hostile environments, wherein big tech companies face pressure from authoritarian regimes. Clarke underscores recent instances in Turkey and Vietnam as examples, where popular platforms like Twitter, Meta (formerly known as Facebook), and YouTube found themselves entangled in a predicament due to government-imposed content blocking. These incidents shed light on the complex challenges faced by these tech companies as they grapple with the dilemma of either complying with restrictive orders or risking potential sanctions.

In Turkey and Vietnam, the bans on platforms and content throttling imposed by the respective governments have put companies such as Twitter, Meta, and YouTube in a difficult position. By complying with the orders, these companies risk being complicit in the suppression of freedom of expression and information. Conversely, defying the restrictive measures could result in severe penalties and sanctions imposed by the authoritarian regimes. Clarke underscores the gravity of this predicament, emphasising the arduous task faced by these tech giants in navigating such scenarios.

Furthermore, the analysis highlights Clarke’s endeavours to encourage discussion and seek solutions to address the challenges faced by digital rights advocates in restrictive regimes. As part of this effort, Clarke has organised a panel discussion involving stakeholders from different regions. The primary objective of the panel is to foster actionable strategies that can effectively tackle the issues at hand. The analysis reveals that the panel discussion includes interactive elements such as workshops and breakout sessions, which promote collaboration and enable fruitful exchanges of ideas and experiences. Through these interactions, Clarke aims to create a platform where participants can explore innovative ways of resolving the complex challenges associated with advocating for digital rights in hostile environments.

The sentiment surrounding digital rights advocacy in hostile environments is predominantly negative, reflecting the gravity and complexity of the situation: big tech companies face difficult trade-offs, and freedom of expression risks being suppressed. The sentiment regarding Clarke’s efforts to encourage discussion and find solutions, however, is positive. The panel she has organised serves as a platform for stakeholders to come together, share knowledge, and collaborate on these challenges, signalling recognition that collective action and partnerships are needed to tackle them effectively.

In conclusion, Sarah Clarke’s discussion on advocating for digital rights in hostile environments sheds light on the complex challenges faced by big tech companies pressured by authoritarian regimes. The instances in Turkey and Vietnam illustrate the difficult decisions these companies must make: compliance entails potential complicity in suppressing freedom of expression, whereas defiance may result in severe penalties. Clarke’s panel discussion with stakeholders from various regions highlights the importance of collaborative efforts and actionable strategies in addressing the challenges faced by digital rights advocates.

Kivilcim Ceren Buken

The analysis focuses on the relationship between online platforms and civil society, exploring challenges related to human rights, content moderation, and business interests. One argument presented is that platforms are more responsive when requests are framed as business opportunities. For instance, Twitter hired Portuguese speakers after realising the potential business loss during a sports-event blockade in Brazil. Similarly, Facebook’s attitude towards Tor changed when Facebook was blocked in Iran and users began accessing it via Tor.

Another argument highlights the negative consequences of AI-driven content moderation, particularly the removal of violent content that leads to the loss of valuable evidence. The analysis provides an example of an individual tracking bombings in Syria through YouTube videos, which are often removed due to violent content. As a solution, it is suggested to store the removed content for potential use in court cases.

Advocating from a business perspective is considered effective with online platforms, rather than solely focusing on human rights. This approach is supported by examples from Brazil and Iran, where platforms’ success in responding to business opportunities is discussed.

The analysis stresses the importance of collaboration between civil society and online platforms in addressing the various challenges faced. It acknowledges that the challenges encountered by civil society in relation to platforms are not a new phenomenon. Content moderation issues and conflicting business interests are recognized as significant challenges that need to be addressed. Therefore, it argues for civil society and platforms to work together in finding collaborative solutions.

However, it points out that platforms should not bear the burden of addressing challenges related to civil society independently. Collective action is seen as essential, emphasizing the need for collaboration and partnerships between platforms and civil society.

The analysis provides valuable insights regarding the relationship between online platforms and civil society. It showcases the importance of advocating from a business perspective to effectively engage platforms and highlights the potential loss of valuable evidence through AI-driven content moderation. The call for collaboration between civil society and platforms and the significance of collective action in addressing challenges are noteworthy observations. Overall, the analysis underscores the need for a balanced and cooperative approach to ensure the rights and interests of all stakeholders are respected in the online platform ecosystem.

Katia Mierzejewska

Civil society has long been advocating for improved human rights policies from tech companies, expressing their concerns over the years. However, there has been minimal progress on the part of tech companies, leading to frustration and weariness within civil society. This negative sentiment arises from the perceived lack of change and progress in the realm of human rights policies.

One of the main arguments put forth by civil society is that the human rights policies established by tech companies often feel like a PR tool. These policies, according to civil society, fail to effectively address the issues at hand and make a tangible impact. Consequently, civil society is calling for more action and substantive change from tech companies.

Another crucial argument highlights the need for a shift in the business models employed by tech companies. The current focus on data mining and using data for profit is seen as impeding the effectiveness of human rights policies. Civil society advocates for a shift towards business models that prioritize human rights and data protection over profit-making.

Moreover, there is growing concern about how tech platforms can address government-led propaganda and disinformation campaigns. The question arises as to how platforms can effectively react to orchestrated smear campaigns or disinformation potentially backed by governments. This observation further emphasizes the necessity for tech companies to actively address and mitigate the impact of such campaigns.

In conclusion, there is a clear consensus within civil society that tech companies need to demonstrate more significant progress and take a leading role in addressing human rights policies. The current perception is that these policies are inadequate and merely serve as a public relations tool. Additionally, a shift in business models, prioritizing human rights and data protection over data mining and profit-making, is deemed necessary. Lastly, there are concerns about how tech platforms can effectively respond to government-led propaganda and disinformation campaigns. Overall, the sentiment towards tech companies’ handling of human rights issues is negative.

Audience

In discussions surrounding tech companies like Meta, several key issues have arisen. One major problem is the instability of teams within these companies. In the case of Meta, rapid staff turnover can make it difficult for civil society organisations to engage with the company, hindering effective collaboration between tech companies and civil society.

Another significant issue is the language and context barriers in content moderation. The diverse nature of online content necessitates proper moderation to combat disinformation, hate speech, and other harmful material. However, these language and context differences often impede accurate identification and handling of such content.

The situation in Russia regarding big platforms like Meta and Google has escalated beyond what earlier discussions described. Meta is now officially designated a terrorist organisation in Russia, resulting in substantial blocks on access to Facebook, Instagram, and Twitter within the country. Despite these restrictions, civil society in Russia persists, with people continuing to use Meta platforms such as Facebook and WhatsApp.

One argument suggests that social media platforms should comply with country-specific laws and regulations. This perspective emphasizes the responsibility of these platforms to abide by the rules of the countries in which they operate. Achieving transparency in negotiations between social media platforms and governments is seen as vital, as the current processes are viewed as lacking in transparency.

Additionally, there is a pressing demand for tech companies to take a more active role in addressing content moderation and transparency issues. The existing mechanisms for engaging with civil society activists are deemed inadequate, and there is a sense of disappointment among civil society regarding the lack of progress made over the years.

To advance the discussion, it is proposed that the focus should shift from repressive regimes to the power and influence of big tech companies. Framing the discussion exclusively around repressive regimes overlooks problems in non-repressive countries and risks narrowing the scope of the discussion and engagement process.

Concerns have been raised about Meta prioritising revenue over human rights. While these concerns are valid, it is important to acknowledge Meta’s efforts in disrupting a network run by the Ugandan government, which shows a level of commitment to human rights. Furthermore, Meta has expressed a willingness to collaborate with civil society organisations to improve and learn, and the presence of a senior Meta public policy representative at the session serves as evidence of this intention.

In conclusion, the discussions surrounding tech companies like Meta underscore several significant challenges and debates. These include team instability, language and context barriers, restrictions on platforms in certain countries, the role and responsibility of social media platforms, the need for improved engagement and transparency, the reframing of discussions around repressive regimes, and concerns about Meta’s focus on revenue over human rights. Despite these challenges, there is hope for progress as Meta demonstrates a willingness to collaborate with civil society and work towards improvement.

Suay Ergin-Boulougouris

The internet regulation landscape in Turkey has become a cause for concern as it poses a threat to freedom of speech. Laws in Turkey have been weaponised against free speech, with social media platforms being increasingly censored over the years. Online platforms have become the primary source of news, making this form of censorship particularly impactful.

Non-compliance with content takedown orders in Turkey now results in heavy penalties for companies. These penalties include fines, bandwidth throttling, and advertising bans. This strict regulatory environment puts additional pressure on companies to comply with content removal requests, which can further restrict freedom of speech.

To address this issue, it is crucial for Big Tech companies to develop contingency strategies that protect access to their platforms during sensitive periods. YouTube, Twitter, and Facebook have resorted to content restriction in order to avoid being blocked during election periods. Additionally, it is important for these companies to actively engage with local NGOs and conduct human rights due diligence before taking any compliance steps.

Civil society also plays a vital role in advocating for digital rights issues and engaging with governments. Coordinating with the government can help effectively communicate the importance of human rights cases. Promoting public literacy on digital rights is another crucial aspect for civil society. Furthermore, international attention and efforts on digital rights can greatly influence platform decisions.

It is argued that authoritarian actions of governments should be subject to international accountability. The lack of rule of law, independent judiciary, and stifled civil society allow authoritarian states to act with impunity. Therefore, it is suggested that digital rights should be a provision in trade agreements to hold governments accountable.

Suay Ergin-Boulougouris believes that solutions to current challenges already exist within the calls from civil society. Addressing the concerns raised by civil society can help alleviate the dire situation. Despite the restrictive regimes, Ergin-Boulougouris contemplates the possibility of civil society and big tech working together. However, it is emphasised that big tech also needs to be held accountable, and civil society should play a role in monitoring their practices.

In conclusion, Turkey’s internet regulation landscape poses a significant threat to freedom of speech. The heavy penalties imposed on companies for non-compliance with content takedown orders further restrict freedom of expression. It is essential for Big Tech companies to develop strategies to protect access during sensitive periods and engage with local NGOs. Civil society must coordinate and advocate for digital rights issues and engage with governments. Furthermore, there is a need for international accountability for authoritarian actions of governments, with digital rights provisions being incorporated into trade agreements. By addressing these issues and working together, civil society and big tech can play a crucial role in safeguarding freedom of speech.

Trinh Huu Long

The Vietnamese government has introduced legislation to exert control over the internet, forcing foreign services to store user data locally. This move has raised concerns about internet freedom and user privacy. Big tech companies, including Facebook and Google, have largely complied with government requests, compromising the rights of internet users.

One of the main concerns is the government’s high success rate in content removal and user data requests. There is a lack of independent checks on government power, allowing for potential abuse. This raises questions about government influence and the protection of civil liberties.

Civil society organizations are working with tech companies to address these concerns and advocate for human rights. They argue that big tech companies should be proactive in upholding digital and human rights, rather than bowing to authoritarian pressure.

Transparency from tech companies regarding requests for content removal is crucial. The public should have access to information about these requests to assess government censorship and its impact on freedom of expression.

Additionally, there is a need for a fair and efficient appeals process for users affected by content removal decisions. The current process is inadequate, with appeals often being ignored.

In conclusion, big tech companies need to be more accountable and proactive in upholding human rights and protecting internet freedom. Collaborative efforts between civil society organizations and tech companies have shown positive results. However, there is still work to be done to safeguard user rights in the face of increasing government control and censorship. It is crucial for big tech companies to take action and play their part in protecting user rights.
