Protecting the Vulnerable Online

17 Jan 2024 13:15h - 14:00h

Event report

More than half of internet users worldwide aged 13 and older face at least one potential online threat. Cyberbullying, online harassment, hate speech and misinformation pose significant challenges.

What are the key developments in trust and safety online and what measures should stakeholders embrace to foster a safer internet for all?

Disclaimer: This is not an official record of the WEF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the WEF YouTube channel.

Full session report

Maurice Lévy

The discussion focuses on the vulnerability of individuals and the risks they face when engaging online. It is widely accepted that everyone is susceptible to these risks, from corporations to children to average internet users. The sentiment expressed is primarily negative, highlighting the potential dangers of the online world.

Another concern is the complexity of the online environment and the risks associated with navigating it. The analogy of crossing a street in New York or Paris is used to emphasize that the dangers online are perceived to be even greater. The sentiment remains negative, suggesting that the online world is filled with risks that endanger everyone's safety.

There is also an acknowledgment of the challenges in regulating online content, primarily due to cultural disparities in how freedom of speech is understood. Freedom of speech varies by country, with the United States permitting broader liberties, including calls for terror, than France. The sentiment continues to be negative, highlighting the difficulties of regulating online content amid differing cultural understandings and perceptions of freedom of speech.

The emphasis is placed on the importance of a safe online environment for advertisers, with discussions highlighting the actions taken by top global agencies and advertisers to implement strong rules. These rules aim to prevent ads from being placed alongside violent or terror-related content. Positive sentiment is expressed in this regard, as advertisers strive to ensure their ads are associated only with content acceptable to all audiences.

The role of human beings in shaping the internet as a platform is also addressed. It is noted that the internet was created by humans, who can exhibit both good and bad characteristics and who form its primary user base. The sentiment is neutral, reflecting the recognition that the platform is a product of human actions.

In conclusion, there is a collective understanding that everyone is vulnerable online and that navigating the online world puts everyone's safety at risk. The complexity of regulating online content, driven by cultural disparities in understanding freedom of speech, creates challenges in establishing a safe online environment. Advertisers actively seek a secure environment for their ads, ensuring they are not associated with harmful content. Finally, the internet is shaped by human agency. It is imperative to employ a combination of economic and regulatory measures to guide individuals towards safer internet usage and to address these concerns.

Julie Inman Grant

The discussion centres on the vital issue of online safety and the need to implement regulations and interventions to protect vulnerable communities. One of the challenges highlighted is that the internet is global, while laws are enforced at national and local levels. This disparity creates a regulatory challenge that needs to be addressed.

Notably, marginalized communities are disproportionately targeted online, experiencing higher rates of online hate. Indigenous youth in Australia face three times as much online hate, while young Australians identifying as LGBTQI receive twice as much. Additionally, research shows that one in four people with disabilities have experienced online violence. These statistics underscore the urgent need for targeted measures to combat online hate and protect these vulnerable communities.

Regulators play a crucial role in this landscape, as they are responsible for harnessing the benefits of technology while ensuring the safety and well-being of citizens. Their mandate is to strike a balance that allows for the maximization of the internet's potential while safeguarding users from potential harm.

Prevention, education, and understanding the harm caused are all crucial in creating safer online spaces. The "3P model" of prevention, protection, and responsibility, as mentioned by Julie Inman Grant, serves as a framework for making online spaces safer for all users. This prevention approach involves understanding who is being harmed and how, and taking proactive steps to prevent harm from occurring in the first place.

Furthermore, it is argued that platforms should prioritize user safety from the beginning by designing their products with built-in protection. Grant advocates for a "Safety by Design" approach, likening it to having seatbelts and airbags in cars. This implies that safety measures should be integrated into the initial stages of product development, rather than being added later in response to harm.

Specific regulations are recommended for platforms that fail to remove harmful content. Grant's office, for instance, investigates reports of online harassment when platforms do not take action. Notably, once the office intervenes, platforms voluntarily take the content down in 90% of cases. The focus is on reducing mental distress for users by acting as quickly as possible.

There is a recognition of the power and influence of tech companies in shaping the online landscape. Grant emphasizes the need to counterbalance these influences to ensure online safety. It is remarked that tech companies should prioritize human rights and safety, and regulators and companies must collaborate to achieve this objective.

The issue of sexual extortion online is given significant attention, with reported cases having tripled. Tools such as the Image-based Abuse scheme are being used to combat this problem, along with actions against individuals producing deepfake intimate images. Dealing with the increasing prevalence of sexual extortion is deemed a priority.

The need for online platforms to improve safety standards is underlined. The concept of "safety by design" is discussed in relation to platforms like Snap and Instagram, with a call to harden resources against organized criminals and to take action against fake and imposter accounts.

Transparency is deemed crucial for accountability in online platforms. The usage of transparency powers to understand platform practices, such as scanning for child sexual abuse or terrorist content, is seen as essential. Notably, action has been taken against certain platforms regarding online hate, showing the importance of transparency in holding platforms accountable.

The responsibility of companies for creating a safe online environment is stressed. Criticism is directed towards companies that reduce their safety engineers and content moderators, as well as those that reinstate previously suspended users. The analogy of a road traffic situation is used to emphasize the danger of an unsafe environment.

Moreover, it is argued that safety design standards should be applied in the tech industry, just as they are in sectors like food safety and consumer products. Given that the internet is now an essential utility, safety by design is advocated for in tech development.

Considering all these points, this summary highlights the various aspects of online safety, regulation, and intervention. It emphasizes the need to protect vulnerable communities, the role of regulators, the importance of prevention and education, and the responsibilities of tech companies. It also recognizes the challenges posed by sexual extortion and the significance of transparency in holding platforms accountable. Ultimately, the objective is to create a safer online environment by prioritizing the well-being and safety of users.

Meredith Whittaker

The analysis addresses several concerns surrounding online platforms and tech companies, emphasizing the need to move beyond self-regulation. It argues that there is a lack of collaboration between tech companies and regulators, leaving the latter to manage the negative consequences of toxic business models. The sentiment is negative, highlighting that self-regulation among online platforms is no longer adequate.

Another key issue raised is the problematic nature of the business models based on surveillance advertising and algorithmic amplification employed by tech companies. These models have transformed an internet of small, autonomous communities into global platforms governed by uniform standards, and online harms are being exploited to further political agendas. The sentiment towards these business models is negative.

On a positive note, the analysis supports the idea that autonomous online communities require some form of moderation. The author draws from their experiences with self-moderated Usenet message groups and Google groups, arguing that platforms without moderation enable harmful communication.

Furthermore, the analysis stresses the importance of prioritizing safe communication and avoiding surveillance. Signal, an interpersonal communications app, is cited as an example of a platform that actively protects user privacy and avoids surveillance. The sentiment is positive towards this argument.

The analysis also highlights the potential for regulation to enable innovation that does not rely on surveillance advertising. The General Data Protection Regulation (GDPR) is mentioned as a tool that could be used to prohibit surveillance advertising, and the suggestion is that established protections such as privacy should be given greater priority in innovation.

There are concerns about the lack of strict accountability for Big Tech and about the implications of mass surveillance. The absence of a federal privacy law in the US is pointed out, and calls to scan encrypted content are seen as undermining privacy. A case in which Facebook messages were used as evidence leading to imprisonment is cited as an example. The sentiment is negative, underlining the need for stronger accountability and privacy preservation.

The analysis discusses the UK's online safety bill, expressing concerns about certain clauses. It argues that undermining encryption through these clauses compromises privacy. Signal, a widely used private messaging service, is mentioned as an example of a platform that safeguards privacy through end-to-end encryption. The sentiment is negative, reflecting concern about the potential impact on online safety and privacy.

Lastly, the analysis emphasizes the importance of maintaining the integrity of encryption systems for privacy. The argument is made that adding a scanning system in front of encryption creates significant vulnerabilities. Encryption is described as a binary technology that either works entirely or not at all. The sentiment is positive, underscoring the crucial role of encryption in preserving privacy.
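
To make this architectural point concrete, the minimal Python sketch below is an illustration only, not any real messenger's code: the scanner hook, its policy, the function names, and the use of the `cryptography` library's symmetric Fernet cipher as a stand-in for end-to-end encryption are all assumptions for the sake of the example. It shows why a scanning step placed in front of encryption necessarily handles plaintext, so the guarantees of the encryption itself no longer cover the whole message path.

```python
# Minimal illustrative sketch: a scanning step placed before encryption
# operates on plaintext, no matter how strong the encryption itself is.
# Symmetric Fernet encryption stands in for end-to-end encryption here;
# the scanner hook, its policy, and the function names are hypothetical.
from cryptography.fernet import Fernet


def client_side_scan(plaintext: bytes) -> bool:
    # Hypothetical scanner: by design it must read the unencrypted message,
    # so whoever controls or compromises this code sees every message
    # before any encryption is applied.
    return b"flagged" in plaintext


def send_message(plaintext: bytes, key: bytes) -> bytes:
    if client_side_scan(plaintext):        # runs on plaintext
        raise ValueError("blocked by client-side scanner")
    return Fernet(key).encrypt(plaintext)  # encryption protects only what follows


key = Fernet.generate_key()
ciphertext = send_message(b"hello", key)
print(Fernet(key).decrypt(ciphertext))     # b'hello'
```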

In conclusion, the analysis delves into various concerns related to online platforms and tech companies, calling for collaborative efforts between tech companies and regulators. It criticizes business models based on surveillance advertising and algorithmic amplification, highlights the importance of moderation in autonomous online communities, and advocates for safe communication and privacy preservation. It also emphasizes the need for innovative regulation and the protection of privacy amidst mass surveillance. The analysis warns against undermining encryption technologies and emphasizes their role in safeguarding privacy.

Helena Leurent

The analysis explores various aspects of consumer protection, privacy, and accountability in the digital marketplace. It argues for the need to extend the scope of consumer protection beyond vulnerable groups and emphasises the urgency of addressing consumer protection in the digital marketplace.

One of the key points raised is the notion that consumer law often treats everyone as the average, rational, reasonable person, failing to recognise that anyone can be vulnerable at some point. The analysis suggests that it is essential to understand and acknowledge the vulnerability that can occur in different consumer situations.

Another important aspect discussed is the accountability of businesses and government regarding user data and information. The analysis asserts that both entities should be held responsible for ensuring the privacy, security, and proper handling of user data. It highlights the significance of businesses and government taking action, from providing information to individuals to implementing systems and processes for accountability.

The analysis further advocates for broadening consumer protection beyond vulnerable groups in order to ensure a more comprehensive approach. It mentions discussions at the G7 about data flow and redress measures, which are vital in building trust in the digital marketplace. This suggests that policymakers are recognising the need for a broader focus on consumer protection and are actively working towards implementing measures to address these concerns.

Concerns about consumer anxiety and lack of control in the digital marketplace are also raised. The analysis presents examples, such as marketing studies showing how targeted ads can increase anxiety and instances of consumers with gambling issues receiving ads for casinos. These examples illustrate the lack of control and visibility that consumers may experience in the digital marketplace, which contributes to their anxiety.

The importance of legislation around online safety is highlighted, with only 60% of countries having such legislation in place. This suggests a pressing need for governments to establish and enforce regulations that protect consumers online.

The analysis also emphasises the consumer's need for privacy, which is recognised at the United Nations and should be an integral part of every country's consumer policy. This indicates the growing recognition and importance of safeguarding consumer privacy.

The challenge of managing personalized pricing is discussed, with examples provided of consumers being charged significantly more for the same product. This raises concerns about fair pricing and the transparency of pricing practices in the digital marketplace.

The lack of public trust in enforcement agencies regarding scams is addressed, with scams representing a significant problem amounting to 1% of GDP. The analysis notes that a significant portion of scam victims do not report them because they believe enforcement agencies will not act. This suggests a need for greater public trust and more effective enforcement measures to combat scams.

The analysis concludes by advocating for citizen representation in conversations about the power of tech companies and governments. It asserts that citizens should be at the center of these discussions and that enforcement agencies need to take action. This calls for a shift towards more inclusive and participatory decision-making processes.

Lastly, the analysis emphasises the need for business models that align with a greener, more sustainable future across sectors. It acknowledges that the conversation about sustainable business models is applicable to every sector and imperative for building a more sustainable marketplace.

Overall, the analysis sheds light on various issues surrounding consumer protection, privacy, and accountability in the digital marketplace. It calls for a comprehensive approach to consumer protection, improved privacy measures, enhanced enforcement of regulations, and inclusive decision-making processes. These insights provide valuable considerations for policymakers, businesses, and consumers alike in navigating the digital marketplace and ensuring a fair, secure, and sustainable environment for all.

Shereen Bhan

The analysis reveals several crucial points regarding online safety and regulations. Firstly, it highlights the fact that different jurisdictions have different approaches when it comes to online safety. This variation in approaches emphasizes the need for more collaboration and partnership among regulators, governments, the private sector, and civil society. By working together, these stakeholders can establish common standards and practices to ensure the safety of online users.

Furthermore, the World Economic Forum's risk report identifies disinformation and misinformation as the number one global risk in the next two years. This finding highlights the urgent need to address and combat the spread of false information online. It underscores the importance of implementing effective measures to counter disinformation, such as fact-checking systems and promoting media literacy.

The analysis also addresses the challenge of balancing freedom of speech, privacy, and acceptable behavior online. This issue poses a challenge worldwide, as finding the right balance between these aspects is a complex task. Governments, regulators, and online platforms must navigate this challenge to maintain a safe and open online environment while respecting individuals' privacy and freedom of expression.

One significant concern raised in the analysis is the potential threats posed by artificial intelligence (AI) and digitization. The widespread availability of information related to AI and digitization increases the perceived vulnerability to, and risk of, misinformation, disinformation, and deepfakes. This highlights the need for robust regulations and safeguards to mitigate these risks and protect online users.

Advocacy for the harmonization of rules and regulations across different sectors and jurisdictions is another important point underscored in the analysis. Implementing coherent and coordinated frameworks is vital to ensure adequate protection for online users. Greater cohesiveness and cooperation between various stakeholders are essential to address the challenges posed by the digital landscape effectively.

The analysis also raises concerns about the potential consequences of certain clauses in the online safety bill. These clauses, if enforced, could mandate mass scanning of end-to-end encrypted messaging, possibly compromising individual privacy. This highlights the importance of considering the potential impact of regulations on citizens' rights and privacy.

Another noteworthy aspect of the analysis is the emphasis placed on citizen rights in the conversation surrounding tech regulation. It suggests that the citizen should be the central focus when discussing and developing regulations to ensure that their rights are protected and upheld.

Privacy regulations are being addressed by tech companies, governments, and regulators. However, the analysis emphasizes the need to prioritize the citizen's benefit and well-being in these discussions. Inclusive and comprehensive conversations that involve the perspectives and interests of all stakeholders are essential to create effective privacy regulations.

In conclusion, the analysis provides valuable insights into the complex landscape of online safety and regulation. It highlights the importance of collaboration among stakeholders, the urgency to combat disinformation, the challenges in balancing freedom of speech and privacy, the threats posed by AI and digitization, the need for harmonization of rules and regulations, and the imperative to prioritize citizen rights and the citizen's benefit in conversations around tech regulation and privacy. These points serve as a call to action for policymakers, regulators, and society as a whole to address these issues and work towards a safer and more secure digital environment.

Speakers' statistics

Helena Leurent: speech speed 159 words per minute; speech length 1031 words; speech time 388 secs

Julie Inman Grant: speech speed 171 words per minute; speech length 1848 words; speech time 647 secs

Maurice Lévy: speech speed 124 words per minute; speech length 1311 words; speech time 636 secs

Meredith Whittaker: speech speed 172 words per minute; speech length 1729 words; speech time 602 secs

Shereen Bhan: speech speed 198 words per minute; speech length 1673 words; speech time 507 secs