EU launches investigation into Facebook and Instagram over child safety

EU regulators announced on Thursday that Meta Platforms’ social media services, Facebook and Instagram, will be investigated for potential violations of the EU’s online content rules on child safety, an inquiry that could result in significant fines. The scrutiny follows the EU’s implementation of the Digital Services Act (DSA) last year, which places greater responsibility on tech companies to address illegal and harmful content on their platforms.

The European Commission has expressed concerns that Facebook and Instagram have not adequately addressed risks to children, prompting an in-depth investigation. Issues highlighted include the potential for the platforms’ systems and algorithms to promote behavioural addictions among children and facilitate access to inappropriate content, leading to what the Commission refers to as ‘rabbit-hole effects’. Additionally, concerns have been raised regarding Meta’s age assurance and verification methods.

Why does it matter?

Meta, formerly known as Facebook, is already under EU scrutiny over election disinformation, particularly concerning the upcoming European Parliament elections. Violations of the DSA can result in fines of up to 6% of a company’s annual global turnover, indicating the seriousness with which EU regulators are approaching these issues. Meta’s response to the investigation and any subsequent actions will be closely monitored as the EU seeks to enforce stricter regulations on tech giants to protect online users, especially children, from harm.
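To give a sense of the scale of that ceiling, here is a minimal sketch of the calculation. The 6% cap comes from the article; the turnover figure is a hypothetical round number, not Meta’s actual revenue:

```python
# Illustrative only: the 6% cap is from the article above;
# the turnover figure is a hypothetical round number.
DSA_FINE_CAP = 0.06  # up to 6% of annual global turnover

hypothetical_turnover_eur = 120_000_000_000  # assumed €120bn annual global turnover
max_fine_eur = hypothetical_turnover_eur * DSA_FINE_CAP

print(f"Maximum DSA fine: €{max_fine_eur:,.0f}")  # Maximum DSA fine: €7,200,000,000
```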

EU designates Booking as a gatekeeper under DMA

The European Commission announced on Monday that it has classified Booking as a ‘gatekeeper’ under the Digital Markets Act (DMA), signifying its strong market influence. At the same time, the Commission has opened a market investigation into the regulatory status of social media network X to examine its market position in more depth. By contrast, the Commission said that online advertising services such as X Ads and TikTok Ads have not been designated as gatekeepers.

In March, the European Commission identified Elon Musk’s X, TikTok’s parent company ByteDance, and Booking.com as potential candidates for gatekeeper status, subjecting them to stringent tech regulations. While Booking has been officially designated as a gatekeeper, a market investigation has been initiated to address X’s opposition to such a classification. ByteDance was previously labelled as a gatekeeper in July last year, but TikTok has contested this designation at the EU’s second-highest court.

Why does it matter?

The Digital Markets Act (DMA) represents a significant step towards regulating the market dominance of large tech companies. It imposes stricter obligations on these firms, compelling them to ensure fair competition and facilitate consumer choice, for example by making it easier to switch between services. As the EU continues to navigate the complexities of digital market regulation, the designation of gatekeepers and subsequent investigations serve as crucial measures to promote fair competition and protect consumers’ interests in the digital sphere.

EU seeks details on X’s content moderation practices

The European Commission has taken a significant step in its investigation of X under the Digital Services Act (DSA). On 8 May 2024, the Commission sent a request for information (RFI) to X, seeking detailed insights into its content moderation practices, particularly in light of a recent transparency report showing a nearly 20% reduction in X’s content moderation team since October 2023. The reduction has cut linguistic coverage within the EU from 11 languages to 7.

Furthermore, the European Commission is keen on understanding X’s risk assessments and mitigation strategies concerning generative AI tools, especially their potential impact on electoral processes, dissemination of illegal content, and protection of fundamental rights. The investigation follows formal proceedings initiated against X in December 2023, examining potential breaches of the DSA related to risk management, content moderation, dark patterns, advertising transparency, and data access for researchers.

The request for information is part of an ongoing investigation, building on prior evidence gathering and analysis, including X’s transparency report released in March 2024 and its responses to previous information requests. X has been given two deadlines: 17 May 2024 for information on content moderation resources and generative AI, and 27 May 2024 for the remaining questions. Failure to comply could result in fines or penalties imposed by the Commission, as stipulated under Article 74(2) of the DSA.

Meta faces EU probe over disinformation handling

EU regulators are preparing to launch an investigation into Meta Platforms over concerns about the company’s efforts to combat disinformation, mainly from Russia and other nations. According to a report by the Financial Times, regulators are alarmed by Meta’s purported failure to curb the spread of political advertisements that could undermine the integrity of electoral processes. Citing sources familiar with the matter, the report suggests that Meta’s content moderation measures may be insufficient to address the issue effectively.

While the investigation is expected to be initiated imminently, the European Commission is anticipated to refrain from explicitly naming Russia in its official statement. Instead, the focus will be on the broader problem of foreign actors manipulating information. Neither Meta Platforms nor the European Commission has responded to requests for comment, underscoring the sensitivity of the impending probe.

Why does it matter?

The timing of the investigation coincides with a significant year for elections across the globe, with numerous countries, including the UK, Austria, and Georgia, preparing to elect new leaders. Additionally, the European Parliament elections are slated for June, heightening the urgency of regulatory scrutiny over platforms like Meta. This development underscores regulators’ growing concern about the influence of disinformation on democratic processes, prompting concerted efforts to address these challenges effectively.

ByteDance submits overdue risk assessment for TikTok Lite amid regulatory pressure

ByteDance, the company behind TikTok, has submitted a long-awaited risk assessment for its TikTok Lite service, recently launched in France and Spain, following regulatory threats of fines and potential bans from the European Commission. Regulators are concerned about the addictive nature of TikTok Lite, particularly its rewards system for users, and claim ByteDance didn’t complete a full risk assessment on time.

ByteDance now has until 24 April to defend itself against regulatory action, which could include suspension of the rewards programme. Failure to comply with the rules could result in fines of up to 1% of its total annual income or periodic penalties of up to 5% of its average daily income under the Digital Services Act (DSA).
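To make the two penalty ceilings concrete, here is a minimal sketch of the arithmetic. Only the 1% and 5% rates come from the article; the income figures are invented for illustration:

```python
# Illustrative only: the 1% and 5% caps are from the article;
# the income figures below are hypothetical.
annual_income_eur = 100_000_000_000          # assumed €100bn total annual income
avg_daily_income_eur = annual_income_eur / 365

one_off_fine_cap = 0.01 * annual_income_eur          # up to 1% of annual income
periodic_penalty_cap = 0.05 * avg_daily_income_eur   # up to 5% of average daily income, per day

print(f"One-off fine cap:     €{one_off_fine_cap:,.0f}")        # €1,000,000,000
print(f"Periodic penalty cap: €{periodic_penalty_cap:,.0f}/day") # ~€13,698,630/day
```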

The DSA imposes strict rules on online platforms with over 45 million users in the EU, a threshold that also covers services from other major tech companies such as Google, Facebook, Instagram, and LinkedIn.

Why does it matter?

Meanwhile, in the US, legislation is swiftly advancing through Congress, requiring ByteDance, the Chinese company that owns TikTok, to divest its ownership within a year or face a US ban. The Senate has passed this measure as part of a foreign aid package, sending it to President Joe Biden for his expected approval. ByteDance will have nine months initially, with a possible three-month extension, to complete the sale, though legal challenges could cause delays.

EU threatens TikTok Lite suspension over mental health concerns

The European Commission warned TikTok on Thursday that it may suspend a key feature of TikTok Lite in the European Union if the company fails to address concerns about its impact on users’ mental health. The action is being taken under the EU’s Digital Services Act (DSA), which mandates that large online platforms act against harmful content or face fines of up to 6% of their global annual turnover.

Thierry Breton, the EU industry chief, emphasised the Commission’s readiness to implement interim measures, including suspending TikTok Lite, if TikTok does not provide compelling evidence of the feature’s safety. Breton highlighted concerns about potential addiction generated by TikTok Lite’s reward programme.

TikTok has been given 24 hours to provide a risk assessment report on TikTok Lite, and until 3 May to supply additional requested information, in order to avoid penalties. TikTok has yet to respond to the Commission’s requests for comment.

The TikTok Lite app, recently launched in France and Spain, includes a reward programme in which users earn points by completing specific tasks on the platform. Under the DSA, TikTok was required to submit a risk assessment report before the app’s launch but failed to do so. The Commission remains firm on enforcing the rules to protect users’ well-being amid the growing influence of digital platforms.

EU demands that adult content platforms assess risks

Three major adult content platforms, Pornhub, Stripchat, and XVideos, are required to conduct risk assessments and implement measures to address systemic risks associated with their services under new EU online content rules announced by the European Commission. These companies were classified as very large online platforms in December under the Digital Services Act (DSA), which demands heightened efforts to remove illegal and harmful content from their platforms.

The EU executive specified that Pornhub and Stripchat must comply with these rigorous DSA obligations by 21 April, while XVideos has until 23 April to do the same. These obligations include submitting risk assessment reports to the Commission and implementing mitigation measures to tackle systemic risks linked to their services. Additionally, the platforms are expected to adhere to transparency requirements related to advertisements and provide researchers with data access.

Failure to comply with the DSA regulations could lead to significant penalties, with companies facing fines of up to 6% of their global annual turnover for breaches. The European Commission’s actions underscore its commitment to ensuring that large online platforms take proactive steps to address illegal and harmful content, particularly within the context of adult content services. These measures are part of broader efforts to enhance online safety and accountability across digital platforms operating within the EU.

Deepfake controversy surrounds Le Pen family ahead of EU elections

Deepfake videos depicting fictitious members of the Le Pen family surfaced online, stirring controversy as France’s far-right parties gear up for the upcoming EU elections. These videos, featuring fabricated personas engaging in provocative behaviour and promoting far-right agendas, spread rapidly on platforms like TikTok.

Despite efforts to delete some of these accounts, the videos garnered millions of views before being flagged. The Le Pen family expressed discontent, while the ‘Reconquête!’ party, one of the implicated groups, reported the content to TikTok. Ironically, politicians who had earlier opposed such content rules as ‘authoritarian’ now find themselves at odds with the consequences.

Although delays in France’s implementation of the Digital Services Act (DSA) suggest a gap in the national response to deepfakes, companies remain bound by the DSA, which applies directly as EU law. Nevertheless, concerns over the integrity of elections persist, with platforms and policymakers striving to combat deceptive uses of AI.

Why does it matter? 

The need to combat deepfakes has become increasingly apparent in light of recent events, notably the dissemination in Slovakia of a manipulated audio clip in which a political figure appeared to confess to election manipulation. Despite the 17 February 2024 deadline, many EU countries, including France, have yet to establish the key administrative bodies needed to address the issue. These bodies are tasked with designating ‘trusted flaggers’, organisations that play a vital role in identifying and flagging deceptive content to platforms.

EU leaders consider sanctions in response to suspected Russian election interference

European Union leaders convened to address growing concerns about suspected Russian interference across the bloc ahead of the forthcoming June elections. As Brussels escalates its warnings about disinformation campaigns, EU leaders are deliberating over the potential imposition of sanctions targeting Moscow’s activities. Allegations that EU lawmakers received payments to disseminate Kremlin propaganda have intensified the urgency for decisive action.

The EU leaders have pledged to closely monitor and mitigate the risks of foreign interference in electoral processes. The commitment includes the establishment of a joint task force to monitor developments and coordinate with national authorities. However, Russia-friendly leaders such as Hungary’s Viktor Orban and Slovakia’s Robert Fico signalled that more assertive EU action before the elections is unlikely.

Why does it matter? 

Russian disinformation tactics identified by EU officials involve blending facts with false narratives to sow confusion among readers. Sanctions were imposed on entities spreading Russian propaganda earlier this year. At the same time, the EU lawmakers under suspicion face scrutiny amid ongoing investigations into foreign influence, with calls for the European Public Prosecutor’s Office and the European Anti-Fraud Office to intervene against political meddling.

EU users can now download iOS apps directly from developers

Apple is rolling out a significant change in its approach to distributing iOS apps in the EU. Starting Tuesday, developers will be able to offer apps for direct download from their websites. This move breaks from Apple’s traditional walled-garden model and responds to new EU regulations designed to foster competition and consumer protection in digital markets.

Under these changes, developers meeting Apple’s criteria, including notarisation requirements, can distribute iPhone apps directly to EU users. However, this comes with new terms, including a ‘core technology fee’ of €0.50 for each first annual install over 1 million, regardless of where the app is distributed.
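As a sketch of how that fee structure works, assuming the charge applies only to installs beyond the 1 million threshold: the €0.50 rate and the threshold come from the article, while the install count below is hypothetical.

```python
# Illustrative only: computes the 'core technology fee' as described above.
# €0.50 per first annual install beyond 1 million; the install count is hypothetical.
FEE_PER_INSTALL_EUR = 0.50
FREE_THRESHOLD = 1_000_000

def core_technology_fee(first_annual_installs: int) -> float:
    """Fee owed for a year, charged only on installs above the threshold."""
    billable = max(0, first_annual_installs - FREE_THRESHOLD)
    return billable * FEE_PER_INSTALL_EUR

print(f"€{core_technology_fee(3_000_000):,.2f}")  # 2,000,000 billable installs -> €1,000,000.00
```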

The company has also made other adjustments in compliance with the Digital Markets Act (DMA), such as allowing marketplace apps where developers can run their own app stores on iOS and offering greater flexibility in in-app payments. However, Apple maintains its stance on security risks associated with sideloading apps, emphasising safety measures in the new distribution process.

Critics have raised concerns about the authorisation flow for direct web downloads, labelling them as ‘scare screens’ designed to discourage users from bypassing Apple’s App Store. The European Commission is investigating several aspects of Apple’s compliance with the DMA, including its fee structure and steering rules.

Why does it matter?

While this shift opens up new avenues for developers to reach users in the EU, its uptake remains uncertain. Apple acknowledges some interest from developers but emphasises that this is a new capability and the extent of its adoption is yet to be seen. The move adds to the evolving landscape of app distribution options in the EU, alongside existing App Store distribution and marketplace app submissions.