Meta Platforms faces EU probe over disinformation handling

EU regulators are gearing up to launch an investigation into Meta Platforms amid concerns about the company’s efforts to combat disinformation, mainly from Russia and other nations. According to a report by the Financial Times, regulators are alarmed by Meta’s purported inadequacy in curbing the spread of political advertisements that could undermine the integrity of electoral processes. Citing sources familiar with the matter, the report suggests that Meta’s content moderation measures fail to address this issue effectively.

While the investigation is expected to be initiated imminently, the European Commission is anticipated to refrain from explicitly targeting Russia in its official statement. Instead, the focus will be on the broader problem of foreign actors manipulating information. Meta Platforms and the European Commission have yet to respond to requests for comment, underscoring the sensitivity of the impending probe.

Why does it matter?

The timing of the investigation coincides with a significant year for elections across the globe, with numerous countries, including the UK, Austria, and Georgia, preparing to elect new leaders. Additionally, the European Parliament elections are slated for June, heightening the urgency for regulatory scrutiny over platforms like Meta. This development underscores the growing concern among regulators regarding the influence of disinformation on democratic processes, prompting concerted efforts to address these challenges effectively.

ByteDance submits overdue risk assessment for TikTok Lite amid regulatory pressure

ByteDance, the company behind TikTok, has submitted a long-awaited risk assessment for its TikTok Lite service, recently launched in France and Spain, following regulatory threats of fines and potential bans from the European Commission. Regulators are concerned about the addictive nature of TikTok Lite, particularly its rewards system for users, and claim ByteDance didn’t complete a full risk assessment on time.

ByteDance now has until 24 April to defend itself against regulatory action, including possibly suspending the rewards program. Failure to comply with regulations could result in fines of up to 1% of its total annual income or periodic penalties of up to 5% of its average daily income under the Digital Services Act (DSA).
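The penalty ceilings described above are simple percentages. As a minimal sketch (the function names and inputs are hypothetical; this illustrates only the ceilings stated in the article, not how the Commission actually calculates penalties):

```python
# Illustrative only: ceilings on the penalties described above under the DSA.
# A one-off fine of up to 1% of total annual income, and periodic penalties
# of up to 5% of average daily income.

def max_fine(total_annual_income: float) -> float:
    """Ceiling on a one-off fine: 1% of total annual income."""
    return 0.01 * total_annual_income

def max_daily_penalty(average_daily_income: float) -> float:
    """Ceiling on a periodic penalty: 5% of average daily income."""
    return 0.05 * average_daily_income

# Example: a company with €10 billion annual income and €27 million average
# daily income could face a fine of up to €100 million, plus periodic
# penalties of up to €1.35 million per day.
print(max_fine(10_000_000_000))
print(max_daily_penalty(27_000_000))
```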

The DSA imposes strict rules on online platforms with over 45 million users in the EU, including other major tech companies like Google, Facebook, Instagram, and LinkedIn.

Why does it matter?

Meanwhile, in the US, legislation is swiftly advancing through Congress, requiring ByteDance, the Chinese company that owns TikTok, to divest its ownership within a year or face a US ban. The Senate has passed this measure as part of a foreign aid package, sending it to President Joe Biden for his expected approval. ByteDance will have nine months initially, with a possible three-month extension, to complete the sale, though legal challenges could cause delays.

EU threatens TikTok Lite suspension over mental health concerns

The European Commission has warned TikTok that it may suspend a key feature of TikTok Lite in the European Union on Thursday if the company fails to address concerns regarding its impact on users’ mental health. This action is being taken under the EU’s Digital Services Act (DSA), which mandates that large online platforms take action against harmful content or face fines of up to 6% of their global annual turnover.

Thierry Breton, the EU industry chief, emphasised the Commission’s readiness to implement interim measures, including suspending TikTok Lite, if TikTok does not provide compelling evidence of the feature’s safety. Breton highlighted concerns about potential addiction generated by TikTok Lite’s reward program.

TikTok has been given 24 hours to provide a risk assessment report on TikTok Lite and until 3 May to supply additional requested information, in order to avoid penalties. Despite these demands, TikTok has yet to respond to the Commission’s requests for comment.

The TikTok Lite app, recently launched in France and Spain, includes a reward program where users earn points by engaging in specific tasks on the platform. However, TikTok was required under the DSA to submit a risk assessment report before the app’s launch and failed to do so. The Commission remains firm on enforcing regulations to protect users’ well-being amidst the growing influence of digital platforms.

EU requires adult content platforms to assess risks

Three major adult content platforms, Pornhub, Stripchat, and XVideos, are required to conduct risk assessments and implement measures to address systemic risks associated with their services under new EU online content rules announced by the European Commission. These companies were classified as very large online platforms in December under the Digital Services Act (DSA), which demands heightened efforts to remove illegal and harmful content from their platforms.

The EU executive specified that Pornhub and Stripchat must comply with these rigorous DSA obligations by 21 April, while XVideos has until 23 April to do the same. These obligations include submitting risk assessment reports to the Commission and implementing mitigation measures to tackle systemic risks linked to their services. Additionally, the platforms are expected to adhere to transparency requirements related to advertisements and provide researchers with data access.

Failure to comply with the DSA regulations could lead to significant penalties, with companies facing fines of up to 6% of their global annual turnover for breaches. The European Commission’s actions underscore its commitment to ensuring that large online platforms take proactive steps to address illegal and harmful content, particularly within the context of adult content services. These measures are part of broader efforts to enhance online safety and accountability across digital platforms operating within the EU.

Deepfake controversy surrounds Le Pen family ahead of EU elections

Deepfake videos depicting fictitious members of the Le Pen family surfaced online, stirring controversy as France’s far-right parties gear up for the upcoming EU elections. These videos, featuring fabricated personas engaging in provocative behaviour and promoting far-right agendas, spread rapidly on platforms like TikTok.

Despite efforts to delete some of these accounts, the videos garnered millions of views before being flagged. The Le Pen family expressed discontent, while the ‘Reconquête!’ party, one of the implicated groups, reported the content to TikTok. Ironically, politicians who had earlier opposed such rules over fears of ‘authoritarian measures’ now find themselves at odds with the consequences.

France has been delayed in implementing the Digital Services Act (DSA), suggesting a gap in addressing the spread of deepfakes; nevertheless, companies remain bound by the DSA, which applies directly across the EU. Concerns over the integrity of elections persist, with platforms and policymakers striving to combat deceptive AI use.

Why does it matter? 

The need to combat deepfakes has become increasingly apparent in light of recent events, notably the dissemination of a manipulated audio clip in Slovakia depicting a political figure confessing to election manipulation. Despite the 17 February 2024 deadline, many EU countries, including France, have yet to establish the key administrative bodies needed to address this issue. These bodies are tasked with designating ‘trusted flaggers’, organisations that play a vital role in identifying and flagging deceptive content to platforms.

EU leaders consider sanctions in response to suspected Russian election interference

European Union leaders convened to address growing concerns about suspected Russian interference across the bloc ahead of the June elections. As Brussels escalates its warnings about disinformation campaigns, EU leaders are deliberating on the potential imposition of sanctions targeting Moscow’s activities. Allegations that EU lawmakers received payments to disseminate Kremlin propaganda have intensified the urgency for decisive action.

The EU leaders have pledged to closely monitor and mitigate risks of foreign interference in electoral processes. The commitment includes the establishment of a joint task force to monitor developments and coordinate with national authorities. However, Russia-friendly leaders such as Hungary’s Viktor Orban and Slovakia’s Robert Fico signalled that more assertive EU action before the elections is unlikely.

Why does it matter? 

Russian disinformation tactics identified by EU officials involve blending facts with false narratives to sow confusion among readers. Sanctions were imposed on entities spreading Russian propaganda earlier this year. At the same time, EU lawmakers under suspicion face scrutiny amid ongoing investigations into foreign influence, with calls for the European Public Prosecutor’s Office and the European Anti-Fraud Office to intervene against political meddling.

EU users can now download iOS apps directly from developers

Apple is rolling out a significant change in its approach to distributing iOS apps in the EU. Starting Tuesday, developers will be able to offer apps for direct download from their websites. This move breaks from Apple’s traditional walled-garden model and responds to new EU regulations designed to foster competition and consumer protection in digital markets.

Under these changes, developers meeting Apple’s criteria, including notarisation requirements, can distribute iPhone apps directly to EU users. However, this comes with new terms, including a ‘core technology fee’ of €0.50 for each first annual install over 1 million, regardless of distribution location.
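The fee structure described above amounts to a simple calculation. As a hedged sketch (based solely on the figures in this article; Apple’s actual billing terms are more detailed, and the function name is hypothetical):

```python
# Sketch of the 'core technology fee' as described: €0.50 for each first
# annual install beyond the first 1,000,000, regardless of whether the app
# ships via the App Store or direct web download.

FREE_INSTALL_THRESHOLD = 1_000_000
FEE_PER_INSTALL_EUR = 0.50

def core_technology_fee(first_annual_installs: int) -> float:
    """Annual fee in euros for a given number of first annual installs."""
    billable = max(0, first_annual_installs - FREE_INSTALL_THRESHOLD)
    return billable * FEE_PER_INSTALL_EUR

# An app with 1.2 million first annual installs pays for 200,000 of them:
print(core_technology_fee(1_200_000))  # 100000.0
```

An app below the 1 million threshold would owe nothing under this reading, which is why the fee mainly concerns larger developers.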

The company has also made other adjustments in compliance with the Digital Markets Act (DMA), such as allowing marketplace apps where developers can run their own app stores on iOS and offering greater flexibility in in-app payments. However, Apple maintains its stance on security risks associated with sideloading apps, emphasising safety measures in the new distribution process.

Critics have raised concerns about the authorisation flow for direct web downloads, labelling them as ‘scare screens’ designed to discourage users from bypassing Apple’s App Store. The European Commission is investigating several aspects of Apple’s compliance with the DMA, including its fee structure and steering rules.

Why does it matter?

While this shift opens up new avenues for developers to reach users in the EU, the extent of its adoption remains uncertain. Apple acknowledges some interest from developers but emphasises that the capability is new. This move adds to the evolving landscape of app distribution options in the EU, alongside existing App Store distribution and marketplace app submissions.

EU cybersecurity label vote postponed

National cybersecurity experts have postponed a vote on a proposed EU cybersecurity label until May, according to sources familiar with the matter. The EU aims to implement a cybersecurity certification scheme (EUCS) to ensure the security of cloud services, aiding governments and businesses in selecting trustworthy vendors. This delay allows tech giants like Amazon, Google, and Microsoft to continue bidding for sensitive EU cloud computing contracts.

Disagreements have arisen over whether strict requirements should be imposed on major tech companies to qualify for the highest level of the EU cybersecurity label. These disagreements have stalled progress despite recent discussions among experts in Brussels. Holding the rotating EU presidency, Belgium has made adjustments to the draft, reflecting ongoing deliberations.

The most recent version of the draft has eliminated sovereignty requirements that previously obliged US tech giants to partner with EU-based companies to handle customer data in the bloc. While major tech firms have welcomed this change, it has drawn criticism from EU-based cloud vendors and businesses such as Deutsche Telekom, Orange, and Airbus. They argue that removing these requirements poses a risk of unauthorised data access by non-EU governments under their respective laws.

Following the experts’ postponed vote, the next phase involves the EU countries providing input, with the European Commission making the final decision. The outcome of these discussions will significantly impact the landscape of cybersecurity regulations and the involvement of major tech players in the EU’s cloud computing sector.

EU considers stricter rules for Chinese e-commerce platform Temu

A significant shift in oversight may be on the horizon for the popular Chinese e-commerce platform Temu as it discloses its monthly European user base of 75 million. Temu, an extension of China’s Pinduoduo, has attracted attention for its privacy and cybersecurity practices, facing recent scrutiny in Germany for alleged consumer misrepresentation. Although Temu was already subject to specific Digital Services Act (DSA) regulations, its newly declared scale could trigger more stringent obligations under European Commission supervision.

Online platforms exceeding 45 million users within the EU must undergo external audits and risk assessments, evaluating their efforts to combat illegal content like counterfeit goods or hazardous products. Violations could lead to fines amounting to 6% of their global turnover. The European Commission is tasked with formally designating such major online platforms, providing four months for companies to prepare for intensified regulations. When queried about Temu’s potential designation, the Commission remained non-committal but acknowledged engagement with the platform.

Meanwhile, the European Commission is gearing up to designate Shein, a Chinese fashion platform boasting over 108 million European users, further signalling the EU’s intent to assert regulatory oversight over large-scale digital platforms. The move underscores the EU’s commitment to enforcing stricter content moderation rules and consumer protection measures within the digital sphere, particularly in response to the expansive reach and influence of major online platforms like Temu and Shein.

EU antitrust probe into Microsoft’s OpenAI investment nears conclusion

EU regulators are moving swiftly to conclude a preliminary investigation into Microsoft’s relationship with OpenAI, according to Margrethe Vestager, the EU’s antitrust chief. The probe, initiated in January, aims to determine whether Microsoft’s substantial $13 billion investment in OpenAI should undergo scrutiny under EU merger regulations. Vestager indicated in an interview with Bloomberg TV that a resolution is forthcoming, highlighting ongoing discussions with other regulatory authorities.

Vestager emphasised that EU authorities are closely monitoring Microsoft’s investments and the broader trend of large tech companies investing in AI. The scrutiny extends beyond Microsoft to other significant AI investments by major tech firms like Google, Amazon, and Nvidia. The EU’s primary aim is to ensure competitiveness and prevent anti-competitive practices in this rapidly evolving AI landscape.

Microsoft’s stake in OpenAI is significant, and the tech giant has also invested in other AI ventures, such as the French startup Mistral, and acquired the team behind Inflection AI. Other major players like Google and Amazon hold their own stakes in AI ventures. Vestager stressed the importance of vigilance in this emerging field, characterising it as a critical area for regulatory oversight to safeguard competition and innovation in the AI sector.