Workplace app discontinued as Meta invests in AI and metaverse

Meta Platforms, the parent company of Facebook, announced that it will discontinue its Workplace app, a platform geared towards work-related communications. The company made this decision as it shifts its focus towards developing AI and metaverse technologies. The Workplace app will be phased out for customers starting in June 2026, although Meta will continue to use it internally as a messaging board until August 2025, according to a statement from the company.

A spokesperson for Meta stated that they are discontinuing Workplace to focus on building AI and metaverse technologies that they believe will fundamentally reshape the way they work. Over the next two years, Workplace customers will have the option to transition to Zoom’s Workvivo product, which Meta has designated as its preferred migration partner. Workplace was initially launched in 2016 to cater to businesses, offering features such as multi-company groups and shared spaces to facilitate collaboration among employees from different organizations.

Why does it matter?

The discontinuation of Workplace aligns with Meta’s strategic emphasis on advancing AI and metaverse technologies, which it views as integral to the future of digital communication. The change of business direction has raised concerns about escalating costs that could affect the company’s growth trajectory. Despite the discontinuation, Meta has assured customers that billing and payment arrangements will remain unchanged until August 2025. Currently, Workplace offers a core plan priced at $4 per user per month, with add-ons available from $2 per user per month; monthly bills are calculated based on the number of billable users unless a fixed plan is in place.

Malaysia condemns Meta for removing posts on prime minister’s meeting with Hamas leader

Malaysia’s communications minister has criticised Meta Platforms for removing Facebook posts by local media covering Prime Minister Anwar Ibrahim’s meeting with a Hamas leader in Qatar. Anwar clarified that while he has diplomatic relations with Hamas’s political leadership, he is not involved in its military activities.

Expressing Malaysia’s support for the Palestinian cause, the government has asked Meta to explain the removal of posts by two media outlets about Anwar’s meeting. Additionally, a Facebook account covering Palestinian issues was closed.

Communications Minister Fahmi Fadzil condemned Meta’s actions, noting the posts’ relevance to the prime minister’s official visit to Qatar. He emphasised concerns about Meta’s disregard for media freedom.

Last October, Fahmi warned of potential action against Meta and other social media platforms if they obstructed pro-Palestinian content, as Malaysia consistently advocates for a two-state solution to the Israel-Palestine conflict.

Meta Platforms faces heavy fine in Turkey over data-sharing

Turkey’s competition board has levied a substantial fine of 1.2 billion lira ($37.20 million) against Meta Platforms following investigations into data-sharing practices across its social media platforms, including Facebook, Instagram, WhatsApp, and Threads. The board launched an inquiry last December, focusing in particular on potential competition law violations related to the integration of Threads and Instagram.

As part of its findings, the competition board imposed an interim measure in March to restrict data sharing between Threads and Instagram. In response, Meta announced the temporary shutdown of Threads in Turkey to comply with the interim order, reflecting the company’s efforts to adhere to regulatory directives.

The fine encompasses two separate investigations, with 898 million lira attributed to the compliance process and investigations related to Facebook, Instagram, and WhatsApp, and an additional 336 million lira for the inquiry into Threads. The board’s decision emphasises the importance of user consent and notification regarding data usage, ensuring transparency and control over personal data across Meta’s platforms.

Previously, the competition board had imposed fines on Meta, including daily penalties for insufficient documentation and notifications about data-sharing. While these penalties concluded on 3 May 2024, the recent fine extends the ongoing regulatory scrutiny over Meta’s business practices, echoing similar actions taken by regulatory authorities globally to ensure compliance with competition and data protection laws.

Meta found displaying explicit ‘AI Girlfriend’ ads, violating advertising policies

Meta-owned social media platforms, including Facebook, Instagram, and Messenger, have reportedly displayed explicit ads for ‘AI girlfriends,’ violating the company’s advertising policies. An investigation by Wired uncovered over 29,000 instances of such ads in Meta’s ad library. The ads feature chatbots sending sexually suggestive messages and AI-generated images of women in provocative poses, often without the ‘NSFW’ (Not Safe for Work) label. These instances have raised concerns about users’ exposure to inappropriate content.

Although Meta prohibits adult content in advertising, including nudity and sexually explicit activity, about half of the identified ads breached its policies. Ryan Daniels, a Meta spokesperson, stated that the company is working to remove the violating ads promptly and is continuously improving its detection systems. However, he acknowledged that advertisers have made various attempts to circumvent its current policies and detection methods.

Why does it matter?

Sex workers, sex educators, LGBTQ users, and erotic artists have long claimed that Meta unfairly targets their content, as reported by Mashable. They argue that Instagram shadowbans LGBTQ and sex educator accounts, while WhatsApp bans sex worker accounts.

Another controversial incident occurred last November when Mashable reported that Meta rejected a period care ad as ‘adult or political.’ Meanwhile, NSFW ‘AI girlfriend’ ads appear to be slipping through Meta’s advertising policies, sparking discussions about selective enforcement.

EU probes Meta platforms for deceptive ads

The European Commission has launched an investigation into Meta Platforms’ Facebook and Instagram over suspected failures to combat deceptive advertising and disinformation ahead of the European Parliament elections. Concerns have arisen not only about external sources like Russia, China, and Iran but also within the EU, with political parties and organisations resorting to false information to sway voters in the June 6-9 elections.

Under the Digital Services Act (DSA), big tech companies must take stronger measures against illegal and harmful content on their platforms or face fines of up to 6% of their global annual turnover. EU digital chief Margrethe Vestager expressed concerns about Meta’s moderation practices and transparency regarding advertisement and content moderation procedures, prompting the Commission to initiate proceedings to assess Meta’s compliance with the DSA.

Meta, with over 250 million monthly active users in the EU, defended its risk-mitigating process but faced suspicion from the Commission regarding its compliance with DSA obligations. Specific concerns include Meta’s handling of deceptive advertisements, disinformation campaigns, and coordinated inauthentic behaviour, as well as the absence of an effective third-party tool for real-time monitoring of civic discourse and elections ahead of the European Parliament elections.

The European Commission also raised issues regarding Meta’s decision to phase out its disinformation-tracking tool, CrowdTangle, without a suitable replacement. Meta now has five working days to inform the EU about any remedial actions to address the Commission’s concerns, signalling a pivotal moment in the ongoing battle against online misinformation and harmful content ahead of significant electoral events.

Meta platforms face a probe by EU for disinformation handling

EU regulators are gearing up to launch an investigation into Meta Platforms amid concerns about the company’s efforts to combat disinformation, mainly from Russia and other nations. According to a report by the Financial Times, the regulators are alarmed by Meta’s purported inadequacy in curbing the spread of political advertisements that could undermine the integrity of electoral processes. Citing sources familiar with the matter, the report suggests that Meta’s content moderation measures fall short of addressing this issue effectively.

While the investigation is expected to be initiated imminently, the European Commission is anticipated to refrain from explicitly targeting Russia in its official statement. Instead, the focus will be on the broader problem of foreign actors manipulating information. Meta Platforms and the European Commission have yet to respond to requests for comment, underscoring the sensitivity of the impending probe.

Why does it matter?

The timing of the investigation coincides with a significant year for elections across the globe, with numerous countries, including the UK, Austria, and Georgia, preparing to elect new leaders. Additionally, the European Parliament elections are slated for June, heightening the urgency of regulatory scrutiny over platforms like Meta. This development underscores the growing concern among regulators regarding the influence of disinformation on democratic processes, prompting concerted efforts to address these challenges effectively.

AI ‘girlfriend’ ads raise concerns on Meta platforms

Meta’s integration of AI across its platforms, including Facebook, Instagram, and WhatsApp, has raised concerns as Wired reports the proliferation of explicit ads for AI ‘girlfriends’ on these platforms. The investigation found tens of thousands of such ads violating Meta’s adult content advertising policy, which prohibits nudity, sexually suggestive content, and sexual services. Despite this policy, these ads continue to circulate on Meta’s platforms, sparking criticism from various communities, including sex workers, educators, and LGBTQ individuals, who feel unfairly targeted by Meta’s content policies.

For years, users have criticised Meta for what they perceive as discriminatory enforcement of its community guidelines. LGBTQ and sex educator accounts have reported instances of shadowbanning on Instagram, while WhatsApp has banned accounts associated with sex work. Additionally, Meta’s advertising approval process has come under scrutiny, with reports of gender-biased rejections of ads, such as those for sex toys and period care products. Despite these issues, explicit AI ‘girlfriend’ ads have evaded Meta’s enforcement mechanisms, highlighting a gap in the company’s content moderation efforts.

When approached, Meta acknowledged the presence of these ads and stated its commitment to removing them promptly. A Meta spokesperson emphasised the company’s ongoing efforts to improve its systems for detecting and removing ads that violate its policies. However, despite Meta’s assurances, Wired found that thousands of these ads remained active even days after the initial inquiry.

Meta spokesperson sentenced to six years in Russia

A military court in Moscow has reportedly sentenced Meta Platforms spokesperson Andy Stone to six years in prison in absentia for ‘publicly defending terrorism.’ This ruling comes amid Russia’s crackdown on Meta, which was designated as an extremist organisation in the country, resulting in the banning of Facebook and Instagram in 2022 due to Russia’s conflict with Ukraine.

Meta has yet to comment on the reported sentencing of Stone, who serves as the company’s communications director. Stone himself was unavailable for immediate comment following the court’s decision. Stone’s lawyer, Valentina Filippenkova, indicated that they intend to appeal the verdict and will seek an acquittal.

The Russian interior ministry initiated a criminal investigation against Stone late last year, although the specific charges were not disclosed then. According to state investigators, Stone’s online comments allegedly defended ‘aggressive, hostile, and violent actions’ against Russian soldiers involved in what Russia terms its ‘special military operation’ in Ukraine.

Why does it matter?

Stone’s sentencing underscores Russia’s stringent stance on online content related to its military activities in Ukraine, extending repercussions to individuals associated with Meta Platforms. The circumstances also reflect the broader context of heightened scrutiny and legal actions against perceived dissent and criticism within Russia’s digital landscape.

Meta shifts away from politics ahead of 2024 US election

In a significant shift ahead of the Trump-Biden rematch, Meta is distancing itself from politics after years of positioning itself as a key player in political discourse. The company has reduced the visibility of political content on Facebook and Instagram, imposed new rules on political advertisers, and downsized the team responsible for engaging with politicians and campaigns. The shift is reshaping digital outreach strategies for the 2024 US election and could transform political communication on social media platforms.

Meta’s retreat from politics follows years of controversy and public scrutiny, including outrage over Russian interference in the 2016 presidential race and the role of social media in the 6 January 2021 attack on the US Capitol. The company’s efforts to minimise political content in users’ news feeds reflect a broader trend away from news and politics on social media platforms. This shift has impacted major news outlets, with significant declines in user engagement observed across platforms.

As Meta redefines its approach to political content, political campaigns adapt their strategies to navigate this new landscape. The Biden campaign has increased its social media presence to drive engagement, while Trump has turned to alternative platforms like Truth Social. However, both parties recognise the continued importance of Facebook as a vital tool for reaching voters despite the platform’s evolving restrictions on political advertising and content.

Why does it matter?

The changing dynamics of political communication on social media raise concerns about access to information and the role of tech companies in shaping public discourse. With political content increasingly marginalised on platforms like Facebook and Instagram, questions arise about how voters will stay informed about key issues during elections. As campaigns adjust to Meta’s evolving policies, the impact on democratic discourse and the dissemination of political information remains a topic of debate and scrutiny.

Apple removes WhatsApp and Threads from China app store

Apple has removed the Meta-owned apps WhatsApp and Threads from its app store in China, complying with orders from the country’s internet regulator, the Cyberspace Administration, which cited national security concerns. According to Apple, the move was made in accordance with local laws, despite its disagreement. The Chinese government allegedly found content on WhatsApp and Threads regarding China’s president, Xi Jinping, that violated cybersecurity laws, though specifics were unclear.

This action intensifies the technology dispute between the US and China, with Apple and Meta caught in the middle. In the US, lawmakers are considering a bill that would compel ByteDance to divest its popular video app TikTok, citing national security risks due to its ties to China. Meanwhile, the White House is tightening restrictions on Beijing’s access to advanced technologies and American financing.

Apple, reliant on China for a significant portion of its revenue, has complied with Beijing’s demands in the past, including blocking various apps and establishing a data centre to store Chinese users’ iCloud data. As tensions persist, Apple has started diversifying its supply chain, reducing its dependence on Chinese manufacturing.

While Meta’s fallout from China may be less direct, the company faces challenges elsewhere, particularly in its strained relationship with Apple over privacy and data tracking issues. In the US, efforts to address concerns over TikTok’s ownership and data handling are gaining momentum, with legislation being packaged alongside other bills related to foreign aid.