Japan cracks down on tech giants and social media scams with new legislation

Japan’s cabinet has approved a legislative proposal to curb the dominance of tech giants like Google and Apple by imposing major fines on those that restrict third-party access to smartphone apps and payment systems. This initiative mirrors the EU’s Digital Markets Act (DMA) and targets anti-competitive behaviour, threatening fines of up to 20% of their revenues.

According to government spokesman Yoshimasa Hayashi, this regulation is crucial for maintaining a competitive digital environment internationally and fostering innovation and consumer choice in software necessary for smartphone usage.

Concurrently, the Japanese government is escalating efforts to regulate social media platforms like Meta, addressing the surge in online scams that exploit celebrity images to commit fraud. Meta’s local unit has been criticised for its inadequate response to such scams. Former Digital Transformation Minister Takuya Hirai has suggested that the Japanese Diet might summon Meta CEO Mark Zuckerberg to testify. He criticised Meta for being the ‘most non-compliant platform owner’ and highlighted the company’s insufficient efforts in combating these fraudulent activities.

Why does it matter?

The Japanese government is also intensifying its oversight of large US tech firms, with the Liberal Democratic Party working on policy options to better regulate these companies. The initiative is part of a broader global effort by regulators to control the influence of major tech corporations, as evidenced by the Japan Fair Trade Commission’s antitrust actions against Google and recent legislative measures endorsed by Prime Minister Fumio Kishida to curb anti-competitive practices in the tech industry.

UNESCO report reveals technology’s mixed impact on girls’ education

A new UN report, released by UNESCO’s Global Education Monitoring (GEM) team, explores how technology affects girls’ education from a gender perspective.

The report celebrates two decades of reduced discrimination against girls but also notes technology’s negative effects on their educational outcomes. It addresses challenges such as online harassment, access disparities in ICT, and the harmful influences of social media on mental health and body image, which can impede academic performance. Additionally, the report sheds light on the gender gap in STEM fields, underscoring the underrepresentation of women in STEM education and careers.

While highlighting that appropriately used social media can enhance girls’ awareness and knowledge of social issues, the GEM team also calls for increased educational investment and stricter digital regulations to promote safer, more inclusive environments for girls worldwide.

Why does it matter?

The report coincided with the International Girls in ICT Day, supported by the ITU, during which the UN Secretary-General emphasised the need for greater support and resources for girls in Information and Communication Technology (ICT), noting that globally, women (65%) have less access to the internet than men (70%). The persistent access gap in ICT and its disproportionately adverse effects on girls, despite years of acknowledgement, suggest the need for a more aggressive approach in policy and resource allocation to truly level the playing field.

Pakistan blocks social media platform X for national security reasons

Pakistan’s interior ministry confirmed that it had blocked access to the social media platform X (formerly Twitter) around February’s national election due to national security concerns. Despite reports from users experiencing difficulties accessing X since mid-February, the government has not officially acknowledged the shutdown. The interior ministry made this revelation in a written submission to the Islamabad High Court, responding to a petitioner’s plea challenging the ban.

The ministry cited X’s alleged failure to comply with lawful directives and address concerns regarding platform misuse as reasons for imposing the ban. According to the ministry, X was reluctant to resolve these issues, prompting the government’s decision to uphold national security, maintain public order, and preserve the country’s integrity.

Why does it matter?

The temporary ban on X coincided with the 8 February national election, in which the party of jailed former prime minister Imran Khan alleged rigging. Khan’s party relies heavily on social media platforms for communication, especially after facing censorship by traditional media ahead of the polls. Khan, with over 20 million followers on X, remains prominent despite being incarcerated on multiple convictions preceding the election.

The decision to block X was based on confidential reports from Pakistan’s intelligence and security agencies, which indicated nefarious intentions by hostile elements on the platform to create chaos and instability in the country. This move has raised concerns among rights groups and marketing advertisers, with activists arguing that such restrictions hinder democratic accountability and access to real-time information crucial for public discourse and transparency. Marketing consultants also highlight challenges in convincing Pakistani advertisers to use X for brand communications due to governmental restrictions on the platform.

Mark Zuckerberg wins dismissal in lawsuits over social media harm to children

Meta CEO Mark Zuckerberg has secured the dismissal of certain claims in multiple lawsuits alleging that Facebook and Instagram concealed the harmful effects of their platforms on children. US District Judge Yvonne Gonzalez Rogers in Oakland, California, ruled in favour of Zuckerberg, dismissing claims from 25 cases that sought to hold him personally liable for misleading the public about platform safety.

The lawsuits, part of a broader litigation by children against social media giants like Meta, assert that Zuckerberg’s prominent role and public stature required him to fully disclose the risks posed by Meta’s products to children. However, Judge Rogers rejected this argument, stating it would establish an unprecedented duty to disclose for any public figure.

Despite dismissing claims against Zuckerberg, Meta remains a defendant in the ongoing litigation involving hundreds of lawsuits filed by individual children against Meta and other social media companies like Google, TikTok, and Snapchat. These lawsuits allege that social media use led to physical, mental, and emotional harm among children, including anxiety, depression, and suicide. The plaintiffs seek damages and a cessation of harmful practices by these tech companies.

Why does it matter?

The lawsuits highlight a broader concern about social media’s impact on young users, prompting legal action from states and school districts. Meta and other defendants deny wrongdoing and have emphasised their commitment to addressing these concerns. While some claims against Zuckerberg have been dismissed, the litigation against Meta and other social media giants continues as plaintiffs seek accountability and changes to practices allegedly detrimental to children’s well-being.

The ruling underscores the complex legal landscape surrounding social media platforms and their responsibilities regarding user safety, particularly among younger demographics. The outcome of these lawsuits could have significant implications for the regulation and oversight of social media companies as they navigate concerns related to their platforms’ impact on mental health and well-being.

Indian advertising regulator issues guidelines for health and financial influencers

The Advertising Standards Council of India (ASCI), a self-regulatory organisation for the advertising industry in India, has added new requirements to its influencer advertising guidelines. Influencers in the banking, financial services, and insurance (BFSI) sector, commonly referred to as ‘finfluencers’, must now be registered with the Securities and Exchange Board of India (SEBI), the regulator of India’s securities and commodity markets, in order to provide investment-related advice.

According to the new guidelines, influencers endorsing health and nutrition products and making claims about them must hold relevant qualifications, such as medical degrees or certifications in areas such as nursing, nutrition, dietetics, physiotherapy, or psychology, depending on the type of advice provided. These qualifications should be prominently disclosed.

For ‘finfluencers’, their SEBI registration number should be prominently displayed alongside their qualifications. For other financial advice, they are required to have appropriate credentials. In addition, compliance with disclosure prerequisites outlined by financial sector regulators is expected.

Violations of these guidelines may lead to penalties under the Consumer Protection Act 2019 and other relevant provisions of the law.

European consumer organization files complaint against social media platforms over misleading promotion

The European Consumer Organisation (BEUC) filed a complaint with the European Commission and consumer authorities against Instagram, YouTube, TikTok, and Twitter, alleging that these platforms facilitate the ‘misleading’ promotion of cryptocurrencies. BEUC’s report, titled ‘Hype or harm? The great social media crypto con’, accuses social media platforms of using ads and influencers to promote cryptocurrencies, which it deems an ‘unfair commercial practice’ that puts investors at risk of losses.

Emphasizing the importance of enforcing strict rules on digital asset advertising on social media and preventing influencers from misleading the public, the organization urged the Consumer Protection Cooperation Network to take action in this regard. BEUC also wants the European Commission to assess consumer protection on social media and promote cooperation between consumer groups and the European Supervisory Authorities to protect consumers from misleading promotions of digital assets.

Additionally, BEUC stated that neither the EU’s Markets in Crypto-Assets (MiCA) nor the Digital Services Act (DSA) offers sufficient consumer protection. The complaint was filed with the support of consumer groups in several European countries.

China’s cyberspace regulator cracks down on fake social media accounts

China’s cyberspace regulator has recently closed over 100,000 online accounts that were spreading false news and rumours about news anchors and media agencies in order to ‘clean up the internet’. The Cyberspace Administration of China (CAC) has started a campaign to weed out social media accounts that pretend to be state-controlled media and spread ‘fake news’.

The CAC discovered accounts that utilised AI to imitate ‘authoritative’ news media and create fake news studios and presenters to mislead the public. The regulator has deleted 107,000 accounts and 835,000 pieces of fake news from counterfeit news units and anchors since 6 April.

The regulator asserted that it would provide guidance to online platforms in order to protect the legitimate rights and interests of internet users in accessing accurate news. It also encourages users to report fake news and fake news anchors by providing leads.

The government implements strict measures to remove internet content and language that it considers unsuitable, offensive, or harmful to the public and businesses. Such a crackdown can be contextualised as part of a larger global effort to combat the spread of fake news, with many countries implementing laws to punish those who spread it.

British Army’s social media accounts hacked

The British Army’s Twitter and YouTube accounts were hacked. The name of the Army’s Twitter account was changed, and videos about cryptocurrency and posts related to NFTs appeared on its feed. The British Army stated there is no evidence of who may be behind the hack. The accounts have been restored to normal, while investigations into the incident are still ongoing. An Army spokesperson said there would be no further comment until the investigation is complete.

Coalition of 45 rights organisations decries new Bangladesh draft rules

Forty-five international organisations signed a communication to the Bangladesh Telecommunication Regulatory Commission (BTRC) urging it to withdraw or reconsider proposed regulations for digital media, social media, and over-the-top (OTT) platforms. The draft is dated October 2021 but was published on 6 February 2022. Stating that ‘The Draft Regulations seek to implement a content governance framework devoid of adequate judicial oversight, clarity and predictability, and integration of human rights and due process,’ the letter details eleven initial key concerns that require discussion.