TikTok challenges US law over China ties in court

TikTok has contested claims made by the US Department of Justice in a federal appeals court, asserting that the government has inaccurately characterised the app’s ties to China. The company is challenging a law that mandates its Chinese parent company, ByteDance, to divest TikTok’s US assets by January 19 or face a ban. TikTok argues that the app’s content recommendation engine and user data are securely stored in the US, with content moderation conducted domestically.

The law, signed by President Joe Biden in April, reflects concerns over potential national security risks, with accusations that TikTok allows Chinese authorities to access American data and influence content. TikTok, however, contends that the law infringes on free speech rights, arguing that its content curation should be protected by the US Constitution.

Oral arguments for the case are scheduled for September 16, just before the presidential election on November 5. As the debate heats up, both Republican and Democratic presidential candidates have expressed contrasting views on TikTok, with Donald Trump opposing a ban and Kamala Harris embracing the platform as part of her campaign.

The legislation also impacts app stores and internet hosting services, barring support for TikTok unless it is sold. The swift passage of the measure in Congress highlights ongoing fears regarding data security and espionage risks associated with the app.

Google ordered to allow more app download options after Epic Games victory

A US judge ordered Google to provide Android users with more ways to download apps outside of its Play Store, following a jury decision in favour of ‘Fortnite’ developer Epic Games in an antitrust case. Epic’s lawsuit accused Google of dominating app distribution and in-app payments on Android devices, urging the court to make it easier for users to access third-party app stores. Google countered that such changes would damage competition, as well as consumer privacy and security.

District Judge James Donato criticised Google’s resistance to Epic’s suggestions, signalling that he will deliver a brief ruling focused on enhancing app distribution options for both users and developers. Donato stressed that Google, deemed a monopolist, must address its anticompetitive behaviour and announced the formation of a three-person committee to monitor the enforcement of the injunction.

Why does this matter?

This case is part of a wider antitrust examination. Google also faces a separate government lawsuit in Washington, DC, concerning its search engine dominance, where another judge recently ruled against the company. Both cases highlight increasing legal challenges to Google’s business practices. Neither Epic Games nor Google commented on the latest developments.

US DOJ considers breaking up Google after antitrust ruling

The US Department of Justice is exploring various options, including potentially breaking up Alphabet’s Google, after a recent court ruling found that the tech giant had illegally monopolised the online search market. The ruling, considered a significant victory for federal authorities challenging Big Tech’s dominance, determined that Google spent billions to establish an illegal monopoly as the world’s default search engine.

Among the remedies the DOJ considers are forcing Google to share data with competitors and implementing safeguards to prevent the company from gaining an unfair advantage in AI products. Discussions have also included the possibility of divesting key assets such as the Android operating system, the AdWords search ad program, and the Chrome web browser.

Why does this matter?

This case is part of a broader effort by federal antitrust regulators, who have previously taken action against other tech giants like Meta Platforms, Amazon, and Apple, accusing them of maintaining illegal monopolies. Alphabet and the DOJ have not yet commented on the ongoing deliberations.

Texas federal judge steps down from Musk’s X lawsuit

A federal judge in Texas has recused himself from overseeing a lawsuit filed by Elon Musk’s social media platform, X, against a group of advertisers. US District Judge Reed O’Connor, who was assigned to the case, stepped down after reports surfaced that he owned shares in Tesla, another company led by Musk. The lawsuit, initiated by X, accuses the World Federation of Advertisers of conspiring to boycott the platform, leading to revenue losses.

Judge O’Connor did not provide a specific reason for his recusal, but the move follows concerns about his potential financial interest in Musk’s companies. According to a recent judicial financial disclosure, O’Connor held between $15,001 and $50,000 in Tesla stock. Judges typically recuse themselves from cases in which they hold a financial stake in one of the parties.

The case has been reassigned to US District Judge Ed Kinkeade in Dallas. The Northern District of Texas, where the case was filed, is known as a preferred venue for conservative lawsuits challenging Democratic policies.

Social media Bluesky gains popularity in UK after Musk’s riot remarks

Bluesky, a social media platform, has reported a significant increase in signups in the United Kingdom recently as users look for alternatives to Elon Musk’s X. The increase follows Musk’s controversial remarks on ongoing riots in the UK, which have driven users, including several Members of Parliament, to explore other platforms. The company announced that it had experienced a 60% rise in activity from UK accounts.

Musk has faced criticism for inflaming tensions after riots in Britain were sparked by misinformation surrounding the murder of three girls in northern England. The Tesla CEO allegedly used X to disseminate misleading information to his vast audience, including a post claiming that civil war in Britain was ‘inevitable.’ The controversy has prompted Prime Minister Keir Starmer to respond and has intensified calls for the government to accelerate the implementation of online content regulations.

Bluesky highlighted that the UK had the most signups of any country for five of the last seven days. Once supported by Twitter co-founder Jack Dorsey, the platform is among the many apps vying to replace Twitter after Musk’s turbulent takeover in late 2022.

As of July, Bluesky’s monthly active user base was approximately 688,568, small compared with X’s 76.9 million users, according to Similarweb, a digital market intelligence firm. Despite its smaller size, the recent surge in UK signups suggests growing interest in alternative social media platforms.

Russia blocks Signal messaging app

Russia’s state communications watchdog, Roskomnadzor, has announced a nationwide block on the encrypted messaging app Signal. The restriction, reported by Interfax, is attributed to Signal’s failure to comply with Russian anti-terrorism laws aimed at preventing the use of messaging apps for extremist activities.

Users across Russia, including Moscow and St Petersburg, experienced significant disruptions with Signal, which approximately one million Russians use for secure communications. Complaints about the app surged to over 1,500, indicating widespread issues. While Signal appeared to function normally for some users with a VPN, it was inaccessible for others trying to register new accounts or use it without a VPN.

Mikhail Klimarev, a telecom expert, confirmed that this block represents a deliberate action by Russian authorities rather than a technical malfunction. He noted that this is the first instance of Signal being blocked in Russia, marking a significant escalation in the country’s efforts to control encrypted communication platforms.

Roskomnadzor’s action follows previous attempts to restrict other messaging services, such as Telegram, which faced a similar blocking attempt in 2018. Despite those efforts, Telegram’s availability in Russia remained relatively unaffected. Signal has yet to comment on the situation.

NOYB files complaint against X over AI data use

An Austrian advocacy group, NOYB, has filed a complaint against the social media platform X, owned by Elon Musk, accusing the company of using users’ data to train its AI systems without their consent. The complaint, led by privacy activist Max Schrems, was lodged with authorities in nine European Union countries, putting pressure on Ireland’s Data Protection Commission (DPC), which serves as the primary EU regulator for many major US tech firms because their EU operations are based in Ireland.

The Irish DPC is seeking to limit or suspend X’s ability to process user data for AI training. X has agreed to halt the use of personal data for AI training until users can opt out, following a recent Irish court hearing that found X had begun collecting data before giving users the chance to object.

Despite this, NOYB’s complaint focuses primarily on X’s lack of cooperation and the inadequacy of its mitigation measures rather than questioning the legality of the data processing itself. Schrems emphasised that X must fully comply with EU law by obtaining user consent before using their data. X has yet to respond to the latest complaint but says it intends to work with the DPC on AI-related issues.

In a related case, Meta, Facebook’s parent company, delayed the launch of its AI assistant in Europe after the Irish DPC advised against it, following similar complaints from NOYB regarding the use of personal data for AI training.

UK considers revising Online Safety Act amid riots

The British government is considering revisions to the Online Safety Act in response to a recent wave of racist riots allegedly fuelled by misinformation spread online. The act, passed in October but not yet enforced, currently allows the government to fine social media companies up to 10% of their global turnover if they fail to remove illegal content, such as incitement to violence or hate speech. However, proposed changes could extend these penalties to platforms that allow ‘legal but harmful’ content, such as misinformation, to thrive.

Britain’s Labour government inherited the act from the Conservatives, who had spent considerable time adjusting the bill to balance free speech with the need to curb online harms. A recent YouGov poll found that 66% of adults believe social media companies should be held accountable for posts inciting criminal behaviour, and 70% feel these companies are not sufficiently regulated. Additionally, 71% of respondents criticised social media platforms for not doing enough to combat misinformation during the riots.

In response to these concerns, Cabinet Office Minister Nick Thomas-Symonds announced that the government is prepared to revisit the act’s framework to ensure its effectiveness. London Mayor Sadiq Khan also voiced his belief that the law is not ‘fit for purpose’ and called for urgent amendments in light of the recent unrest.

Why does it matter?

The riots, which spread across Britain last week, were triggered by false online claims that the perpetrator of a 29 July knife attack, which killed three young girls, was a Muslim migrant. As tensions escalated, X owner Elon Musk contributed to the chaos by sharing misleading information with his large following, including a statement suggesting that civil war in Britain was ‘inevitable.’ Prime Minister Keir Starmer’s spokesperson condemned these comments, stating there was ‘no justification’ for such rhetoric.

X faces scrutiny for hosting extremist content

Concerns are mounting over content shared by the Palestinian militant group Hamas on X, the social media platform owned by Elon Musk. The Global Internet Forum to Counter Terrorism (GIFCT), which includes major companies like Facebook, Microsoft, and YouTube, is reportedly worried about X’s continued membership and position on its board, fearing it undermines the group’s credibility.

The Sunday Times reported that X has become the easiest platform on which to find Hamas propaganda videos, along with content from other UK-proscribed terrorist groups such as Hezbollah and Palestinian Islamic Jihad. Researchers were able to locate such videos within minutes on X.

Why does it matter?

These concerns come as X faces criticism for reducing its content moderation capabilities. The GIFCT’s independent advisory committee expressed alarm in its 2023 report, citing significant reductions in online trust and safety measures on specific platforms, implicitly pointing to X.

Elon Musk’s approach to turning X into a ‘free speech’ platform has included reinstating previously banned extremists, allowing paid verification, and cutting much of the moderation team. The shift has raised fears about X’s ability to manage extremist content effectively. Despite being a founding member of GIFCT, X has yet to meet its financial obligations to the group.

Additionally, the criticism Musk faced in Britain highlights a complex and still unresolved governance question: should policy prioritise freedom of speech, or subject Big Tech owners to closer scrutiny in the interest of community safety?

Ireland takes legal action against X over data privacy

The Irish Data Protection Commission (DPC) has launched legal action against the social media platform X, formerly Twitter, in a case that revolves around the processing of user data to train Grok, Musk’s AI large language model. The chatbot was developed by xAI, a company founded by Elon Musk, and serves as a search assistant for premium users on the platform.

The DPC is seeking a court order to stop or limit the processing of user data by X for training its AI systems, expressing concerns that this could violate the European Union’s General Data Protection Regulation (GDPR). The case may be referred to the European Data Protection Board for further review.

The legal dispute is part of a broader conflict between Big Tech companies and regulators over using personal data to develop AI technologies. Consumer organisations have accused X of breaching GDPR, a claim the company has vehemently denied, calling the DPC’s actions unwarranted and overly broad.

The Irish DPC plays an important role in overseeing X’s compliance with EU data protection laws, since the platform’s EU operations are managed from Dublin. The current legal proceedings could significantly shift how Ireland enforces GDPR against large tech firms.

The DPC is also concerned about X’s plans to launch a new version of Grok, which is reportedly being trained using data from the EU and European Economic Area users. The privacy watchdog argues that this could worsen existing issues with data processing.

Despite X implementing some mitigation measures, such as offering users an opt-out option, these steps were not in place when the data processing began, leading to further scrutiny from the DPC. X has resisted the DPC’s requests to halt data processing or delay the release of the new Grok version, leading to an ongoing court battle.

The outcome of this case could set a precedent for how AI and data protection issues are handled across Europe.