Alphabet’s YouTube announced it would comply with a court decision to block access to 32 video links in Hong Kong, a move critics argue infringes on the city’s freedoms amid tightening security measures. The decision followed a government application granted by Hong Kong’s Court of Appeal, targeting the protest anthem ‘Glory to Hong Kong’, with judges cautioning that dissidents could use it to incite secession.
Expressing disappointment, YouTube stated it would abide by the removal order while highlighting concerns regarding the chilling effect on online free expression. Observers, including the US government, voiced worries over the ban’s impact on Hong Kong’s reputation as a financial hub committed to the free flow of information.
Industry groups emphasised the importance of maintaining a free and open internet in Hong Kong, citing its significance in preserving the city’s competitive edge. The move reflects broader trends of tech companies complying with legal requirements, with Google parent Alphabet having previously restricted content in China.
Why does it matter?
Despite YouTube’s action, tensions persist over the erosion of freedoms in Hong Kong, underscored by ongoing international scrutiny and criticism of the city’s security crackdown on dissent. As the city grapples with balancing national security concerns and its promised autonomy under the ‘one country, two systems’ framework, the implications for its future as a global business centre remain uncertain.
A group of TikTok creators has taken legal action against the US federal government over a law signed by President Joe Biden. The law would either require the divestiture of the popular short video app or potentially ban it altogether. TikTok creators argue that the app has become integral to American life, with 170 million users nationwide.
Among those suing are individuals from diverse backgrounds and professions, including a Marine Corps veteran, a woman selling cookies, a college coach, a hip-hop artist, and an advocate for sexual assault survivors. Despite their differences, they all believe TikTok provides a unique platform for self-expression and community-building.
The lawsuit, filed by Davis Wright Tremaine LLP on behalf of the creators, alleges that the law infringes on free speech rights and threatens to eliminate an important communication medium. The White House has refrained from commenting on the matter, while the US Department of Justice asserts that the law addresses national security concerns while remaining within constitutional boundaries.
Why does it matter?
The ongoing legal battle echoes past disputes involving TikTok, including a similar lawsuit filed by the company and its parent company, ByteDance. Courts have previously intervened to block attempts to ban the app, citing concerns about free speech and constitutional rights.
The Delhi High Court has directed Google and Microsoft to file a review petition seeking the recall of a previous order that required search engines to promptly restrict access to non-consensual intimate images (NCII) without requiring victims to repeatedly provide specific URLs. Both tech giants argued that proactively identifying and taking down NCII images is technologically infeasible, even with the assistance of AI tools.
The court’s order stems from a 2023 ruling requiring search engines to remove NCII within 24 hours, as per the IT Rules, 2021, or risk losing their safe harbour protections under Section 79 of the IT Act, 2000. That ruling proposed issuing a unique token upon the initial takedown, with search engines responsible for taking down any resurfaced content using pre-existing technology, sparing victims the burden of tracking and repeatedly reporting specific URLs. It also suggested leveraging hash-matching technology and developing a ‘trusted third-party encrypted platform’ where victims could register NCII content or URLs, shifting the responsibility for identifying and removing resurfaced content from victims onto the platforms while maintaining high standards of transparency and accountability.
However, Google expressed concerns regarding automated tools’ inability to discern consent in shared sexual content, potentially leading to unintended takedowns and infringing on free speech, echoing Microsoft’s apprehension about the implications of proactive monitoring on privacy and freedom of expression.
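The token-and-hash scheme the court described can be sketched in a few lines. The following is a minimal illustration, not any real platform’s implementation: all class and function names are hypothetical, and it uses exact SHA-256 matching, whereas production systems typically rely on perceptual hashing to catch re-encoded copies.

```python
import hashlib
import secrets

class TakedownRegistry:
    """Illustrative registry for the court's proposed scheme: a unique
    token is issued on the first takedown, and the content's hash is
    stored so resurfaced copies can be matched automatically, without
    the victim re-reporting each new URL."""

    def __init__(self):
        self._hashes = {}  # content hash -> takedown token

    def register_takedown(self, content: bytes) -> str:
        """Record removed content and issue a unique takedown token."""
        digest = hashlib.sha256(content).hexdigest()
        token = secrets.token_hex(16)
        self._hashes[digest] = token
        return token

    def match(self, content: bytes):
        """Return the token if this content was previously taken down."""
        return self._hashes.get(hashlib.sha256(content).hexdigest())

registry = TakedownRegistry()
token = registry.register_takedown(b"reported-image-bytes")
assert registry.match(b"reported-image-bytes") == token  # resurfaced copy caught
assert registry.match(b"some-other-image") is None       # unrelated content passes
```

Exact byte-level hashing fails once an image is cropped or re-compressed, which is why the tech companies’ objection centres on the harder problems of robust matching and, as Google noted, the impossibility of automatically inferring consent.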
An Australian court has denied the cyber safety regulator’s attempt to extend an order for Elon Musk’s X to block videos depicting the stabbing of an Assyrian church bishop, labelled as a terrorist attack. The Federal Court judge, Geoffrey Kennett, rejected the bid to prolong the injunction, with reasons for the decision to be disclosed later.
The legal clash has fueled tensions between Musk and senior figures in Australia, including Prime Minister Anthony Albanese, who criticised Musk as ‘an arrogant billionaire’ for resisting the video’s takedown. Musk responded by posting memes, condemning the regulatory order as censorship. While other platforms like Meta swiftly removed the content upon request, X has been persistent in its refusal to remove the posts globally, arguing against one country’s rules dictating internet content.
Last month, the Federal Court upheld the eSafety Commissioner’s order for X to remove 65 posts containing the violent footage of the bishop’s stabbing during a sermon in Sydney. The incident, for which a 16-year-old boy has been charged with a terrorism offence, prompted Australia to block local access to the posts. However, the regulator contested X’s proposal to geo-block Australians, claiming it was ineffective due to the widespread use of virtual private networks to conceal users’ locations.
In response to the rising concerns over social media influence, Albanese’s government has announced plans for a parliamentary inquiry to investigate the adverse effects of online platforms. The inquiry aims to address the control social media exerts over Australians’ online content consumption, highlighting a lack of oversight.
Australia has announced a parliamentary inquiry into the impact of social media platforms, responding to growing concerns over their influence on public discourse and the spread of harmful content. Prime Minister Anthony Albanese, in his address, underscored the need for greater scrutiny, acknowledging that while social media can be a force for good, it also wields a damaging influence on issues as grave as domestic violence and radicalisation.
The government’s move comes amid criticism of platforms like Meta’s Facebook, ByteDance’s TikTok, and Elon Musk’s X for handling violent posts and content moderation. X, in particular, is embroiled in a legal dispute with the Australian government over its refusal to globally remove videos of a recent stabbing attack on an Assyrian church bishop in Sydney. The government argues for broader content removal, while Musk has characterised the decision as censorship.
The inquiry will also examine Meta’s decision to stop paying for news content in Australia, reflecting broader concerns about the role of social media in shaping public discourse and its impact on traditional media. Communications Minister Michelle Rowland stressed the importance of understanding how social media companies regulate content and called for greater accountability in their decision-making processes.
As Parliament gears up for the inquiry, its terms and scope are still being determined. The aim is to scrutinise the practices of social media companies and make recommendations for accountability measures. The inquiry may involve summoning individuals to testify, underscoring the government’s commitment to addressing concerns surrounding social media regulation and content moderation. Its outcomes will help shape the future of social media regulation in Australia.
Representing the regulator, Tim Begbie emphasised that while X has its own policies for removing harmful content, these should not override Australian law. He criticised X’s stance, arguing that its refusal to remove content globally bears on what counts as ‘reasonable’ steps under Australia’s Online Safety Act, and that X’s geo-blocking is ineffective given the widespread use of VPNs.
Bret Walker, X’s lawyer, defended the company’s actions, stressing the need for global access to newsworthy content. He expressed concern over restricting global access based on Australian laws and emphasised the importance of allowing individuals to form their own opinions.
The Federal Court of Australia has extended a temporary takedown order on the posts until 10 June, delaying a final decision. The case underscores the debate over internet regulation and free speech, with implications for global content moderation and national sovereignty.
TikTok has filed a lawsuit against the US government, challenging a new law that requires the app to sever ties with its Chinese parent company, ByteDance, or face a ban in the US. The company argues that the law is unconstitutional and that divesting the app from ByteDance is not feasible, stating that the law would instead force a shutdown by 19 January 2025.
The law, signed by President Joe Biden last month, grants ByteDance nine months to divest TikTok or cease its operations in the US, citing national security concerns. TikTok’s complaint argues that the government has not presented sufficient evidence of the Chinese government misusing the app: the concerns expressed by individual members of Congress and a congressional committee report speculate about potential future misuse of TikTok without citing specific instances of misconduct. TikTok adds that it has operated prominently in the US since its launch in 2017.
TikTok contends that a forced sale would be unfeasible given the complex task of transferring millions of lines of software code from ByteDance to a new owner, and that restrictions imposed by the Chinese government would prevent the sale of TikTok along with its algorithm. The company argues that a ban would effectively isolate American users and undermine its business, also citing its previous efforts to address US government concerns.
During the Trump administration, discussions were held regarding partnerships with American companies such as Walmart, Microsoft, and Oracle to separate TikTok’s US operations. However, these potential deals have yet to materialise. TikTok also attempted to appease the government by storing US user data in Oracle’s servers, although a recent report suggests that this action was primarily cosmetic.
In response to the new law, TikTok seeks a court judgement declaring the legislation unconstitutional, along with an order barring the attorney general from enforcing it.
Ukraine’s military intelligence agency, GUR, revealed that Telegram had blocked multiple official bots critical of Russia’s military actions in Ukraine. In a statement on the messaging platform, GUR expressed dissatisfaction with the decision, citing the significance of these bots in opposing Russian aggression, while assuring users that their personal data remained safe. Telegram’s press service did not respond to inquiries. The platform has served as a vital hub for accessing unfiltered information since Russia invaded Ukraine in 2022.
In response to Ukraine’s concerns, Telegram announced restoring access to several chatbots used by Ukraine’s security agencies to collect information about Russia’s war effort. Initially reported by GUR, the suspension prompted concerns regarding censorship amid the ongoing conflict. Telegram bots, automated features enabling users to submit or request information, play a vital role in Ukraine’s response to Russian aggression, facilitating the reporting of Russian military activities within Ukraine.
Although the specific reason for the bots’ temporary suspension remains undisclosed, a Telegram spokesperson attributed it to a ‘false positive.’ Despite occasional disruptions, Telegram remains a primary source of unfiltered information for users in Ukraine and Russia, with President Zelenskiy utilising it for daily video addresses and the armed forces employing it to alert Ukrainians of impending airstrikes and document battlefield developments.
Elon Musk’s feud with Australian authorities reached new heights as he advocated for the imprisonment of a senator and criticised the country’s gun laws in the wake of a court order targeting his platform, X. The dispute stemmed from X’s publication of a video depicting a knife attack on an Assyrian bishop during a church service in Sydney, prompting the federal court to temporarily halt the video’s display.
‘Our concern is that if ANY country is allowed to censor content for ALL countries, which is what the Australian “eSafety Commissar” is demanding, then what is to stop any country from controlling the entire Internet?’ Musk wrote on X.
In response to the court order, Musk accused Australian leaders of attempting to censor the internet, sparking condemnation from lawmakers and prompting Senator Jacqui Lambie to delete her X account in protest. Lambie called for Musk’s imprisonment, labelling him as ‘lacking a social conscience’. Musk, in turn, labelled Lambie as an ‘enemy of the people of Australia.’
Musk’s combative approach towards governments extends beyond Australia, as seen in his clashes with authorities in Brazil over social media content oversight. He further escalated tensions by endorsing posts criticising Australia’s gun laws and government, reacting with exclamation marks and amplifying messages questioning the integrity of Australian governance.
The legal battle between Musk’s platform and Australian authorities intensified during a court hearing, where X was accused of failing to fully comply with the temporary takedown order. Despite claims of compliance, the video remained accessible on X in Australia. The federal court judge extended the temporary takedown order until further hearings, citing the need for continued deliberation over the contentious issue.
The Senate has passed a foreign aid package that includes a bill mandating China-based company ByteDance to sell TikTok within a year or face a US ban on the platform. Having cleared both chambers of Congress, the legislation is now headed to President Joe Biden, who has committed to signing it into law. ByteDance will have an initial nine months to finalise a sale, with a possible three-month extension based on progress, though legal challenges could delay enforcement.
The bill’s successful passage through the Senate was achieved through strategic manoeuvring in the House, where it was included in a high-priority foreign aid package. This move compelled the Senate to address the TikTok issue earlier than anticipated. By extending the divestment timeline, more support was garnered in the Senate, resulting in a vote of 79-18 in favour of the bill.
Lawmakers and intelligence officials have voiced concerns over TikTok’s ownership by a China-based company. They cite potential data security risks due to China’s national security law and fear that the Chinese government’s influence could impact US user experiences.
Senate Commerce Committee Chair Maria Cantwell stressed that the legislation aims to prevent foreign adversaries from conducting espionage and harming vulnerable Americans, not to punish specific companies.
Senate Intelligence Committee Chair Mark Warner highlighted worries about Chinese companies owing allegiance to the Chinese government and potential covert manipulation of social media platforms. He dismissed TikTok’s proposed data governance solution, Project Texas, as inadequate. Despite concerns among TikTok users, Warner assured that the legislation is not about silencing voices but addressing critical national security issues.
President Biden has expressed intent to promptly sign the bill into law to facilitate aid to Ukraine, while TikTok has signalled readiness to challenge the law in court if passed.