Israeli spyware deal reports denied by US and Israel

Officials from the United States and Israel have denied claims that they approved the sale of Israeli spyware firm Paragon to Florida-based AE Industrial Partners. Reports of the transaction surfaced in Israeli media, suggesting both governments had greenlit the deal, but US and Israeli representatives dismissed these assertions.

The White House clarified that the sale was a private transaction with no formal US approval, while Israel’s Defence Ministry stated it was still evaluating the deal. Paragon, linked to former Israeli intelligence officers, has faced scrutiny in the US market, including a paused $2 million contract with ICE.

The alleged acquisition has drawn attention due to Paragon’s ties to national security and controversial surveillance software. Neither AE Industrial Partners nor Paragon has commented on the situation.

Trump signals support for TikTok amid national security debate

President-elect Donald Trump hinted at allowing TikTok to continue operating in the US, at least temporarily, citing the platform’s significant role in his presidential campaign. Speaking to conservative supporters in Phoenix, Arizona, Trump shared that his campaign content had garnered billions of views on TikTok, describing it as a “beautiful” success that made him reconsider the app’s future.

TikTok’s parent company, ByteDance, has faced pressure from US lawmakers to divest the app over national security concerns, with allegations that Chinese control of TikTok poses risks to American data. The US Supreme Court is set to decide on the matter, as ByteDance challenges a law that could force divestment. Without a favourable ruling or compliance with the law, TikTok could face a US ban by January 19, just before Trump takes office.

Trump’s openness to TikTok contrasts with bipartisan support for stricter measures against the app. While the Justice Department argues that Chinese ties to TikTok remain a security threat, TikTok counters that its user data and operations are managed within the US, with storage handled by Oracle and moderation decisions made domestically. Despite ongoing legal battles, Trump’s remarks and a recent meeting with TikTok’s CEO suggest he sees potential in maintaining the platform’s presence in the US market.

Senators push Biden to extend TikTok sale deadline amid legal uncertainty

Democratic Senator Ed Markey and Republican Senator Rand Paul are urging President Joe Biden to extend the January 19 deadline for ByteDance, the China-based owner of TikTok, to sell the app’s US assets or face a nationwide ban. The Supreme Court is set to hear arguments on January 10 regarding ByteDance’s legal challenge, which claims the law mandating the sale violates First Amendment free speech rights. In their letter to Biden, the senators highlighted the potential consequences for free expression and the uncertain future of the law.

The controversial legislation, signed by Biden in April, was passed due to national security concerns. The Justice Department asserts that TikTok’s vast data on 170 million American users poses significant risks, including potential manipulation of content. TikTok, however, denies posing any threat to US security.

The debate has split lawmakers. Senate Minority Leader Mitch McConnell supports enforcing the deadline, while President-elect Donald Trump has softened his stance, expressing support for TikTok and suggesting he would review the situation. The deadline falls just a day before Trump is set to take office on January 20, adding to the uncertainty surrounding the app’s fate.

US Supreme Court to hear TikTok’s bid to block ban

The US Supreme Court has agreed to review a case involving TikTok and its Chinese parent company, ByteDance, in a challenge against a law requiring the app’s sale or a ban in the US by January 19. The court will hear arguments on January 10 but has not yet decided on TikTok’s request to block the law, which it claims violates free speech rights under the First Amendment. TikTok, used by 170 million Americans, argues the law would harm its operations and user base, while US officials cite national security concerns over data access and content manipulation.

The Justice Department has labelled TikTok a significant security risk due to its Chinese ownership, while TikTok denies posing any threat and accuses lawmakers of speculation. The law, passed in April and signed by President Biden, would ban the app unless ByteDance divests its ownership. The company warns that even a temporary shutdown could damage its US market share, advertising revenue, and ability to recruit creators and staff.

The case also reflects heightened tensions between the US and China over technology and trade policies. TikTok’s fate could set a precedent for the treatment of other foreign-owned apps, raising questions about free speech and digital commerce. The Supreme Court’s decision may have far-reaching implications for the platform’s future and US-China relations.

Protecting journalists online with global solutions from IGF 2024

The safety of journalists online took centre stage during an open forum at IGF 2024 in Riyadh. Experts and audience members shared insights on the growing threats faced by journalists globally, including online harassment, surveillance, and censorship. Discussions underscored how these challenges disproportionately affect women journalists and individuals from marginalised communities.

Panellists such as Isabelle Lois from Switzerland and Bruna Martins dos Santos from Brazil emphasised the urgent need for stronger legal frameworks and better implementation of existing laws. Digital platforms were urged to increase accountability for online attacks, while media organisations were encouraged to provide robust support systems for their journalists. Gulalai Khan from Pakistan highlighted the importance of digital literacy and ethical reporting in navigating online threats.

Debates also addressed the evolving definition of journalism in the digital age, questioning whether protections should extend to citizen journalists and content creators. Giulia Lucchese from the Council of Europe pointed to positive initiatives like Switzerland’s National Action Plan and European campaigns on journalist safety as steps in the right direction. However, participants agreed on the need for greater international collaboration to amplify these efforts.

The session concluded with a call for multi-stakeholder approaches to foster trust and ensure journalist safety. Speakers stressed that governments, tech companies, and civil society must work together to protect press freedom in democratic societies. Overall, the forum highlighted both ongoing challenges and the importance of collective action to safeguard journalists in an increasingly digital world.

All transcripts from the Internet Governance Forum sessions can be found on dig.watch.

Tackling internet fragmentation: A global challenge at IGF 2024

At the Internet Governance Forum (IGF) 2024 in Riyadh, the main session ‘Policy Network on Internet Fragmentation’ delved into implementing Article 29C of the Global Digital Compact (GDC), which seeks to prevent internet fragmentation. A diverse panel comprising government officials, technical experts, and civil society representatives highlighted the multifaceted nature of this issue and proposed actionable strategies to address it.

The scope of internet fragmentation

Panellists underscored that internet fragmentation manifests on technical, governance, and user experience levels. While the global network of more than 70,000 interconnected autonomous systems remains technically unified, fragmentation is evident in user experiences. Anriette Esterhuysen from the Association for Progressive Communications pointed out, ‘How you view the internet as fragmented or not depends on whose internet you think it is.’ She stressed that billions face access and content restrictions, fragmenting their digital experience.

Gbenga Sesan of Paradigm Initiative echoed this concern, noting that fragmentation undermines the goal of universal connectivity by 2030. The tension between a seamless technical infrastructure and fractured user realities loomed large in the discussion.

Operationalising the GDC commitment

Alisa Heaver from the Dutch Ministry of Economic Affairs and Climate Policy highlighted the critical role of Article 29C as a blueprint for preventing fragmentation. She called for a measurable framework to track progress by the GDC’s 2027 review, emphasising that research on the economic impacts of fragmentation must be prioritised. ‘We need to start measuring internet fragmentation now more than ever,’ Heaver urged.

Strategies for collaboration and progress

Multistakeholder cooperation emerged as a cornerstone for addressing fragmentation. Wim Degezelle, a consultant with the IGF Secretariat, presented the Policy Network on Internet Fragmentation (PNIF) framework, while Amitabh Singhal of ICANN highlighted the IGF’s unique position in bridging technical and policy divides. Singhal also pointed to the potential renewal of the IGF’s mandate as pivotal in continuing these essential discussions.

The session emphasised inclusivity in technical standard-setting processes, with Sesan advocating for civil society’s role and audience members calling for stronger private sector engagement. Sheetal Kumar, co-facilitator of the session, stressed the importance of leveraging national and regional IGFs to foster localised dialogues on fragmentation.

Next steps and future outlook

The panel identified key actions, including developing measurable frameworks, conducting economic research, and utilising national and regional IGFs to sustain discussions. The upcoming IGF in 2025 was flagged as a milestone for assessing progress. Despite the issue’s complexity, the panellists were united in their commitment to fostering a more inclusive and seamless internet.

As Esterhuysen aptly summarised, addressing internet fragmentation requires a concerted effort to view the digital landscape through diverse lenses. This session reaffirmed that preventing fragmentation is not just a technical challenge but a deeply human one, demanding collaboration, research, and sustained dialogue.

All transcripts from the Internet Governance Forum sessions can be found on dig.watch.

Experts at IGF 2024 address the dual role of AI in elections, emphasising empowerment and challenges

At IGF 2024, panellists explored AI’s role in elections, its potential for both empowerment and disruption, and the challenges it poses to democratic processes. Moderator Tapani Tarvainen led the discussion with contributions from Ayobangira Safari Nshuti, Roxana Radu, Babu Ram Aryal, and other experts.

Speakers noted that AI had been primarily used for self-promotion in campaigns, helping smaller candidates compete with limited resources. Roxana Radu highlighted AI’s positive role in voter outreach in India but warned of risks such as disinformation and public opinion manipulation. Ayobangira Safari Nshuti pointed to algorithmic biases and transparency issues in platforms as critical concerns, emphasising a recent case in Romania where AI-enabled manipulation caused election disruption.

Accountability of social media platforms became a focal point. Platforms increasingly rely on AI for content moderation, but their effectiveness in languages with limited online presence remains inadequate. Babu Ram Aryal stressed the need for stronger oversight, particularly in multilingual nations, while Dennis Redeker underscored the challenges of balancing regulation with free speech.

Panellists called for holistic solutions to safeguard democracy. Suggestions included enhancing platform transparency, implementing robust digital literacy programmes, and addressing social factors like poverty that exacerbate misinformation. Nana, an AI ethics specialist, advocated for proactive governance to adapt electoral institutions to technological realities.

The session concluded with a recognition that AI’s role in elections will continue to evolve. Panellists urged collaborative efforts between governments, civil society, and technology companies to ensure election integrity and maintain public trust in democratic systems.

Human rights concerns over UN Cybercrime Treaty raised at IGF 2024

A panel discussion at the Internet Governance Forum (IGF) raised serious concerns over the UN Cybercrime Treaty and its potential to undermine human rights. Experts from organisations such as Human Rights Watch and the Electronic Frontier Foundation criticised the treaty’s broad scope and lack of clear safeguards for individual freedoms. They warned that the treaty’s vague language, particularly around what constitutes a ‘serious crime,’ could empower authoritarian regimes to exploit its provisions for surveillance and repress dissent.

Speakers such as Joey Shea from Human Rights Watch and Lina al-Hathloul, a Saudi human rights defender, pointed out the risks posed by the treaty’s expansive investigative powers, which extend beyond cybercrimes to any crimes defined by domestic law. Such flexibility could oblige countries to assist in prosecuting acts that are not crimes within their own borders. They also highlighted the treaty’s weak privacy protections, which could jeopardise encryption standards and put cybersecurity researchers at further risk.

Deborah Brown from Human Rights Watch and Veridiana Alimonti of the Electronic Frontier Foundation shared examples from Saudi Arabia and Latin America, where existing cybercrime and anti-terrorism laws have already been used to target journalists and activists. The panellists expressed concern that the treaty could exacerbate these abuses globally, especially for cybersecurity professionals and civil society.

Fionnuala Ni Aolain, a former UN Special Rapporteur on counterterrorism and human rights, emphasised that the treaty’s provisions could lead to criminalising the vital work of cybersecurity researchers. She joined other experts in urging policymakers and industry leaders to resist ratification in its current form. They called for upcoming protocol negotiations to address these human rights gaps and for greater involvement of civil society voices to prevent the treaty from becoming a tool for transnational repression.

IGF 2024 addresses cybercrime laws in Africa and the Middle East

Discussions at the IGF 2024 in Riyadh shed light on growing challenges to freedom of expression in Africa and the Middle East. Experts from diverse organisations highlighted how restrictive cybercrime legislation and content regulation have been used to silence dissent, marginalise communities, and undermine democracy. Examples from Tunisia and Nigeria revealed how critics and activists often face criminalisation under these laws, fostering fear and self-censorship.

Panellists included Annelies Riezebos from the Dutch Ministry of Foreign Affairs, Jacqueline Rowe of the University of Edinburgh, Adeboye Adegoke from Paradigm Initiative, and Aymen Zaghdoudi of AccessNow. They discussed the negative effects of vague cybercrime regulations and overly broad restrictions on online speech, which frequently suppress political discourse. Maria Paz Canales from Global Partners Digital added that content governance frameworks need urgent reform to balance addressing online harms with protecting fundamental rights.

The speakers emphasised that authoritarian values are being enforced through legislation that criminalises disinformation and imposes ambiguous rules on online platforms. These measures, they argued, contribute to a deteriorating climate for free expression across the region. They also pointed out the need for online platforms to adopt responsible content moderation practices while resisting pressures to conform to repressive local laws.

Panellists proposed several strategies to counter these trends, including engaging with parliamentarians, building capacity among legal professionals, and ensuring civil society’s involvement during the early stages of policy development. The importance of international collaboration was underlined, with the UN Cybercrime Treaty cited as a key opportunity for collective advocacy against repressive measures.

Participants also stressed the urgency of increased representation of Global South organisations in global policy discussions. Flexible funding for civil society initiatives was described as essential for supporting grassroots efforts to defend digital rights. Such funding would enable local groups to challenge restrictive laws effectively and amplify their voices in international debates.

The event concluded with a call for multi-stakeholder approaches to internet governance. Collaborative efforts involving governments, civil society, and online platforms were deemed critical to safeguarding freedom of expression. The discussions underscored the pressing need to balance addressing legitimate online harms with protecting democratic values and the voices of vulnerable communities.

All transcripts from the Internet Governance Forum sessions can be found on dig.watch.

International experts converge at IGF 2024 to promote digital solidarity in global governance

A panel of international experts at the IGF 2024 gathered to discuss the growing importance of digital solidarity in global digital governance. Jennifer Bachus of the US State Department introduced the concept as a framework for fostering international cooperation centred on human rights and multi-stakeholder engagement. Nashilongo Gervasius, a public interest technology expert from Namibia, highlighted the need to close digital divides and promote inclusivity in global digital policymaking.

The discussion focused on balancing digital sovereignty with the need for international collaboration. Jason Pielemeier, Executive Director of the Global Network Initiative, stressed the critical role of data privacy and cybersecurity in advancing global digital rights. Robert Opp, Chief Digital Officer at the United Nations Development Programme, emphasised the importance of capacity building and enhancing digital infrastructure, particularly in developing nations.

Key global mechanisms like the Internet Governance Forum (IGF) and the World Summit on the Information Society (WSIS) processes featured prominently in the dialogue. Panellists, including Susan Mwape from Zambia, underscored the need to strengthen these platforms while ensuring they remain inclusive and respectful of human rights. The upcoming WSIS+20 review was recognised as an opportunity to revitalise international cooperation in the digital realm.

Challenges such as internet shutdowns, mass surveillance, and the misuse of cybercrime legislation were debated. Mwape voiced concerns about the potential for international forums to lose credibility if hosted by countries with poor human rights records. Audience member Barbara from Nepal called for greater accountability in digital governance practices, while Hala Rasheed from the Alnahda Society echoed the urgency of addressing inequalities in digital policy implementation.

Russian civil society representative Alexander Savnin brought attention to the impact of sanctions on international technical cooperation in cybersecurity. He argued for a more balanced approach that would allow global stakeholders to address shared security challenges effectively. Panellists agreed that fostering trust among diverse actors remains a critical hurdle to achieving digital solidarity.

The session concluded with a commitment to fostering continuous dialogue and collaboration. Panellists expressed hope that inclusive and rights-based approaches could transform digital solidarity into tangible solutions, helping to address the pressing challenges of the digital age.