Protecting critical infrastructure in a fragile cyberspace

‘Securing Critical Infrastructure in Cyber: Who and How?’ was the title of one of the main panels at IGF 2024 in Riyadh, where participants discussed the complexities of identifying, securing, and cooperating to protect critical systems from cyber threats. The session, part of the Geneva Dialogue project, focused on safeguarding critical infrastructure and implementing international cyber norms.

The dialogue highlighted the elusive nature of defining critical infrastructure, as interpretations vary widely across nations. ‘Understanding critical infrastructure begins with impact analysis, but what happens if these systems fail?’ noted Nicolas Grunder from ABB, underscoring the need for clarity. Regional interdependencies further complicate matters, as cascading failures in energy, transportation, or cloud services can cripple interconnected sectors, a scenario brought to life through a fictional cyberattack simulation on a cloud provider.

Baseline cybersecurity measures emerged as a priority, focusing on asset inventories, supply chain security, and resilience planning. Kazuo Noguchi of Hitachi America emphasised the mantra of ‘backup, backup, backup’, advocating for distributed systems across regions to mitigate single points of failure. Practical measures like incident response plans, vulnerability management, and operator awareness training were cited as essential components of any security framework.

The role of international cyber norms and confidence-building measures (CBMs) sparked debate. While voluntary, norms such as avoiding attacks on critical infrastructure during peacetime provide a foundation for responsible state behaviour. Yet, as Kaleem Usmani of CERT Mauritius pointed out, ‘Norms reduce risks and foster cooperation, but accountability remains a challenge.’ Regional collaboration, such as harmonised security certifications, was proposed as a pragmatic solution to bridge gaps in global standards.

Amid growing geopolitical complexities, participants called for greater transparency and cooperation. Bushra AlBlooshi from the Dubai Electronic Security Center showcased Dubai’s approach, where interdependencies between sectors like power and transportation are mapped to preempt disruptions. However, securing systems reliant on foreign service providers adds another layer of vulnerability, prompting calls for international agreements to establish untouchable ‘red lines’ for critical infrastructure in peace and war.

Parliamentary panel at IGF discusses ICTs and AI in counterterrorism efforts

At the 2024 Internet Governance Forum (IGF) in Riyadh, a panel of experts explored how parliaments can harness information and communication technologies (ICTs) and AI to combat terrorism while safeguarding human rights. The session, titled ‘Parliamentary Approaches to ICT and UN SC Resolution 1373,’ emphasised the dual nature of these technologies—as tools for both law enforcement and malicious actors—and highlighted the pivotal role of international collaboration.

Legislation and oversight in a digital era

David Alamos, Chief of the UNOCT programme on Parliamentary Engagement, set the stage by underscoring the responsibility of parliaments to translate international frameworks like UN Security Council Resolution 1373 into national laws. ‘Parliamentarians must allocate budgets and exercise oversight to ensure counterterrorism efforts are both effective and ethical,’ Alamos stated.

Akvile Giniotiene of the UN Office of Counterterrorism echoed this sentiment, emphasising the need for robust legal frameworks to empower law enforcement in leveraging new technologies responsibly.

Opportunities and risks in emerging technologies

Panelists examined the dual role of ICTs and AI in counterterrorism. Abdelouahab Yagoubi, a member of Algeria’s National Assembly, highlighted AI’s potential to enhance threat detection and predictive analysis.

Jennifer Bramlette from the UN Counterterrorism Committee stressed the importance of digital literacy in fortifying societal resilience, while Kamil Aydin and Emanuele Loperfido of the OSCE Parliamentary Assembly cautioned against the misuse of these technologies, pointing to risks like deepfakes and cybercrime-as-a-service, which enable terrorist propaganda and disinformation campaigns.

The case for collaboration

The session spotlighted the critical need for international cooperation and public-private partnerships to address the cross-border nature of terrorist threats. Giniotiene called for enhanced coordination mechanisms among nations, while Yagoubi praised the Parliamentary Assembly of the Mediterranean for fostering knowledge-sharing on AI’s implications.

‘No single entity can tackle this alone,’ Alamos remarked, advocating for UN-led capacity-building initiatives to support member states.

Balancing security with civil liberties

A recurring theme was the necessity of balancing counterterrorism measures with the protection of human rights. Loperfido warned against the overreach of security measures, noting that ethical considerations must guide the development and deployment of AI in law enforcement.

An audience query on the potential misuse of the term ‘terrorism’ further underscored the importance of safeguarding civil liberties within legislative frameworks.

Looking ahead

The panel concluded with actionable recommendations, including updating the UN Parliamentary Handbook on Resolution 1373, investing in digital literacy, and ensuring parliamentarians are well-versed in emerging technologies.

‘Adapting to the rapid pace of technological advancement while maintaining a steadfast commitment to the rule of law is paramount,’ Alamos said, encapsulating the session’s ethos. The discussion underscored the indispensable role of parliaments in shaping a global counterterrorism strategy that is both effective and equitable.

US grants $406 million to boost GlobalWafers production

The US Commerce Department has finalised $406 million in grants to Taiwan’s GlobalWafers to boost silicon wafer production in Texas and Missouri. These funds will support the first large-scale US production of 300-mm wafers, critical components in advanced semiconductors. This initiative is part of the Biden administration’s effort to strengthen the domestic supply chain for chips.

The grant will aid GlobalWafers’ nearly $4 billion investment in building new manufacturing facilities, creating 1,700 construction jobs and 880 permanent manufacturing positions. The company plans to produce wafers for cutting-edge, mature-node, and memory chips in Sherman, Texas, and wafers for defence and aerospace chips in St. Peters, Missouri.

GlobalWafers’ CEO Doris Hsu expressed enthusiasm about collaborating with US-based customers for years to come. Currently, over 80% of the global 300-mm silicon wafer market is controlled by just five companies, with most production concentrated in East Asia.

This funding is part of the $52.7 billion CHIPS and Science Act, aimed at expanding domestic semiconductor manufacturing. Recent grants include $6.165 billion for Micron Technology and significant subsidies for Intel, TSMC, and GlobalFoundries.

TikTok appeals to Supreme Court to block looming US ban

TikTok and its parent company, ByteDance, have asked the Supreme Court to halt a US law that would force ByteDance to sell TikTok by 19 January or face a nationwide ban. The companies argue that the law violates the First Amendment, as it targets one of the most widely used social media platforms in the United States, which currently has 170 million American users. A group of TikTok users also submitted a similar request to prevent the shutdown.

The law, passed by Congress in April, reflects concerns over national security. The Justice Department claims TikTok poses a threat due to its access to vast user data and potential for content manipulation by a Chinese-owned company. A lower court in December upheld the law, rejecting TikTok’s argument that it infringes on free speech rights. TikTok maintains that users should be free to decide for themselves whether to use the app and that shutting it down for even a month could cause massive losses in users and advertisers.

With the ban set to take effect the day before President-elect Donald Trump’s inauguration, TikTok has urged the Supreme Court to decide by 6 January. Trump, who once supported banning TikTok, has since reversed his position and expressed willingness to reconsider. The case highlights rising trade tensions between the US and China and could set a precedent for other foreign-owned apps operating in America.

Hundreds arrested in Nigerian fraud bust targeting victims globally

Nigerian authorities have arrested 792 people in connection with an elaborate scam operation based in Lagos. The suspects, including 148 Chinese and 40 Filipino nationals, were detained during a raid on the Big Leaf Building, a luxury seven-storey complex that allegedly housed a call centre targeting victims in the Americas and Europe.

The fraudsters reportedly used social media platforms such as WhatsApp and Instagram to lure individuals with promises of romance or lucrative investment opportunities. Victims were then coerced into transferring funds for fake cryptocurrency ventures. Nigeria’s Economic and Financial Crimes Commission (EFCC) revealed that local accomplices were recruited to build trust with targets, before handing them over to foreign organisers to complete the scams.

The EFCC spokesperson stated that agents had seized phones, computers, and vehicles during the raid and were working with international partners to investigate links to organised crime. This operation highlights the growing use of sophisticated technology in transnational fraud, as well as Nigeria’s commitment to combating such criminal activities.

US firm buys Israeli spyware company

Florida-based AE Industrial Partners has acquired Israeli spyware company Paragon for an estimated $500 million, with reports suggesting the deal could reach up to $900 million. Paragon, a competitor to NSO Group, is known for providing cybersecurity tools to government agencies that it claims meet ‘enlightened democracy’ standards. The acquisition was completed on 13 December and reportedly approved by both US and Israeli officials.

Paragon, founded in 2019 by former Israeli intelligence officers and backed by ex-Prime Minister Ehud Barak, is merging with Virginia-based cybersecurity firm Red Lattice. This move aims to strengthen the firm’s presence in the global surveillance market. The US subsidiary of Paragon recently signed a one-year contract with US Immigration and Customs Enforcement, reflecting its growing footprint in government cybersecurity services.

The acquisition comes amid tightened scrutiny of spyware technologies after allegations of abuse involving competitors like NSO Group. In 2021, the US added NSO to its trade blacklist, citing its misuse in targeting activists and journalists. Paragon, however, positions itself as a provider of ethically guided surveillance tools, limiting its activities to messaging apps and governmental communications.

Revitalising trust with AI: Boosting governance and public services

AI is reshaping public governance, offering innovative ways to enhance services and restore trust in institutions. The discussion at the Internet Governance Forum (IGF) in Riyadh, moderated by Brandon Soloski of Meridian International, focused on using AI to streamline services like passport processing and tax systems, while also addressing privacy and data sovereignty concerns. Open-source AI was highlighted as a critical tool for democratising access and fostering innovation, particularly in developing nations.

Global regulatory frameworks were a central theme, with panellists underscoring the need for harmonisation to avoid fragmentation and ensure seamless interoperability across borders. Economist and policy analyst at the OECD, Lucia Russo, discussed regulatory approaches such as the EU AI Act, which aims to create a comprehensive legal framework. Brandon Soloski and Sarim Aziz from Meta pointed to the benefits of principle-based frameworks in other regions, which provide flexibility while maintaining oversight. Pellerin Matis, Vice President of Global Government Affairs at Oracle, emphasised the importance of public-private partnerships, which allow governments to leverage private sector expertise and startup innovation for effective AI implementation.

The panellists explored how AI can enhance public services, highlighting its role in healthcare, agriculture, and public safety. Examples included AI-driven tools that improve patient care and streamline food production. However, challenges like data protection, trust in AI systems, and the balance between innovation and regulation were also discussed. Anil Pura, an audience member from Nepal, contributed valuable perspectives on the need for education and transparency to foster public trust.

Transparency and education were recognised as fundamental for building trust in AI adoption. Panellists agreed that ensuring citizens understand how AI technologies work and how their data is protected is essential for encouraging adoption. They called for governments to work closely with civil society and academia to create awareness and promote responsible AI use.

The discussion concluded with a call to strengthen collaborations between governments, private companies, and startups. Brandon Soloski highlighted how partnerships could drive responsible AI innovation, while Pellerin Matis stressed the importance of ethical and regulatory considerations to guide development. The session ended on an optimistic note, with panellists agreeing on AI’s immense potential to improve government efficiency and enhance public trust.

All transcripts from the Internet Governance Forum sessions can be found on dig.watch.

Dynamic Coalitions: Bridging digital divides and shaping equitable online governance

The session ‘Dynamic Coalitions and the Global Digital Compact’ at IGF 2024 in Riyadh highlighted the significant role of Dynamic Coalitions (DCs) in advancing the Global Digital Compact’s (GDC) objectives. Moderated by Jutta Croll, the discussion served as a platform to illustrate the alignment of DC efforts with the GDC’s goals, emphasising the need for broader collaboration and inclusion.

One of the pressing topics addressed was bridging digital divides, as emphasised by June Paris, a nurse researching nutrition in pregnant women and a business development expert. She underscored the challenges faced by Small Island Developing States (SIDS), noting their heightened vulnerability to digital marginalisation. Paris called on DCs to prioritise policies that combat polarisation and promote equitable internet access for underrepresented regions.

The conversation also delved into expanding the benefits of the digital economy. Muhammad Shabbir, a member of the Internet Society’s Accessibility Special Interest Group, the Pakistan ISOC chapter, and the Dynamic Coalition on Accessibility and Disability (DCAD), detailed the contributions of coalitions like the DC on Financial Inclusion, which advocates for accessible financial services, and the DC on Open Education, which focuses on enhancing learning opportunities. Shabbir also highlighted DCAD’s work towards digital inclusivity for persons with disabilities and the DC on Environment’s initiatives to address the environmental impacts of digitalisation.

Olivier Crepin-Leblond, founder and investor of the WAF lifestyle app and chair of the Dynamic Coalition on Core Internet Values, provided insights on fostering safe and inclusive digital spaces, stressing the pivotal work of DCs like the DC on Internet Rights and Principles, which champions human rights online, and the DC on Child Online Safety, which works to protect children in the digital realm. He highlighted the significant proportion of under-18 internet users, linking their rights to the UN Convention on the Rights of the Child.

Data governance and AI regulation also featured prominently. Tatevik Grigoryan, co-chair of the Dynamic Coalition on Equitable and Interoperable Data Governance and Internet Universality Indicators, discussed frameworks for responsible data management, while Yao Amevi Amnessinou Sossou, a research fellow for innovation and entrepreneurship, spotlighted AI-related initiatives. These included tackling gender biases through the DC on Gender and Internet Governance and exploring AI’s potential in healthcare and connected devices through other coalitions. Their contributions underscored the need for ethical and inclusive governance of emerging technologies.

The session’s open dialogue further enriched its value. Dr Rajendra Pratap Gupta, lead of three Dynamic Coalitions (Digital Economy, Digital Health, and Environment), highlighted the urgency of job creation and digital inclusion, while audience members raised critical points on data integrity and the transformative potential of gamification. Mark Carvell, co-moderator of the session, mentioned the WSIS+20 Review, adding a forward-looking perspective and inviting DCs to contribute their expertise to this landmark evaluation.

By showcasing the diverse initiatives of Dynamic Coalitions, the session reinforced their essential role in shaping global internet governance. The call for greater inclusion, tangible outcomes, and multistakeholder collaboration resonated throughout, marking a clear path forward for advancing the GDC’s objectives.


IGF 2024 panellists highlight infrastructure, literacy, and fair digital access

The Internet Governance Forum 2024 (IGF) brought together global stakeholders to discuss the implementation of the Global Digital Compact (GDC), aiming to address digital inequalities and foster cross-sector partnerships. The session spotlighted key challenges such as funding gaps, cultural adaptation of digital initiatives, and sustainability concerns in infrastructure development.

Isabel De Sola from the Office of the Tech Envoy emphasised stakeholder collaboration and revealed plans for an upcoming GDC implementation roadmap. Roy Eriksson, Finland’s Ambassador for Global Gateway, shared successes from AI strategy projects in African nations, illustrating how capacity-building partnerships can close technology gaps. Kevin Hernandez of the Universal Postal Union presented the Connect.Post programme, which aims to connect global post offices to digital networks by 2030.

Discussions also underscored energy efficiency and sustainability in digital infrastructure. Nandipha Ntshalbu highlighted the need to balance technological growth with environmental considerations. Data governance and cybersecurity frameworks were identified as critical, with Shamsher Mavin Chowdhury stressing the importance of inclusive frameworks to protect the interests of developing countries.

Innovative projects demonstrated local impact, such as Damilare Oydele’s Library Tracker for African libraries and Patricia Ainembabazi’s efforts promoting regional knowledge-sharing platforms. However, Alisa Heaver of the Dutch Ministry of Economic Affairs raised concerns about aligning GDC objectives with existing frameworks to avoid redundancy.

The IGF session concluded with a unified call for continued collaboration. Despite challenges, there was optimism that effective partnerships and targeted initiatives can ensure secure, inclusive, and sustainable digital progress worldwide.

Balancing regulation, innovation, and rights in the digital space

Global experts gathered at the Internet Governance Forum in Riyadh to explore collaborative solutions for addressing online harms, emphasising the importance of multistakeholder approaches. Jordan Hadfield of the FBI highlighted international partnerships like Interpol’s specialist groups and the Violent Crimes Against Children Task Force, while Australia’s Cyber Affairs Ambassador Brendan Dowling stressed government accountability measures, such as social media age restrictions.

Nighat Dad, representing the Oversight Board, called for culturally sensitive content moderation and independence in oversight to ensure balanced regulation. Protecting vulnerable groups, especially children and women, took centre stage.

Dowling shared Australia’s initiative to ban under-16s from social media, while Rajnesh Singh from the APNIC Foundation detailed programs empowering women in Southeast Asia’s tech sector. Nighat Dad highlighted how Meta’s Oversight Board advises on issues like the cultural implications of certain terms, such as the Arabic word ‘Shaheed.’

Parliamentarians Auhoud Al-Shehail (Member of Parliament from the Saudi Shura Council) and Jehad Abdulla Al Fadhel (Second Deputy Speaker of the Shura Council of Bahrain) advocated for intensified penalties against harmful practices and stronger educational campaigns to build digital literacy.

Balancing innovation with regulation was another focus, with Hadfield and Dowling urging proactive ‘safety by design’ principles in technology development. Singh emphasised fostering local innovation over dependency on foreign digital products, while Al-Shehail called for policies that evolve alongside technology.

Closing the digital divide, particularly between developed and developing nations, also emerged as a priority, with the president of Guinea’s parliament emphasising global digital solidarity. The discussion underscored the complexity of online harms and the need for flexible, inclusive solutions that respect diverse cultural contexts.

As Brendan Dowling noted, ‘Safety must be integrated at every stage,’ while Singh stressed, ‘We need creators, not just consumers.’ The consensus was clear – a safer, more equitable digital world can be achieved only through collaboration, innovation, and ongoing dialogue.
