International Electrotechnical Commission

The IEC is the world leader in preparing international standards for all electrical, electronic, and related technologies. A global, not-for-profit membership organisation, the IEC provides a neutral and independent institutional framework to around 170 countries, coordinating the work of some 30,000 experts. We administer four IEC Conformity Assessment Systems, representing the largest working multilateral agreement based on the one-time testing of products globally. The members of each system certify that devices, systems, installations, services, and people perform as required.

IEC international standards represent a global consensus of state-of-the-art know-how and expertise. Together with conformity assessment, they are foundational for international trade.

IEC standards incorporate the needs of many stakeholders in every participating country and form the basis for testing and certification. Experts come from both developed and developing countries. Each member country, with all its stakeholders represented through an IEC National Committee, has one vote and a say in what goes into an IEC international standard.

Our work is used to verify the safety, performance, and interoperability of electric and electronic devices and systems such as mobile phones, refrigerators, office and medical equipment, and electricity generation. It also helps accelerate digital transformation, artificial intelligence (AI), and virtual reality applications; protects information technology (IT) and critical infrastructure systems from cyberattacks; and increases the safety of people and the environment.

Digital activities 

The IEC works to ensure that its activities have a global reach to meet all the challenges of digital transformation worldwide. The organisation covers an array of digital policy issues. IEC international standards and conformity assessment play a crucial role in shaping global AI and digital policies by providing a structured, collaborative, and consensus-driven framework that addresses technical, ethical, and governance challenges.

Digital policy issues

Artificial intelligence

AI applications are driving digital transformation across diverse industries, including energy, healthcare, smart manufacturing, transport, and other strategic sectors that rely on IEC Standards and Conformity Assessment Systems. AI technologies allow insights and analytics that go far beyond the capabilities of legacy analytic systems.

For example, the digital transformation of the grid enables increased automation, making it more efficient and able to seamlessly integrate fluctuating renewable energy sources. IEC standards pave the way for the use of a variety of digital technologies relating to intelligent energy. They address issues such as the integration of renewable energies into the electrical network, as well as increased automation.

A joint IEC and ISO technical committee on AI, ISO/IEC JTC 1/SC 42, brings together technology experts, as well as ethicists, lawyers, social scientists, and others to develop generic and foundational standards (horizontal standards). IEC experts focus on sector-specific needs (vertical standards) and conformity assessment.

JTC 1/SC 42 addresses concerns about the use and application of AI technologies. For example, data quality standards for machine learning (ML) and analytics are crucial for helping to ensure that applied technologies produce useful insights and eliminate faulty features.

Governance standards in AI and the big data analytics business process framework address how the technologies can be governed and overseen from a management perspective. International standards in the areas of trustworthiness, ethics, and societal concerns will help ensure responsible deployment.

Quantum computing

The joint ISO and IEC technical committee for quantum technologies, ISO/IEC JTC 3, is working on standards for all aspects of quantum technology, including computing, metrology, sources, detectors, communications, and fundamental quantum technologies.

Infrastructure

The IEC develops standards for many of the technologies that support digital transformation. Examples include fibre optic cables, sensors, semiconductors, and cloud and edge computing.

Cloud computing

The joint ISO/IEC technical committee on cloud computing and distributed platforms (ISO/IEC JTC 1/SC 38) prepares standards for cloud computing, including distributed platforms and edge devices. The standards cover key requirements relating to data storage and recovery.

Network security and critical infrastructure

The IEC develops cybersecurity standards and conformity assessment for IT and operational technology (OT). Cybersecurity is often understood only in terms of IT, which leaves critical infrastructure, such as power utilities, transport systems, manufacturing plants and hospitals, vulnerable to attacks.

Digital tools

The IEC has developed a number of online tools and services designed to support users in their daily work with standards and conformity assessment.

Find out more
IEC website
IEC news and blog
IEC e-tech

Social media channels

LinkedIn @IECStandards

Facebook @InternationalElectrotechnicalCommission

YouTube @IECstandards

European Organization for Nuclear Research

CERN is widely recognised as one of the world’s leading laboratories for particle physics. At CERN, physicists and engineers probe the fundamental structure of matter that makes up our universe. To do this, they use the world’s largest and most complex scientific instruments – particle accelerators and detectors. Technologies developed at CERN go on to have a significant impact through their applications in wider society.

Digital activities

CERN has had an important role in the history of computing and networks. The World Wide Web (WWW) was invented at CERN by Sir Tim Berners-Lee. The web was originally conceived and developed to meet the demand for automated information-sharing between scientists at universities and institutes around the world.

Grid computing, the precursor of modern cloud computing, was also developed at CERN with partners across a worldwide community and with funding from the European Commission. Today, the Organisation carries out pioneering activities in the areas of cybersecurity, big data processing, long-term data preservation, deep learning (DL) and artificial intelligence (AI), and quantum technologies.

Digital policy issues

Artificial intelligence

AI-related projects are developed as part of CERN openlab activities.

Through CERN openlab, European Commission-funded projects, and collaborations with other international organisations, CERN works with leading information and communications technology (ICT) companies and research institutes. The R&D projects carried out through these public-private partnerships address topics related to ultra-fast data acquisition, accelerated computing platforms, data storage architectures, computer provisioning and management, networks and communication, deep learning and data analytics, and quantum technologies. CERN researchers use machine learning (ML) techniques as part of their efforts to maximise the discovery potential and optimise resource usage. ML and DL are used, for instance, to improve the performance of the Large Hadron Collider (LHC) experiments in areas such as particle detection and managing computing resources.

Going one step further, at the intersection of AI and quantum computing, the CERN Quantum Technology Initiative (QTI), launched in 2020 to shape CERN’s role in the next quantum revolution, is exploring the feasibility of using quantum algorithms to track the particles produced by collisions in the LHC, and is working on developing quantum algorithms to help optimise how data is distributed for storage in the Worldwide LHC Computing Grid (WLCG). In 2024, CERN launched the Open Quantum Institute (OQI), a three-year pilot programme that will help unleash the full power of quantum computing for the benefit of all.

  • CERN openlab: a public-private partnership in which CERN collaborates with ICT companies and other research organisations to accelerate the development of cutting-edge solutions for the research community, including ML.
  • CERN QTI: a comprehensive R&D, academic, and knowledge-sharing initiative to exploit the quantum advantage for high-energy physics and beyond. Given CERN’s increasing ICT and computing demands, as well as the significant national and international interest in quantum-technology activities, it aims to provide dedicated mechanisms for the exchange of both knowledge and innovation.
  • CERN OQI: Following a successful one-year incubation period led by the Geneva Science and Diplomacy Anticipator (GESDA), the three-year CERN-based pilot was launched in March 2024. Proposed, designed, and incubated through GESDA, in collaboration with some 180 experts from all over the world, the OQI is a multilateral science diplomacy initiative, uniting academia, technology companies, the private sector, the diplomatic community, philanthropy organisations, and global citizens in a joint effort towards more open and inclusive quantum computing. By facilitating equal access to cutting-edge nascent technologies and serving as the societal arm of QTI, the OQI seeks to accelerate the potential of quantum computing for all of society and to support the development of concrete quantum solutions aimed at achieving the UN Sustainable Development Goals (SDGs).

Next Generation Triggers: The Next Generation Triggers project, or NextGen, started in January 2024 as a collaboration between CERN (the Experimental Physics, Theoretical Physics and Information Technology Departments) and the ATLAS and CMS experiments. The key objective of the five-year NextGen project is to get more physics information out of the data from the High-Luminosity LHC (HL-LHC). The hope is to uncover as-yet-unseen phenomena by more efficiently selecting interesting physics events while rejecting background noise. Scientists will make use of neural network optimisation, quantum-inspired algorithms, high-performance computing, and field-programmable gate array (FPGA) techniques to improve the theoretical modelling and optimise their tools in the search for ultra-rare events.

Cloud computing

Within its work, CERN refers to ‘cloud computing’ as ‘distributed computing’.

The scale and complexity of data from the LHC, the world’s largest particle accelerator, is unprecedented. This data needs to be stored, easily retrieved, and analysed by physicists worldwide. This requires massive storage facilities, global networking, immense computing power, and funding. CERN initially did not have the computing or financial resources to crunch all of the data on-site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The WLCG builds on the ideas of grid technology initially proposed by Ian Foster and Carl Kesselman in 1999. The WLCG relies on a distributed computing infrastructure, as data from the collisions of protons or heavy ions is distributed via the internet for processing at data centres worldwide. The approach of using virtual machines was a precursor to the same paradigm used today in cloud computing. Today, CERN is developing new grid and cloud technologies in particular for large-scale AI deployment. It is expected that CERN’s further developments in the field of data processing will continue to influence digital technologies.

CERN has two data centres – one in Meyrin and a second one in Prévessin. The average amount of collision data recorded on disk by the LHC experiments is currently a little under 3 petabytes (PB) per day, which is almost equal to what was recorded in one month during Run 1. 

All data produced at CERN still passes through the Meyrin Data Centre, which is the only facility connected to all experimental sites via ultra-fast optical fibre networks. 

The WLCG consists of around 170 centres distributed across 40 countries. In 2025, the WLCG celebrates its first 20 years.

Telecommunication infrastructure

Within its work, CERN refers to ‘telecommunication infrastructure’ as ‘network infrastructure’.

In the 1970s, CERN developed CERNET, a lab-wide network to access mainframe computers in its data centre. This pioneering network eventually led to CERN becoming an early European adopter of Transmission Control Protocol/Internet Protocol (TCP/IP) for use in connecting systems on site. In 1989, CERN opened its first external TCP/IP connections and by 1990, CERN had become the largest internet site in Europe and was ready to host the first WWW server. Nowadays, in addition to the WLCG and its distributed computing infrastructure, CERN is also the host of the CERN Internet eXchange Point (CIXP), which optimises CERN’s internet connectivity and is also open to interested internet service providers (ISPs).

Through the CERN Quantum Technology Initiative, CERN is actively working to deliver more precise frequency signals from national metrology institutes to CERN experiments and beyond, and to improve the reliability of future quantum networks.

Digital standards

Within its work, CERN addresses ‘web standards’ as ‘open science’.

Ever since releasing the World Wide Web software under an open-source model in 1994, CERN has been a pioneer in the open-source field, supporting open-source hardware (with the CERN Open Hardware Licence), open access (with the Sponsoring Consortium for Open Access Publishing in Particle Physics, SCOAP3), and open data (with the CERN Open Data Portal). Several CERN technologies are being developed with open science in mind, such as Indico, InvenioRDM, REANA, and Zenodo. Open-source software, such as CERNBox, CERN Tape Archive (CTA), EOS, File Transfer Service (FTS), Geant4, ROOT, Rucio, and Service for Web-Based Analysis (SWAN), has been developed to handle, distribute, and analyse the huge volumes of data generated by the LHC experiments and is also made available to the wider society.

Digital tools

Data governance

Within its work, CERN refers to ‘data governance’ as ‘data preservation’.

CERN manages vast amounts of data; not only scientific data, but also data in more common formats such as webpages, images and videos, documents, and more. For instance, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. As such, the organisation notes that it faces the challenge of preserving its digital memory. CERN also points to the fact that many of the tools that are used to preserve data generated by the LHC and other scientific projects are also suitable for preserving other types of data and are made available to wider society.

The CERN Open Data Policy for scientific experiments at the LHC is essential to make scientific research more reproducible, accessible, and collaborative. It reflects the values enshrined in the CERN Convention for more than 60 years and reaffirmed in the European Strategy for Particle Physics (2020), aiming to empower the LHC experiments to adopt a consistent approach towards openness and preservation of experimental data (applying the FAIR principles of findable, accessible, interoperable, and reusable data to better share and reuse data).

EOSC Future is an EU-funded project contributing to the establishment of the European Open Science Cloud (EOSC) to provide a Web of FAIR Data and Services for science in Europe. The implementation of EOSC is based on the long-term process of alignment and coordination pursued by the Commission since 2015.

CERN joined the EOSC Association, the legal entity established to govern EOSC, upon its formation in 2020. The association has since grown to more than 250 members and observers.

Social media channels

Facebook @cern

Instagram @cern

LinkedIn @cern

X @CERN

YouTube @CERN