European Organization for Nuclear Research

Digital Watch Atlas 2.0 member badge

Acronym: CERN

Established: 1954

Address: Esplanade des Particules 1, 1217 Meyrin, Switzerland

Website: https://www.cern.ch/

CERN is widely recognised as one of the world’s leading laboratories for particle physics. At CERN, physicists and engineers probe the fundamental structure of matter that makes up our universe. To do this, they use the world’s largest and most complex scientific instruments – particle accelerators and detectors. Technologies developed at CERN go on to have a significant impact through their applications in wider society.

Digital activities

CERN has had an important role in the history of computing and networks. The World Wide Web (WWW) was invented at CERN by Sir Tim Berners-Lee. The web was originally conceived and developed to meet the demand for automated information-sharing between scientists at universities and institutes around the world.

Grid computing, the precursor of modern cloud computing, was also developed at CERN with partners across a worldwide community and with funding from the European Commission. Today, the Organisation carries out pioneering activities in the areas of cybersecurity, big data processing, long-term data preservation, deep learning (DL) and artificial intelligence (AI), and quantum technologies.

Digital policy issues

Artificial intelligence

AI-related projects at CERN are developed mainly as part of the CERN openlab activities.

Through CERN openlab, European Commission-funded projects and collaborations with other international organisations, CERN collaborates with leading information and communications technology (ICT) companies and research institutes. The R&D projects carried out through these public-private partnerships address topics related to ultra-fast data acquisition, accelerated computing platforms, data storage architectures, computer provisioning and management, networks and communication, deep learning and data analytics, and quantum technologies. CERN researchers use machine learning (ML) techniques as part of their efforts to maximise discovery potential and optimise resource usage. ML and DL are used, for instance, to improve the performance of the Large Hadron Collider (LHC) experiments in areas such as particle detection and the management of computing resources. Going one step further, at the intersection of AI and quantum computing, the CERN Quantum Technology Initiative (QTI), launched in 2020 to shape CERN’s role in the next quantum revolution, is exploring the feasibility of using quantum algorithms to track the particles produced by collisions in the LHC, and is working on quantum algorithms to help optimise how data is distributed for storage in the Worldwide LHC Computing Grid (WLCG). In 2024, CERN launched the Open Quantum Institute (OQI), a three-year pilot programme intended to help unleash the full power of quantum computing for the benefit of all.

  • CERN openlab: a public-private partnership in which CERN collaborates with ICT companies and other research organisations to accelerate the development of cutting-edge solutions for the research community, including ML.
  • CERN QTI: a comprehensive R&D, academic, and knowledge-sharing initiative to exploit the quantum advantage for high-energy physics and beyond. Given CERN’s increasing ICT and computing demands, as well as the significant national and international interest in quantum-technology activities, it aims to provide dedicated mechanisms for the exchange of both knowledge and innovation.
  • CERN OQI: Following a successful one-year incubation period led by the Geneva Science and Diplomacy Anticipator (GESDA), the three-year CERN-based pilot was launched in March 2024. Proposed, designed, and incubated through GESDA, in collaboration with some 180 experts from all over the world, the OQI is a multilateral science diplomacy initiative, uniting academia, technology companies, the private sector, the diplomatic community, philanthropic organisations, and global citizens in a joint effort towards more open and inclusive quantum computing. By facilitating equal access to cutting-edge nascent technologies and serving as the societal arm of QTI, the OQI seeks to accelerate the potential of quantum computing for society as a whole and to support the development of concrete quantum solutions aimed at achieving the UN sustainable development goals (SDGs).

Next Generation Triggers: The Next Generation Triggers project, or NextGen, started in January 2024 as a collaboration between CERN (the Experimental Physics, Theoretical Physics and Information Technology Departments) and the ATLAS and CMS experiments. The key objective of the five-year NextGen project is to get more physics information out of the High-Luminosity LHC (HL-LHC) data. The hope is to uncover as-yet-unseen phenomena by more efficiently selecting interesting physics events while rejecting background noise. Scientists will make use of neural network optimisation, quantum-inspired algorithms, high-performance computing and field-programmable gate array (FPGA) techniques to improve the theoretical modelling and optimise their tools in the search for ultra-rare events.
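The idea of ML-based event selection described above can be illustrated with a toy sketch: a simple classifier is trained to accept signal-like events and reject background. Everything here is invented for illustration (the two features, the synthetic data, the logistic-regression model); real trigger systems use far richer detector inputs and far more sophisticated networks.

```python
# Toy sketch of ML-based event selection (illustrative only; not CERN code).
import math
import random

def sigmoid(z):
    # Clamp to avoid math.exp overflow on extreme inputs.
    if z < -60.0:
        return 0.0
    if z > 60.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def train(events, labels, lr=0.1, epochs=200):
    """Fit a linear decision boundary (logistic regression) by gradient descent."""
    w = [0.0] * len(events[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(events, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log-loss for this event
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def accept(event, w, b, threshold=0.5):
    """Trigger decision: keep the event if P(signal) exceeds the threshold."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, event)) + b) >= threshold

random.seed(0)
# Synthetic two-feature events (hypothetical features, e.g. momentum and
# isolation): background clusters near (1, 1), signal near (3, 3).
background = [(random.gauss(1.0, 0.5), random.gauss(1.0, 0.5)) for _ in range(200)]
signal = [(random.gauss(3.0, 0.5), random.gauss(3.0, 0.5)) for _ in range(200)]
events = background + signal
labels = [0] * 200 + [1] * 200

w, b = train(events, labels)
kept_signal = sum(accept(e, w, b) for e in signal)
rejected_bg = sum(not accept(e, w, b) for e in background)
print(f"signal kept: {kept_signal}/200, background rejected: {rejected_bg}/200")
```

Because the two synthetic populations are well separated, even this minimal model keeps almost all signal while rejecting almost all background; the real challenge at the LHC is doing this at extreme data rates against overwhelming background.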

Cloud computing

Within its work, CERN refers to ‘cloud computing’ as ‘distributed computing’.

The scale and complexity of data from the LHC, the world’s largest particle accelerator, is unprecedented. This data needs to be stored, easily retrieved, and analysed by physicists worldwide. This requires massive storage facilities, global networking, immense computing power, and funding. CERN initially did not have the computing or financial resources to crunch all of the data on-site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The WLCG builds on the ideas of grid technology first proposed by Ian Foster and Carl Kesselman in 1999. The WLCG relies on a distributed computing infrastructure, as data from the collisions of protons or heavy ions are distributed via the internet for processing at data centres worldwide. The approach of using virtual machines was a precursor of the paradigm used today in cloud computing. Today, CERN is developing new grid and cloud technologies, in particular for large-scale AI deployment. It is expected that CERN’s further developments in the field of data processing will continue to influence digital technologies.
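The grid approach described above can be sketched in miniature: a dataset is split into chunks, each chunk is processed at an independent "site", and the partial results are merged centrally. The site names and workload below are invented for illustration; the real WLCG schedules physics jobs across roughly 170 computing centres with dedicated middleware.

```python
# Toy sketch of the grid-computing idea (illustrative only; not WLCG code).
from concurrent.futures import ThreadPoolExecutor

SITES = ["CERN-Meyrin", "Site-A", "Site-B", "Site-C"]  # hypothetical sites

def process_chunk(site, chunk):
    """Stand-in for a physics job: count readings above a threshold."""
    return site, sum(1 for value in chunk if value > 0.9)

# Fake detector readings, split into one chunk per site.
data = [i / 1000 for i in range(4000)]
chunks = [data[i::len(SITES)] for i in range(len(SITES))]

# Process all chunks concurrently, then merge the partial results.
with ThreadPoolExecutor(max_workers=len(SITES)) as pool:
    results = list(pool.map(process_chunk, SITES, chunks))

total_hits = sum(count for _, count in results)
print(total_hits)  # 3099 readings above the threshold, merged from all sites
```

The essential point mirrors the WLCG design: no single site needs to hold or process the whole dataset, and the merged result is identical to what a single sufficiently large machine would produce.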

CERN has two data centres – one in Meyrin and a second one in Prévessin. The average amount of collision data recorded on disk by the LHC experiments is currently a little under 3 petabytes (PB) per day, which is almost equal to what was recorded in one month during Run 1. 

All data produced at CERN still passes through the Meyrin Data Centre, which is the only facility connected to all experimental sites via ultra-fast optical fibre networks. 

The Worldwide LHC Computing Grid (WLCG) consists of around 170 centres distributed across 40 countries. In 2025, the WLCG celebrates its first 20 years.

Telecommunication infrastructure

Within its work, CERN refers to ‘telecommunication infrastructure’ as ‘network infrastructure’.

In the 1970s, CERN developed CERNET, a lab-wide network to access mainframe computers in its data centre. This pioneering network eventually led to CERN becoming an early European adopter of Transmission Control Protocol/Internet Protocol (TCP/IP) for use in connecting systems on site. In 1989, CERN opened its first external TCP/IP connections and by 1990, CERN had become the largest internet site in Europe and was ready to host the first WWW server. Nowadays, in addition to the WLCG and its distributed computing infrastructure, CERN is also the host of the CERN Internet eXchange Point (CIXP), which optimises CERN’s internet connectivity and is also open to interested internet service providers (ISPs).

Through the CERN Quantum Technology Initiative, CERN is actively working to deliver more precise frequency signals from national metrology institutes to CERN experiments and beyond, and to improve the reliability of future quantum networks.

Digital standards

Within its work, CERN addresses ‘web standards’ as part of ‘open science’.

Ever since releasing the World Wide Web software under an open-source model in 1994, CERN has been a pioneer in the open-source field, supporting open-source hardware (with the CERN Open Hardware Licence), open access (with the Sponsoring Consortium for Open Access Publishing in Particle Physics, SCOAP3) and open data (with the CERN Open Data Portal). Several CERN technologies are being developed with open science in mind, such as Indico, InvenioRDM, REANA, and Zenodo. Open-source software, such as CERNBox, CERN Tape Archive (CTA), EOS, File Transfer Service (FTS), Geant4, ROOT, Rucio, and Service for Web-Based Analysis (SWAN), has been developed to handle, distribute, and analyse the huge volumes of data generated by the LHC experiments and is also made available to wider society.

Digital tools

Data governance

Within its work, CERN refers to ‘data governance’ as ‘data preservation’.

CERN manages vast amounts of data; not only scientific data, but also data in more common formats such as webpages, images and videos, documents, and more. For instance, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. As such, the organisation notes that it faces the challenge of preserving its digital memory. CERN also points to the fact that many of the tools that are used to preserve data generated by the LHC and other scientific projects are also suitable for preserving other types of data and are made available to wider society.

The CERN Open Data Policy for scientific experiments at the LHC is essential to make scientific research more reproducible, accessible, and collaborative. It reflects the values enshrined in the CERN Convention for more than 60 years and reaffirmed in the European Strategy for Particle Physics (2020), and aims to empower the LHC experiments to adopt a consistent approach towards openness and the preservation of experimental data, applying the FAIR principles (findable, accessible, interoperable, reusable) to better share and reuse data.

EOSC Future is an EU-funded project contributing to the establishment of the European Open Science Cloud (EOSC) to provide a Web of FAIR Data and Services for science in Europe. The implementation of EOSC is based on the long-term process of alignment and coordination pursued by the European Commission since 2015.

CERN joined the EOSC Association, the legal entity established to govern EOSC, in 2020; the association has since grown to more than 250 members and observers.

Social media channels

Facebook @cern

Instagram @cern

LinkedIn @cern

X @CERN

YouTube @CERN