European Organization for Nuclear Research

Acronym: CERN

Established: 1954

Address: 1211 Geneva 23, Switzerland

Website: https://www.cern.ch/

CERN is widely recognised as one of the world’s leading laboratories for particle physics. At CERN, physicists and engineers probe the fundamental structure of the universe. To do this, they use the world’s largest and most complex scientific instruments – particle accelerators and detectors. Technologies developed at CERN go on to have a significant impact through their applications in wider society.

Digital activities

CERN has had an important role in the history of computing and networks. The World Wide Web (WWW) was invented at CERN by Sir Tim Berners-Lee. The web was originally conceived and developed to meet the demand for automated information-sharing between scientists at universities and institutes around the world.

Grid computing was also developed at CERN, together with partners and with funding from the European Commission. The organisation also carries out activities in the areas of cybersecurity, big data, machine learning (ML), artificial intelligence (AI), data preservation, and quantum technology.

Digital policy issues

Artificial intelligence

AI-related projects at CERN are developed largely as part of CERN openlab activities.

Through CERN openlab, CERN collaborates with leading information and communications technology (ICT) companies and research institutes. The R&D projects carried out through CERN openlab address topics related to data acquisition, computing platforms, data storage architectures, computer provisioning and management, networks and communication, ML and data analytics, and quantum technologies. CERN researchers use ML techniques as part of their efforts to maximise the potential for discovery and optimise resource usage. ML is used, for instance, to improve the performance of the Large Hadron Collider (LHC) experiments in areas such as particle detection and managing computing resources. Going one step further, at the intersection of AI and quantum computing, CERN openlab is exploring the feasibility of using quantum algorithms to track the particles produced by collisions in the LHC, and is working on developing quantum algorithms to help optimise how data is distributed for storage in the Worldwide LHC Computing Grid (WLCG). This research is part of the CERN Quantum Technology Initiative (QTI) activities, launched in 2020 to shape CERN’s role in the next quantum revolution.
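As a toy illustration of the kind of ML task mentioned above, the sketch below trains a classifier to separate ‘signal’ from ‘background’ collision events using invented kinematic features. It is a hypothetical example, not CERN code, and the feature distributions are fabricated for demonstration only.

```python
# Hypothetical sketch: classifying simulated collision events as signal vs background.
# NOT CERN code; the features and distributions are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 10_000

# Toy kinematic features: transverse momentum, pseudorapidity, invariant mass.
background = rng.normal(loc=[30.0, 0.0, 90.0], scale=[15.0, 2.0, 25.0], size=(n, 3))
signal = rng.normal(loc=[55.0, 0.0, 125.0], scale=[15.0, 1.5, 10.0], size=(n, 3))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = background, 1 = signal

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = GradientBoostingClassifier().fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out events: {roc_auc_score(y_test, scores):.3f}")
```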

– CERN openlab: a public-private partnership in which CERN collaborates with ICT companies and other research organisations to accelerate the development of cutting-edge solutions for the research community, including ML.

– CERN QTI: a comprehensive R&D, academic, and knowledge-sharing initiative to exploit quantum advantage for high-energy physics and beyond. Given CERN’s increasing ICT and computing demands, as well as the significant national and international interest in quantum-technology activities, it aims to provide dedicated mechanisms for the exchange of both knowledge and innovation.

Cloud computing

Within its work, CERN refers to ‘cloud computing’ as ‘distributed computing’.

The scale and complexity of data from the LHC, the world’s largest particle accelerator, are unprecedented. These data need to be stored, easily retrieved, and analysed by physicists worldwide, which requires massive storage facilities, global networking, immense computing power, and funding. CERN did not initially have the computing or financial resources to process all of the data on site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The WLCG builds on the ideas of grid technology initially proposed in 1999 by Ian Foster and Carl Kesselman, and relies on a distributed computing infrastructure: data from the collisions of protons or heavy ions are distributed via the internet for processing at data centres worldwide. This approach, which also makes use of virtual machines, follows the same paradigm as cloud computing. Further CERN developments in the field of data processing are expected to continue to influence digital technologies.
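The core idea behind this grid approach, splitting a large dataset into independent chunks and processing them in parallel at many sites, can be sketched in miniature with local worker processes standing in for data centres. The event data and selection cut below are invented; this is a conceptual sketch, not WLCG software.

```python
# Toy sketch of the grid idea: split a dataset into chunks and process them in
# parallel "sites" (here, local worker processes). Invented data, not WLCG code.
from concurrent.futures import ProcessPoolExecutor
import random

def analyse_chunk(chunk):
    """Pretend analysis: count events passing a simple energy cut."""
    return sum(1 for energy in chunk if energy > 100.0)

def main():
    random.seed(1)
    # Toy event energies (GeV), exponentially distributed with mean 60.
    events = [random.expovariate(1 / 60.0) for _ in range(1_000_000)]

    # Split into chunks, as LHC data are split across data centres.
    chunk_size = 100_000
    chunks = [events[i:i + chunk_size] for i in range(0, len(events), chunk_size)]

    with ProcessPoolExecutor() as pool:
        passing = sum(pool.map(analyse_chunk, chunks))

    print(f"{passing} of {len(events)} events pass the cut")

if __name__ == "__main__":
    main()
```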

Telecommunication infrastructure

Within its work, CERN refers to ‘telecommunication infrastructure’ as ‘network infrastructure’.

In the 1970s, CERN developed CERNET, a lab-wide network to access mainframe computers in its data centre. This pioneering network eventually led CERN to become an early European adopter of Transmission Control Protocol/Internet Protocol (TCP/IP) for use in connecting systems on site. In 1989, CERN opened its first external TCP/IP connections and by 1990, CERN had become the largest internet site in Europe and was ready to host the first WWW server. Nowadays, in addition to the WLCG and its distributed computing infrastructure, CERN is also the host of the CERN Internet eXchange Point (CIXP), which optimises CERN’s internet connectivity and is also open to interested internet service providers (ISPs).

Digital standards

Within its work, CERN addresses ‘web standards’ as part of ‘open science’.

Ever since releasing the World Wide Web software under an open-source model in 1994, CERN has been a pioneer in the open-source field, supporting open-source hardware (with the CERN Open Hardware Licence), open access (with the Sponsoring Consortium for Open Access Publishing in Particle Physics, SCOAP3), and open data (with the CERN Open Data Portal). Several CERN technologies are developed with open science in mind, such as Indico, InvenioRDM, REANA, and Zenodo. Open-source software, such as CERNBox, CERN Tape Archive (CTA), EOS, File Transfer Service (FTS), Geant4, ROOT, Rucio, and the Service for Web-based ANalysis (SWAN), has been developed to handle, distribute, and analyse the huge volumes of data generated by the LHC experiments and is also made available to wider society.
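Several of these tools expose programmatic interfaces. As a small example, Zenodo offers a public REST search API; the sketch below queries it for records. The exact response fields used here (hits.hits and metadata.title) are assumptions based on typical Invenio-style responses, so treat this as a starting point rather than a definitive client.

```python
# Minimal sketch: querying Zenodo's public search API for records mentioning CERN.
# The response fields accessed below are assumptions based on typical responses.
import requests

resp = requests.get(
    "https://zenodo.org/api/records",
    params={"q": "CERN open data", "size": 5},
    timeout=30,
)
resp.raise_for_status()

for record in resp.json()["hits"]["hits"]:
    print(record["metadata"]["title"])
```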

Data governance

Within its work, CERN refers to ‘data governance’ as ‘data preservation’.

CERN manages vast amounts of data: not only scientific data, but also data in more common formats such as webpages, images and videos, documents, and more. For instance, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. As such, the organisation notes that it faces the challenge of preserving its digital memory. CERN also points to the fact that many of the tools used to preserve data generated by the LHC and other scientific projects are also suitable for preserving other types of data and are made available to wider society.
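For context, a sustained rate of one petabyte per day works out to roughly 11.6 GB/s on average, as this back-of-the-envelope calculation shows:

```python
# Back-of-the-envelope: average throughput implied by 1 PB of data per day.
petabyte = 10**15            # bytes (decimal petabyte)
seconds_per_day = 24 * 3600  # 86,400 s

rate_bytes_per_s = petabyte / seconds_per_day
print(f"{rate_bytes_per_s / 10**9:.1f} GB/s sustained")  # ~11.6 GB/s
```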

The CERN Open Data Policy for scientific experiments at the LHC is essential to make scientific research more reproducible, accessible, and collaborative. It reflects values that have been enshrined in the CERN Convention for more than 60 years and were reaffirmed in the European Strategy for Particle Physics (2020), and it aims to empower the LHC experiments to adopt a consistent approach towards the openness and preservation of experimental data, applying FAIR principles (findable, accessible, interoperable, reusable) to better share and reuse data.
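As a loose illustration of what FAIR principles mean at the level of a single dataset record, here is an invented metadata sketch. The field names, DOI, and values are hypothetical and are not taken from an actual CERN or LHC experiment schema.

```python
# Invented example of FAIR-style dataset metadata: a persistent identifier,
# descriptive fields, a standard licence, and an open access URL.
import json

record = {
    "identifier": "10.5281/zenodo.0000000",    # placeholder DOI (findable)
    "title": "Example LHC collision dataset",
    "creators": [{"name": "Example Collaboration"}],
    "description": "Toy record illustrating FAIR metadata fields.",
    "license": "CC0-1.0",                       # reusable
    "access_url": "https://opendata.cern.ch/",  # accessible
    "format": "ROOT",                           # interoperable via a documented format
}
print(json.dumps(record, indent=2))
```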

EOSC Future is an EU-funded project contributing to the establishment of the European Open Science Cloud (EOSC), which aims to provide a Web of FAIR Data and Services for science in Europe. The implementation of the EOSC is based on a long-term process of alignment and coordination pursued by the European Commission since 2015.

CERN joined the EOSC Association, the legal entity established to govern the EOSC, in 2020; the association has since grown to more than 250 members and observers.

Future of meetings

More information about ongoing and upcoming events can be found on the events page.

Social media channels

Facebook @cern

Instagram @cern

LinkedIn @cern

X @CERN

YouTube @CERN