CERN Computing

Technology, Information and Internet

A central role in the fulfillment of CERN’s mission

About us

CERN is one of the most demanding computing environments in the research world. The World Wide Web was originally conceived and developed at CERN to meet the demand for automated information-sharing between scientists in universities and institutes around the world. From software development to data processing and storage, networks, support for the LHC and non-LHC experimental programmes, automation and controls, as well as services for the accelerator complex and for the whole laboratory and its users, computing is at the heart of CERN’s infrastructure.

The Worldwide LHC Computing Grid (WLCG) – a distributed computing infrastructure arranged in tiers – gives a community of thousands of physicists near real-time access to LHC data. The CERN data centre is at the heart of the WLCG, acting as the first point of contact between experimental data from the LHC and the grid.

Through CERN openlab, a unique public-private partnership, CERN collaborates with leading ICT companies and other research organisations to accelerate the development of cutting-edge ICT solutions for the research community. CERN has also established a medium- and long-term roadmap and research programme, in collaboration with the high-energy physics and quantum-technology research communities, via the CERN Quantum Technology Initiative (QTI).

Website
https://home.cern/
Industry
Technology, Information and Internet
Company size
201-500 employees

Updates

  • CERN Computing reposted this

    CERN openlab

    Hear from Ilaria Luise, PhD, CERN openlab Senior Research Fellow, about how #CERNopenlab contributes to the AtmoRep/EMP2 project and about the impact of this collaboration between CERN, the European Centre for Medium-Range Weather Forecasts and Forschungszentrum Jülich 🚀 The project aims to develop a proof-of-concept for a machine learning-based #DigitalTwin of the atmosphere with environmental applications 🌳 The atmosphere and its dynamics significantly affect human well-being, from agricultural decision-making to policymaking and the renewable-energy sector. Accurate and equitable modelling of atmospheric dynamics is critically important for evidence-based decision-making that improves human well-being and minimises adverse impacts for current and future generations. This project is part of CERN openlab's commitment to promoting and encouraging R&D projects with a societal and environmental impact 🎯 Discover more about the project 👇

  • CERN Computing

    How many #PB of collision #data have the #LHC experiments recorded on disk since the beginning of Run 3? The first correct reply gets an exclusive CERN IT lanyard 😍
    ---------
    HINT: At CERN we are used to dealing with a deluge of data being sent to the data centre. However, the numerical value behind the word “deluge” has evolved significantly over the years. Today, the LHC experiments record a bit less than 3 PB of collision data on disk per day, almost as much as was recorded in a whole month during Run 1. To date (1 August 2024), the current 2024 proton run has recorded almost as much data as the whole of Run 2 (2015-2018). The forecast for future runs, and even more so for future accelerators, is a crescendo whose trend is already very clear, even if the details will evolve over the years (a rough back-of-the-envelope comparison of these figures is sketched below). The key mission of CERN's data management strategy is to keep the data safe, provide fast access to it, and keep its footprint as small as possible so that it occupies the minimum amount of space.

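    A rough back-of-the-envelope sketch in Python of the comparison above, using only the approximate figures quoted in this post (about 3 PB per day now versus roughly the same amount per month in Run 1); the 150-day run length is an assumption for illustration, not an official figure:

        # Back-of-the-envelope comparison of LHC data-taking rates,
        # based only on the approximate figures quoted in the post above.
        run3_daily_pb = 3              # ~3 PB of collision data recorded per day today
        run1_monthly_pb = 3            # roughly what Run 1 recorded in a whole month

        # How much more data per month is recorded now than during Run 1?
        factor = (run3_daily_pb * 30) / run1_monthly_pb
        print(f"Run 3 records roughly {factor:.0f}x more data per month than Run 1")

        # Projected volume for an assumed 150-day proton run at today's rate
        run_days = 150                 # assumed run length, for illustration only
        print(f"A {run_days}-day run at ~3 PB/day gives ~{run3_daily_pb * run_days} PB on disk")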
  • CERN Computing

    The recent #OpenInfra User Group Meeting, hosted at CERN, welcomed more than 80 participants from 28 different organisations. The event served as a hub to discuss common areas of interest in the field of open computing infrastructures, share experiences and foster collaboration among representatives from academia, research institutions, #cloud providers, #telecommunications, financial companies and other IT firms. The meeting was also an opportunity to highlight CERN's commitment to the various open-source communities it contributes to, and to confirm the importance of keeping them alive and energised. CERN runs its own #OpenStack private cloud (a minimal illustration of working with an OpenStack cloud is sketched below). To learn more about why it was adopted, how it started and where it is now, you can go directly to the presentation given by Jose Castro Leon at the meeting: 👉 https://lnkd.in/ggmu3GPa Further info about the meeting, with the full programme and links to all presentations: 👉 https://lnkd.in/g75Z8qRm A nice summary of the meeting: 👉 https://lnkd.in/gQQYGMdy (With Enrica Porcari, Arne Wiebalck, Luis Fernández Álvarez and many others. Image courtesy of the organisers)

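    CERN runs its own OpenStack private cloud, as mentioned above. As a hedged illustration only (not CERN's actual configuration), here is a minimal sketch of how any OpenStack cloud can be driven from Python with the standard openstacksdk client; the cloud name "my-cloud" is a placeholder that would have to exist in your clouds.yaml:

        # Minimal sketch of talking to an OpenStack cloud with openstacksdk.
        # Generic illustration only -- not CERN's actual setup.
        import openstack

        # Credentials and endpoints are read from clouds.yaml; "my-cloud" is a placeholder.
        conn = openstack.connect(cloud="my-cloud")

        # List the virtual machines visible to this project
        for server in conn.compute.servers():
            print(server.name, server.status)

        # List the images offered by the cloud
        for image in conn.image.images():
            print(image.name)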
  • CERN Computing

    The #software packages behind some of the greatest #discoveries in high-energy physics were developed by René Brun at CERN. In the 70s, 80s and 90s, HBOOK, HPLOT, PAW and ZEBRA were the software packages (mainly written in Fortran) used by physicists for data analysis and event visualisation; GEANT was the simulation software. These days, physicists around the world use ROOT (written in C++) for their analysis, again created by René in the late 90s, and GEANT4 for their simulations. The list of breakthroughs made by physicists using René's software packages is amazing:
    - 1983, discovery of the W and Z particles at CERN: HBOOK and HPLOT
    - 1995, discovery of the top quark at Fermilab: PAW
    - 90s, confirmation of the existence of three families of neutrinos at LEP (CERN): PAW
    - 1998, atmospheric neutrino oscillations at Super-Kamiokande: PAW/ROOT
    - 2012, discovery of the #Higgs boson at CERN: ROOT
    ROOT has also been used by scientists involved in astrophysics experiments and by some physicists working in finance. GEANT has become GEANT4, which is used extensively by scientists around the world. All software developed at CERN is open source and designed to be shared. A minimal ROOT example is sketched below.
    👉 Find out more about ROOT: https://root.cern/
    👉 Find out more about GEANT4: https://lnkd.in/eZ2Mmc3p
    🎥 In the short clip, René shares how he imagines the future of software tools for physics analysis.
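    As a hedged illustration of the kind of everyday analysis task ROOT is used for, here is a minimal PyROOT sketch that fills a histogram with toy data and fits it; it assumes a ROOT installation with Python bindings, and the numbers are purely illustrative:

        # Minimal PyROOT sketch: fill a histogram with toy data and fit it.
        # Requires ROOT with Python bindings; the data are random numbers.
        import ROOT

        h = ROOT.TH1F("h", "Toy mass distribution;m [GeV];Events", 100, 0.0, 10.0)

        rng = ROOT.TRandom3(42)
        for _ in range(10000):
            h.Fill(rng.Gaus(5.0, 1.0))   # Gaussian-distributed toy data

        h.Fit("gaus")                    # fit a Gaussian to the histogram

        c = ROOT.TCanvas("c", "c", 800, 600)
        h.Draw()
        c.SaveAs("toy_fit.png")          # write the plot to disk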

  • CERN Computing

    Retrieving and preserving access to #data from experiments that ran in the 90s is "Digital Archaeology", a complex task carried out by passionate experts in the CERN IT department. Unlike letters carved into the Rosetta Stone, digital data is not written on a virtually immutable medium. Just a few years after it is written, its format becomes obsolete, the readout and analysis tools no longer run on current computers, and the visualisation code no longer works. But the data can still contain interesting scientific information that should remain available to future generations of scientists. Thanks to the work of Ulrich Schwickerath, whether you're a researcher, teacher, student or just an interested non-physicist, you can now directly access electron-positron annihilation data from DELPHI, one of the four detectors at #LEP, CERN's previous big accelerator. 👉 Find out more: https://lnkd.in/evanyceF

  • CERN Computing

    🚀 Guess what it is! In the colour photo, Ben Segal is sharing his recollections about RIOS. The black-and-white photo shows what a RIOS looked like. 👉 What does RIOS stand for, and what were RIOS used for in the 70s?
    --------
    HINT: In the early 70s, ten RIOS were installed on the CERN campus. At that time, the bulk of the computing power needed by CERN was provided by a CDC 7600 (Control Data Corporation) mainframe, one of the first transistorised computers installed at CERN. Thanks to the CDC 7600, spark-chamber films were automatically scanned and measured in record time.

  • CERN Computing

    🚀 Guess what it is! Elena Gazzarrini is holding an object that was used in the late 70s for CERN's own internal network. 👉 What object is that?
    --------
    HINT: In the late 70s, more than 10 years before the Internet became widespread in Europe, CERN created its own internal network, one of the world's fastest networks at the time. Once fully deployed, CERNET interconnected the central IBM and CDC mainframes to switching-node computers, the Modcomp Classics (represented by the small squares on the map, 1980), which were in turn connected to over 100 mini-computers (represented by the small dots). Data travelled over thick twisted-pair copper cables of at most 1 km; longer cables would have caused data loss because the resistance of the copper wires was too high. All the CERNET protocols were developed independently, so it was hard to scale the data transfers. These home-made objects were big enough for people to print their faces and poems on them. The faces on this board are those of Anthonioz Blanc, J. Joosten, R. Pieters and S. Brobecker. The poem reads:
    The Dutchman calls it water
    The Frenchman speaks of l'eau
    The Belgian, who knows both languages,
    Speaks of waterloo
    At that time, the recording of raw data from physics experiments on magnetic tape was still done on minicomputers local to each experiment. The first CERNET user was the European Muon Collaboration experiment, NA2, which used CERNET regularly from March 1978 to process data on the IBM 370 mainframe.
    ---------------
    Be the first to give the right answer in the comments!

  • CERN Computing

    CERN openlab is the public-private partnership created at CERN in 2001 to support innovative scientific-computing projects. Through its cutting-edge activities, CERN openlab involves leading #ICT companies worldwide as well as other research centres. Every summer, CERN openlab offers students the unique opportunity to improve their computing skills at CERN through direct interaction with leading experts in computing for science. 👉 Find out more about CERN openlab: https://openlab.cern/

    CERN openlab

    The 2024 #CERNopenlab Summer Student Programme has started! 🌞 Education is central to CERN openlab's mission to promote the next generation of #Computing experts. 30 students of 21 nationalities, representing the universities below, were selected from over 6,660 applicants! The students will work on cutting-edge projects with our collaborators for nine weeks and gain hands-on experience with the latest computing technologies. Follow us and stay tuned for all the news about the programme! Read more about the 2024 CERN openlab Summer Student Programme and the selected students here: https://lnkd.in/dKD6qK2Z #Technology #Innovation #SummerStudentProgramme
    University of Warsaw
    University of Bergen
    Yale University
    National Engineering School of Tunis (ENIT), University of Tunis El Manar
    Wroclaw University of Science and Technology
    Jaypee Institute Of Information Technology
    University of Valencia
    Universitat Politècnica de Catalunya
    The American University in Cairo
    Charles III University of Madrid
    Berea College
    Ukrainian Catholic University
    Guru Ghasidas Vishwavidyalaya, Bilaspur
    University POLITEHNICA of Bucharest
    Università degli Studi di Milano
    TU Dortmund University
    Ecole nationale Superieure d'Informatique (ESI)
    Universidad de Málaga
    The University of Edinburgh
    Cracow University of Technology
    Pontificia Universidad Católica de Chile
    AGH University of Krakow
    Polytech'Montpellier
    Makerere University
    Dokuz Eylul University
    Universidad de Granada
    UCL
    Universidad Nacional de Colombia
    Universidade Estadual de Campinas

  • CERN Computing

    The #OpenDataPortal is the platform where scientists, as well as teachers, students and educators, can use openly available #LHC data for their activities. A few months ago, the CMS Collaboration released over 70 TB of 13 TeV collision data and 830 TB of corresponding simulations; more recently, the ATLAS Collaboration has released two years of proton-proton collision data. Thanks to the CERN Open Data Portal, the global scientific community and physics enthusiasts worldwide can explore more than 5 petabytes of #opendata from particle physics (a hedged sketch of programmatic access to the portal follows below). 👉 Find out more on the CERN Open Data Portal: https://opendata.cern.ch/ 🥇 Tibor Šimko is the CERN IT Service Manager and Curator for Open Data

    ATLAS Collaboration

    The ATLAS Experiment at CERN has made two years of proton-proton collision data from the Large Hadron Collider (LHC) available to the public. This #opendata release includes 65 TB of data from over 7 billion collision events at 13 TeV, plus 2 billion simulated events for analysis. “Open access is a core value of CERN and the ATLAS Collaboration,” says Andreas Hoecker, ATLAS Spokesperson. “Since its beginning, ATLAS has strived to make its results fully accessible and reusable through open access archives. ATLAS has routinely released open data for educational purposes. Now, we’re taking it one step further — inviting everyone to explore the data that led to our discoveries.” 🔍 Researchers and enthusiasts can now access this data under the Creative Commons CC0 waiver. Comprehensive documentation and analysis guides are provided to help users navigate the data. Visit the ATLAS Open data portal to start exploring: opendata.atlas.cern

    ATLAS releases 65 TB of open data for research (atlas.cern)
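    As a hedged sketch only: the CERN Open Data portal exposes record metadata as JSON, and CERN also provides a dedicated command-line tool, cernopendata-client, for downloading files. The exact REST endpoint and metadata keys below are assumptions for illustration and should be checked against the portal's own documentation; the record id is a placeholder:

        # Hedged sketch of reading record metadata from the CERN Open Data portal.
        # The endpoint path and metadata keys are assumptions -- verify against
        # the documentation at opendata.cern.ch before relying on them.
        import requests

        RECORD_ID = 1  # placeholder record id, purely illustrative

        resp = requests.get(
            f"https://opendata.cern.ch/api/records/{RECORD_ID}",
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        record = resp.json()

        # Print a couple of commonly present metadata fields (keys may vary per record)
        metadata = record.get("metadata", {})
        print(metadata.get("title"))
        print(metadata.get("experiment"))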

  • CERN Computing

    Here, Wi-Fi coverage is extremely good, and the thousands of scientists present at CERN every day enjoy high-quality connectivity across the whole campus. The CERN network supports office networking requirements as well as a rapidly growing number of connected devices. It connects dedicated experiment networks to other CERN services and even carries physics data from smaller experiments to the data centre. Finally, where required, it enables users to protect devices from unwanted connections from the general Internet. A carefully planned multi-year upgrade project covering both the wired and Wi-Fi infrastructure has recently been completed. The upgrade was essential to support, for example, the use of softphones, which are now ubiquitous since the 30+ year-old phone exchange was recently phased out.

