Channel: News for CERN Community feed

Environmental awareness: the challenges of CERN’s IT infrastructure


Some 90 petabytes of data are produced each year by the LHC experiments, and a further 25 petabytes by other, non-LHC experiments. Processing and storing these vast amounts of data makes CERN one of the most highly demanding computing environments in the research world, and the Organization’s powerful IT infrastructure reaches beyond particle physics to support many other disciplines and fields.

The use of digital technologies and IT infrastructures has several environmental implications, chief among them the lifetime of computing equipment and infrastructure and the energy consumed by data processing and data storage.

In general, the lifetime of an IT server at CERN is four years. However, these lifecycles evolve all the time: only ten years ago the Data Centre hardware had a much shorter lifetime. Since 2012 CERN has been regularly donating computing equipment that no longer meets its highly specific efficiency standards but is still more than adequate for less exacting environments.

Energy consumption related to data processing accounts for some 75% of the energy used for data centre activities at CERN, with data storage representing the rest. The Organization’s IT department operates the CERN Data Centre on the Meyrin site, as well as a second network hub in Prévessin and, on a temporary basis, two modular data centres. In 2021, CERN’s main data centre had an average power requirement of 4.14 MW for its IT load, resulting in a total energy consumption of about 37 GWh over the year. This figure includes the cooling needed for the operation of the Data Centre and corresponds to around 4% of CERN’s total energy consumption in 2021 (which amounted to 954 GWh).
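The figures above can be checked with some back-of-the-envelope arithmetic (a sketch using only the numbers quoted in this article):

```python
# Sanity check of the 2021 figures quoted above.
avg_it_load_mw = 4.14      # average IT power requirement, MW
hours_per_year = 365 * 24  # 8760 hours in a non-leap year

# Energy = power x time; MW x h gives MWh, divide by 1000 for GWh.
it_energy_gwh = avg_it_load_mw * hours_per_year / 1000
print(f"IT load alone: ~{it_energy_gwh:.1f} GWh")  # ~36.3 GWh

# Adding cooling brings the quoted total to about 37 GWh.
total_data_centre_gwh = 37
cern_total_gwh = 954
share = total_data_centre_gwh / cern_total_gwh
print(f"Share of CERN's total consumption: ~{share:.1%}")  # ~3.9%, i.e. around 4%
```

The IT load alone accounts for roughly 36.3 GWh, so cooling overhead makes up the remaining fraction of the 37 GWh total.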

The current CERN Data Centre building was constructed in 1971 and has housed a wide variety of equipment over the years, from mainframes to supercomputers and from commodity computing to today’s rack computing. Teams of experts continually examine the status of the computing equipment and carefully plan optimum schedules for maintenance and upgrades, based on detailed calculations related to performance, cost and energy efficiency. Such planning is also carried out for data storage: one of the main advantages of the tape drives used in the Data Centre for long-term storage is that they do not consume energy when not being written to or read from.

A new data centre will soon be constructed on the Prévessin site, scheduled to be operational in the second half of 2023. Energy efficiency is a central consideration in this project, with a plan to include a heat recovery system that could be used to heat buildings on the site. The initial available IT capacity is expected to be 4 MW, with possible upgrades anticipated at a later stage. CERN aims for the new data centre to have a power usage effectiveness (PUE – an indicator used for measuring the energy efficiency of a data centre) of ≤ 1.1. To put this in context, the global average PUE for large data centres is around 1.5, with new data centres typically achieving a PUE between 1.2 and 1.4.  
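Power usage effectiveness is simply the ratio of total facility energy to the energy delivered to the IT equipment, so a PUE target directly bounds the allowable overhead. A minimal sketch, using the 4 MW initial IT capacity quoted above as an illustrative load:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power.

    1.0 is the theoretical ideal (all energy goes to computing).
    """
    return total_facility_kw / it_load_kw

it_load = 4000.0  # kW, the planned initial IT capacity

# To meet a PUE of <= 1.1, the overhead (cooling, power distribution,
# lighting, ...) must stay within 10% of the IT load:
max_overhead = it_load * (1.1 - 1.0)  # 400 kW
print(pue(it_load + max_overhead, it_load))  # 1.1
```

By the same formula, the global average of around 1.5 means a typical large data centre spends an extra 50% of its IT load on overhead, which is why a ≤ 1.1 target is ambitious.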

Today, the Worldwide LHC Computing Grid (WLCG) consists of 170 computer centres in 42 countries. With the CERN Data Centre at its heart, WLCG now provides 1 million computer cores and 1 exabyte of storage. Teams across CERN and the experiments are working hard to modernise code, finding ways to make it run more efficiently on the latest hardware, thus saving resources and energy. The scale of this global network means that small efficiency savings in popular code can make a really big difference.

By the time the High-Luminosity LHC starts up at the end of 2028, the total computing capacity required by the experiments is expected to be ten times greater than today. A key focus is developing innovative approaches to key computing tasks, often based on machine learning and related technologies, to reduce the overall amount of computing resources needed and thus play a vital role in reducing energy consumption.

In 2017, the high-energy physics community produced a roadmap for HEP software and computing R&D for the 2020s that explores future software challenges. It discusses improvements in software efficiency, scalability and performance, as well as new approaches, such as modernising code that can run more efficiently on the latest hardware to generate significant energy savings. Building on this roadmap, WLCG and the LHC experiments have developed an HL-LHC computing strategy, the implementation of which is regularly reviewed by the LHC scientific committee (LHCC).

In January 2022, CERN’s IT department launched an impact study on energy consumption for data storage and data processing. The aim is to get more extensive and accurate estimates across different uses and facilities, helping the department to create a more energy-efficient IT infrastructure. The study’s results are expected later this year.

Publication date: Wednesday, 9 March 2022, 11:15 (thortala)
