By Lucia Whalen
Medill Reports
Hamburg is home to one of the fastest supercomputers in the world at the German Climate Computing Center (DKRZ). The machine churns through a global tsunami of climate data to build the climate models behind landmark blueprints for the future, including the most recent Intergovernmental Panel on Climate Change (IPCC) report. DKRZ is the only high-performance computing center in Europe dedicated to climate research.
Supercomputers are responsible for some of the pioneering breakthroughs in modern science. From biology and space physics to projecting the effects of global climate change, supercomputers crunch the gargantuan calculations scientists need to build models and analyze data. They have become an essential tool for climate forecasting because the sheer volume of computation behind a climate simulation would take years on an ordinary computer.
Climatology is unlike fields of science where the standard is data collection and controlled experimentation; researchers cannot run experiments on the planet itself. For many questions in climatology, computer simulations grounded in observational data and past research are the only way to make projections about the future. Simulations are often the best chance to understand processes such as how ice sheets build up and disappear.
Michael Böttinger, head of visualizations and public relations at the Climate Computing Center, is responsible for turning the data produced by the simulations into images, such as colorful maps projecting changes in ocean heat. According to Böttinger, supercomputer climate modeling is essential if humans are to understand where the planet is heading. Scientists base recommendations for mitigation and response on the predictions.
“If we want to gain a better understanding and be prepared for the changes in the climate that are to come, there’s no other option than to use [supercomputers]. They are the best way to gain a better understanding of the climate system today and of the changes that we have started to impose on the climate system,” Böttinger said.
The German Climate Computing Center is unique in offering free resources to researchers in the earth sciences, including computational time, hard-drive space and visualizations. Scientists can apply to DKRZ for 12 months of supercomputer use.
Large-scale supercomputers, first deployed by the Control Data Corporation in 1964, were created in response to the need for faster climate computing and for complex science problems, according to Katherine Riley, director of science at the Argonne Leadership Computing Facility. The Human Genome Project, completed in 2003, shattered previously held beliefs about the makeup of human genetics and was analyzed on a supercomputer at Oak Ridge National Laboratory. Given the huge amount of genetic data and the hundreds of contributors whose work had to be integrated, the project likely could not have been completed without that computing power.
Outside the world of scientific research, many people may not realize that the average MacBook is inadequate for problems on the scale climate scientists tackle. A supercomputer is, in essence, thousands of smaller computers wired together to work on pieces of the same problem in parallel.
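A minimal sketch of that idea, using Python's standard multiprocessing library rather than the specialized software real climate centers run: a made-up "global grid" is split across worker processes, each handling one cell, much as a climate model divides the planet across thousands of nodes. The grid values and the per-cell function are illustrative assumptions, not drawn from any actual model.

# Illustrative sketch only: a toy analogue of splitting a climate grid
# across many processors. Real centers like DKRZ use thousands of nodes;
# here we spread a simple per-cell calculation across one machine's cores.
from multiprocessing import Pool

def average_cell(readings):
    """Stand-in for the physics a model would compute at one grid cell."""
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # A made-up "grid": each entry holds temperature readings for one cell.
    grid = [
        [15.2, 15.8, 16.1],   # a temperate cell
        [2.4, 2.9, 3.1],      # a polar cell
        [28.0, 27.5, 27.9],   # a tropical cell
    ]
    with Pool() as pool:  # one worker process per available CPU core
        cell_averages = pool.map(average_cell, grid)
    print(cell_averages)

Because each cell can be computed independently, adding more workers speeds up the whole job, which is the same logic that makes a machine built from thousands of small computers so fast.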
Supercomputers are constantly evolving, and systems must be replaced every few years to keep pace with competing supercomputing labs.
In the United States, Aurora, the next-generation supercomputer, is set to launch at Argonne National Laboratory near Chicago in 2021. While the Argonne Leadership Computing Facility is not solely focused on climate modeling like the German Climate Computing Center, climate simulations are a major part of its output.
Aurora will eclipse every other supercomputer for a time as U.S. universities, researchers and companies reserve time on it to solve complex climate, energy and health problems. Argonne replaces its systems every five years, in part to remain a global competitor in supercomputing. That competition may not be a bad thing: it encourages the creation of more high-powered computing systems and, in turn, more accurate climate models.
“It’s a little bit of an arms race between us and other countries, and even internally. Building these systems and being number one [matters] – having these systems of particularly big sizes that are bleeding-edge technology. The reason the facilities are here is to drive the science that’s going to be using them,” Riley said.
The German Climate Computing Center typically replaces its supercomputer every five or six years, deploying each new system in two phases. The first phase of the current system was installed in 2015 and the second in 2016. The first phase of the next system will likely go into production in 2021, with the second phase launching the following year.