The Critical Role of Supercomputers in Weather Forecasting

Weather and climate shape economies and infrastructure. From food supply to recreational activities to energy resources, they touch nearly every aspect of our daily lives. Unfortunately, weather and climate also cause great losses, both human and economic. In 2012, losses from natural disasters and extreme weather events totaled about $160 billion, underscoring the need for continued advances in observational and predictive capabilities.

Supercomputing plays an enormous role in weather forecasting, particularly as the practice evolves to keep pace with constantly changing weather patterns. Per Nyberg, director of business development with Cray, has worked extensively with leading weather and climate centers worldwide. He recently discussed how supercomputing has helped change the game for weather forecasters.

How is the weather forecasting industry changing in response to climate change and the recent frequency of extreme storms in many parts of the world?

The socio-economic impacts of improved observational and predictive capabilities are well recognized, and the meteorological community is truly international and close-knit. Predictive capabilities have been continuously improving through the combined efforts of the World Meteorological Organization, national meteorological and hydrological services, and researchers in government and academia. Improved simulation accuracy and execution time are vital, and the meteorological community invests in the observational and computing infrastructure necessary to support increasingly sophisticated forecast models.

How is the supercomputing industry responding to new demands in the weather forecasting sector?

Meteorology continues to push the limits of supercomputing. Scientific objectives have always outpaced the available computational infrastructure. The largest operational centers will soon be surpassing a petaflop (a quadrillion floating point operations per second) of peak performance. Weather forecasting centers are therefore faced with infrastructure that is increasingly complex and energy demanding. In addition, environmental observational data is one of the “Big Data” deluge areas that is growing exponentially. The challenge to the supercomputing industry is to provide tightly integrated solutions that address not just computing requirements, but data movement, data management and total cost of ownership. Providing such a solution depends on more than just technology; it requires the experience of architecting and delivering systems of this scale and complexity.

What types of simulations and modeling techniques are most important for weather forecasting?

Weather forecasting uses a number of different types of simulations, modeling techniques and components at various spatial and temporal scales. At a high level, a typical workload is composed of data assimilation, deterministic forecast models and ensemble forecast models. In addition, specialized models may be applied for areas such as extreme events, air quality, ocean waves and road conditions.
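
To make the ensemble idea above concrete, here is a minimal, purely illustrative Python sketch: the same forecast model is run from slightly perturbed copies of the assimilated state, and the spread across members gives a rough measure of forecast uncertainty. The toy model, perturbation scale and function names are hypothetical stand-ins, not any operational center's code.

    import numpy as np

    def toy_forecast_model(state, steps=24):
        # Hypothetical stand-in for a deterministic forecast model:
        # advances a state vector forward in time with placeholder dynamics.
        state = state.copy()
        for _ in range(steps):
            state = 0.99 * state + 0.01 * np.roll(state, 1)
        return state

    def run_ensemble(analysis, n_members=20, perturbation=0.1, seed=0):
        # Run the same model from perturbed copies of the analysis state
        # produced by data assimilation; return ensemble mean and spread.
        rng = np.random.default_rng(seed)
        members = []
        for _ in range(n_members):
            perturbed = analysis + perturbation * rng.standard_normal(analysis.shape)
            members.append(toy_forecast_model(perturbed))
        members = np.stack(members)
        return members.mean(axis=0), members.std(axis=0)

    analysis_state = np.sin(np.linspace(0, 2 * np.pi, 100))  # toy "analysis" field
    mean_forecast, spread = run_ensemble(analysis_state)
    print(mean_forecast[:3], spread[:3])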

Meteorological models are data-intensive. Since the first computer weather forecasts in the 1950s, an increasingly comprehensive and explicit representation of the Earth’s physical processes, increased observational capabilities, and increased model resolution have been constant drivers. What challenges scientists and forecasters today is that each of these dimensions is important. For example, higher spatial resolution models have been identified as a critical element in better estimating the long-term state of climate systems and in improving weather forecasting, particularly for severe weather events.
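
As a rough, back-of-envelope illustration of why resolution is such a demanding driver (an assumption-laden sketch, not a figure from the interview): halving the horizontal grid spacing roughly quadruples the number of grid columns and, because stability conditions typically shrink the allowable time step in proportion, roughly doubles the number of time steps, so an otherwise identical forecast costs about eight times as much.

    def relative_cost(refinement, horizontal_dims=2, timestep_limited=True):
        # Rough cost multiplier when grid spacing shrinks by `refinement`:
        # more grid columns (refinement ** dims) and, if the time step is
        # stability-limited, proportionally more time steps as well.
        cost = refinement ** horizontal_dims
        if timestep_limited:
            cost *= refinement
        return cost

    print(relative_cost(2))   # halving grid spacing: ~8x the work
    print(relative_cost(5))   # e.g. 25 km down to 5 km spacing: ~125x the work

In practice, added vertical levels and more detailed physics add further cost on top of this, which is part of why resolution increases track available computing so closely.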

What high-performance computing architectures are best suited for contemporary weather forecasting requirements?

The computational and data requirements for simulations at appropriate spatial and temporal scales are immense. Climate and weather models are typically constrained by memory and interconnect latencies and bandwidths, so potential or actual model performance is not determined by the simple peak performance of the system. Stagnating microprocessor clock speeds are forcing centers to scale their models across thousands of processors in parallel to reach the performance levels needed to run forecasts within operational constraints. Ultimately, a scalable computing architecture that focuses on both computational intensity and efficient data movement will be best suited. Equally important to hardware technologies is software: scalable operating systems, compilers and application libraries play an essential role in achieving the highest sustained performance.
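
As a simple illustration of the scaling pressure described above (an Amdahl's-law back-of-envelope, not anything specific to a particular center's model): when clock speeds no longer improve, the only way to shorten time-to-forecast is more parallelism, and even a small non-parallel fraction of the code caps the achievable speedup, which is why scalable algorithms, interconnects and software all matter together.

    def amdahl_speedup(parallel_fraction, n_processors):
        # Amdahl's law: speedup of a fixed-size job on n processors,
        # given the fraction of work that can run in parallel.
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / n_processors)

    # With 99% of the work parallelized, speedup saturates near 100x
    # no matter how many processors are added:
    for n in (100, 1_000, 10_000):
        print(n, round(amdahl_speedup(0.99, n), 1))   # 50.3, 91.0, 99.0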

How important are storage and networking requirements in weather forecasting centers?

Data movement and data management are critical aspects of the infrastructure, from supporting real-time observational data streams and parallel I/O performance for simulations running on thousands of cores, to the archival and stewardship of model-generated and observational data. The storage infrastructure connected to the supercomputers can also be quite complex due to requirements for multiple redundant systems and emergency failover. The data expertise within the meteorological community is very sophisticated.

How do you see weather forecasting needs and supercomputing capabilities aligning during the next five years?

The next few years are going to be very exciting for the meteorological and supercomputing communities. Meteorology has always pushed the limits of supercomputing. Looking forward, larger computational capabilities will support the routine execution of very high resolution forecast and earth system climate models, resulting in improved forecasting capabilities. From a data perspective, the science and engineering community as a whole is augmenting simulation-based approaches with data-driven models for predictive modeling and knowledge discovery. The meteorological community is ideally positioned to continue its influential role in this next stage of supercomputing.

Per Nyberg, Director of Business Development 
