[Infographic] Risk Management & HPC

In the post-2008 credit crisis world, the demand for and complexity of risk evaluation by financial institutions has steadily grown. Where large banks once performed risk analysis on a monthly basis, they’re now performing these calculations at least daily, if not in near-real-time. Cray makes this possible with the embedded Aries network, which eliminates the performance and latency issues of more traditional “fat tree” approaches, and with the DataWarp™ applications I/O accelerator, which effectively removes data access bottlenecks. See the infographic below for a look at Cray’s approach compared to legacy approaches – and to understand the difference in value. ... [ Read More ]
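To make the workload concrete, here is a minimal Monte Carlo value-at-risk sketch in Python with NumPy. The portfolio size, scenario count and distribution parameters are hypothetical (not taken from the post), but the pattern – many independent scenarios producing large intermediate results – is what drives both the compute and the data-access demands described above.

```python
import numpy as np

def monte_carlo_var(n_positions=1_000, n_scenarios=100_000,
                    confidence=0.99, chunk=10_000, seed=0):
    """Estimate 1-day VaR for a hypothetical linear portfolio."""
    rng = np.random.default_rng(seed)
    exposures = rng.uniform(1e4, 1e6, size=n_positions)   # dollar exposure per position
    vols = rng.uniform(0.005, 0.03, size=n_positions)     # hypothetical daily volatilities
    losses = []
    # Simulate scenarios in chunks so memory stays bounded; on a large system,
    # each chunk's intermediate results might be staged to fast scratch storage
    # (e.g. a burst buffer) instead of being held in RAM.
    for start in range(0, n_scenarios, chunk):
        size = min(chunk, n_scenarios - start)
        returns = rng.normal(0.0, vols, size=(size, n_positions))
        losses.append(-(returns * exposures).sum(axis=1))
    losses = np.concatenate(losses)
    return np.quantile(losses, confidence)

if __name__ == "__main__":
    print(f"Estimated 1-day 99% VaR: ${monte_carlo_var():,.0f}")
```

Running this kind of calculation monthly is cheap; running it daily or intraday across a bank-sized book is what pushes risk analytics onto HPC-class systems.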

5 Predictions of Where Supercomputing is Heading in 2016

From new processor technologies to quantum computing, 2016 promises to be another exciting year for supercomputing. Here are five predictions for how the industry will push ahead in 2016: 1. The year of progress in new processor technologies. There are a number of exciting technologies we should see in 2016, and a leader will be Intel’s next-generation Xeon Phi coprocessor – a hybrid between an accelerator and a general-purpose processor. This new class of processors will have a large impact on the industry with its innovative design, which combines a many-core architecture with general-purpose productivity. Cray, for example, will be delivering Intel Xeon Phi processors with some of our largest systems, including those going to Los ... [ Read More ]

Top 5 Blog Posts From 2015

Happy New Year! Check out some of the top blog posts from 2015: Why Do Better Forecasts Matter? Meteorologists have been striving to increase the accuracy of their work since the first weather forecasts were issued in the mid-19th century by pioneers such as Robert FitzRoy. In the 1950s and 1960s, the development of computers and observational technologies enabled forecast generation using the numerical weather prediction theories of Vilhelm Bjerknes and Lewis Fry Richardson. Today the numerical models that support our weather and climate forecasts are true “grand challenge” problems, requiring on the order of 10,000 billion (10^16) mathematical operations and the exchange of 150 billion (1.5 x 10^14) bytes of information to generate a ... [ Read More ]
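As a rough sanity check on those figures, here is a small back-of-envelope calculation. The sustained-performance and bandwidth numbers below are illustrative assumptions, not anything stated in the post.

```python
# Figures quoted above: ~1e16 operations and ~1.5e14 bytes per forecast run.
ops_per_forecast = 1e16      # mathematical operations
bytes_exchanged  = 1.5e14    # bytes of information exchanged

# Illustrative machine assumptions (not from the post):
sustained_flops  = 1e15      # 1 petaflop/s sustained across the system
interconnect_bw  = 1e13      # 10 TB/s aggregate interconnect bandwidth

compute_seconds  = ops_per_forecast / sustained_flops
exchange_seconds = bytes_exchanged / interconnect_bw

print(f"compute time at 1 PF/s sustained: ~{compute_seconds:.0f} s")
print(f"data exchange time at 10 TB/s:    ~{exchange_seconds:.0f} s")
```

Under these assumptions either term is only tens of seconds, but operational centers run many such forecasts and ensemble members against fixed delivery deadlines, which is part of what makes them “grand challenge” problems.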

Scratching an Itch: Mitigating HPC Scratch Bottlenecks

“I have a simple philosophy. Fill what's empty. Empty what's full. And scratch where it itches.” – Alice Roosevelt Longworth. When you orchestrate HPC operations, scratch space often comes to the fore as a limiting factor: too small, too big or too slow. Many types of jobs (seismic, life sciences, financial services, etc.) don’t do all their work purely in memory. Each of many workers loads an initial configuration and reads/writes many intermediate results to scratch storage, often shared, before the total job completes. As bottlenecks shift, huge I/O problems can easily replace huge computational problems, and scratch management becomes a common issue. Traditional HPC systems often have local storage on each ... [ Read More ]
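The access pattern described above is easy to mimic. In this sketch (paths, sizes and step counts are hypothetical), a worker alternates compute steps with intermediate reads and writes to a scratch directory, and times the two phases separately; as scratch gets slower or more contended, the I/O term is where the bottleneck shows up.

```python
import os, tempfile, time
import numpy as np

def run_worker(scratch_dir, steps=5, array_mb=64):
    """One worker: load an initial state, then alternate compute with scratch I/O."""
    n = array_mb * 1024 * 1024 // 8                 # float64 elements per checkpoint
    state = np.random.default_rng(0).random(n)      # stand-in for the initial configuration
    compute_s = io_s = 0.0
    for step in range(steps):
        t0 = time.perf_counter()
        state = np.sqrt(state * 1.0001) + 0.5       # stand-in for real computation
        compute_s += time.perf_counter() - t0

        t0 = time.perf_counter()
        path = os.path.join(scratch_dir, f"ckpt_{step}.npy")
        np.save(path, state)                        # write an intermediate result
        state = np.load(path)                       # read it back, as a downstream step would
        io_s += time.perf_counter() - t0
    return compute_s, io_s

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as scratch:  # substitute a real (shared) scratch path here
        compute_s, io_s = run_worker(scratch)
        print(f"compute: {compute_s:.2f} s, scratch I/O: {io_s:.2f} s")
```

Point the worker at a slow or heavily shared filesystem instead of a local temporary directory and the I/O share of the runtime grows accordingly.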

Controlling the Tide of Weather and Climate Data with Tiered Storage

There’s an old cliché that everyone talks about the weather, but no one does anything about it. While Cray can’t (yet) prevent droughts or cool off hot spells, we can help make the lives of weather professionals easier. An abundance of data riches – but where to store it all? Weather and climate modeling centers strive to improve the accuracy of their models by gathering and assimilating more diverse input data and by increasing model resolution and complexity. As a result, these models are ingesting and producing ever-increasing volumes of data. These organizations often find themselves challenged by the sheer volume of data, trying to manage the various ways it may be used and simply trying to find the resources, ... [ Read More ]
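Tiered storage is one common answer to that volume problem. As a purely illustrative sketch (the directory names and the 30-day threshold are assumptions, not anything from the post), an age-based policy that migrates model output from a fast tier to a capacity tier might look like this:

```python
import os, shutil, time

def migrate_cold_files(fast_tier, capacity_tier, max_idle_days=30):
    """Move files not accessed within max_idle_days from the fast tier to the capacity tier."""
    cutoff = time.time() - max_idle_days * 86400
    os.makedirs(capacity_tier, exist_ok=True)
    for name in os.listdir(fast_tier):
        src = os.path.join(fast_tier, name)
        if os.path.isfile(src) and os.path.getatime(src) < cutoff:
            shutil.move(src, os.path.join(capacity_tier, name))
            print(f"migrated {name} to the capacity tier")

# Hypothetical usage:
# migrate_cold_files("/fast_scratch/model_output", "/capacity_tier/archive")
```

A production tiering system automates this kind of policy transparently, at much larger scale and with richer rules than file age alone.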