[Infographic] Risk Management & HPC

In the post-2008 credit crisis world, the demand for and complexity of risk evaluation by financial institutions have steadily increased. Where large banks once performed risk analysis on a monthly basis, they’re now performing these calculations at least daily, if not in near-real-time. Cray makes this possible with the embedded Aries network, which eliminates the performance and latency issues of more traditional “fat tree” approaches, and with the DataWarp™ applications I/O accelerator, which effectively removes data access bottlenecks. See the infographic below for a look at Cray’s approach compared to legacy approaches – and to understand the difference in value. ... [ Read More ]
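
For a sense of what “these calculations” involve, here is a minimal, purely illustrative sketch (not Cray code, and not from the infographic) of a Monte Carlo value-at-risk estimate of the kind banks now rerun daily or intraday. The normal-return model and every parameter below are assumptions chosen only for illustration.

```python
import numpy as np

def monte_carlo_var(portfolio_value, mu, sigma, horizon_days,
                    n_paths=1_000_000, confidence=0.99):
    """Estimate value-at-risk by simulating portfolio returns over the horizon."""
    rng = np.random.default_rng(seed=42)
    # Draw simulated horizon returns under a simple normal model (illustrative only).
    returns = rng.normal(mu * horizon_days, sigma * np.sqrt(horizon_days), n_paths)
    losses = -portfolio_value * returns
    # VaR is the loss exceeded only (1 - confidence) of the time.
    return np.percentile(losses, confidence * 100)

# Hypothetical $1B portfolio with assumed daily drift and volatility.
print(f"1-day 99% VaR: ${monte_carlo_var(1e9, mu=0.0002, sigma=0.01, horizon_days=1):,.0f}")
```

Scaling runs like this to full trading books, many risk factors and intraday frequency is what drives the interconnect and I/O demands described above.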

5 Predictions of Where Supercomputing is Heading in 2016

From new processor technologies to quantum computing, 2016 promises to be another exciting year for supercomputing. Here are five predictions for how the industry will push ahead in 2016: 1) The year of progress in new processor technologies. There are a number of exciting technologies we should see in 2016, and a leader will be Intel’s next-generation Xeon Phi coprocessor – a hybrid between an accelerator and a general-purpose processor. This new class of processors will have a large impact on the industry with its innovative design that combines a many-core architecture with general-purpose productivity. Cray, for example, will be delivering Intel Xeon Phi processors with some of our largest systems, including those going to Los ... [ Read More ]
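
As a rough illustration of the many-core idea (splitting one numerical kernel across every available core), here is a hedged Python sketch using the standard library’s process pool. It is only a stand-in for the vectorized, OpenMP-style code one would actually run on a Xeon Phi, and the integrand and step counts are arbitrary.

```python
from concurrent.futures import ProcessPoolExecutor
import math
import os

def partial_sum(bounds):
    """Midpoint-rule integral of 4 / (1 + x^2) over one sub-interval."""
    start, stop, steps = bounds
    h = (stop - start) / steps
    return h * sum(4.0 / (1.0 + (start + (i + 0.5) * h) ** 2) for i in range(steps))

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    # Split the integral over [0, 1] into one chunk per core, the same way a
    # many-core processor spreads a kernel across its hardware threads.
    chunks = [(i / cores, (i + 1) / cores, 1_000_000) for i in range(cores)]
    with ProcessPoolExecutor(max_workers=cores) as pool:
        pi_estimate = sum(pool.map(partial_sum, chunks))
    print(f"pi ~= {pi_estimate:.10f} (error {abs(pi_estimate - math.pi):.2e}) on {cores} cores")
```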

Top 5 Blog Posts From 2015

Happy New Year! Check out some of the top blog posts from 2015: Why Do Better Forecasts Matter? Meteorologists have been striving to increase the accuracy of their work since the first weather forecasts were issued in the mid-19th century by pioneers such as Robert FitzRoy. In the 1950s and 1960s, the development of computers and observational technologies enabled forecast generation using the numerical weather prediction theories of Vilhelm Bjerknes and Lewis Fry Richardson. Today the numerical models that support our weather and climate forecasts are true “grand challenge” problems, requiring on the order of 10,000 trillion (10^16) mathematical operations and the exchange of 150 trillion (1.5 x 10^14) bytes of information to generate a ... [ Read More ]
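
Those two figures already tell the performance story. A quick back-of-the-envelope calculation (the sustained compute rate and interconnect bandwidth below are assumed values, not taken from the post) shows why data movement, rather than arithmetic, tends to dominate:

```python
# Scale of a single forecast run, using the figures quoted above.
operations = 1e16          # ~10,000 trillion mathematical operations
bytes_exchanged = 1.5e14   # ~150 trillion bytes moved between processes

sustained_flops = 1e15     # assumption: machine sustains 1 petaflops on this code
interconnect_bw = 1e12     # assumption: 1 TB/s aggregate interconnect bandwidth

compute_seconds = operations / sustained_flops        # 10 s of arithmetic
exchange_seconds = bytes_exchanged / interconnect_bw  # 150 s of data movement
print(f"compute: ~{compute_seconds:.0f} s, data exchange: ~{exchange_seconds:.0f} s")
```

At these assumed rates the data exchange takes roughly fifteen times longer than the arithmetic, which is why forecast centers weigh interconnect and I/O performance as heavily as raw flops.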

Can We Make Petabytes Useful?

A recent article in Nature, “The Power of Petabytes”, by Michael Eisenstein, reviews how exponentially increasing life science data exceeds our present ability to process and make sense of it. Yet even as they continue to grow, data sets are often still not large enough to draw convincing conclusions from. Computation is one obvious problem. “The computation scales linearly with respect to the number of people,” says Marylyn Ritchie, a genomics researcher at Pennsylvania State University in State College. “But as you add more variables, it becomes exponential as you start to look at different combinations.” To efficiently harness growing computing resources, researchers will need to leverage scalable algorithmic approaches and ... [ Read More ]
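
Ritchie’s linear-versus-exponential point is easy to make concrete. In the sketch below the variant and cohort counts are assumed round numbers, not figures from the article:

```python
from math import comb

n_variants = 1_000_000   # assumed: genetic variants measured per person
n_people = 10_000        # assumed: cohort size

# Testing variants one at a time scales linearly with the variant count and cohort size...
print(f"single-variant tests: {n_variants:,} (each over {n_people:,} people)")

# ...but testing interactions means examining combinations of variants,
# which blows up combinatorially as the interaction order k grows.
for k in (2, 3):
    print(f"{k}-way variant combinations: {comb(n_variants, k):.3e}")
```

Two-way interactions alone already mean roughly 5 x 10^11 tests, and three-way interactions about 1.7 x 10^17, each still evaluated across the whole cohort.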

Controlling the Tide of Weather and Climate Data with Tiered Storage

There’s an old cliché that everyone talks about the weather, but no one does anything about it. While Cray can’t (yet) prevent droughts or cool off hot spells, we can help make the lives of weather professionals easier. An abundance of data riches, but where to store it? Weather and climate modeling centers strive to improve the accuracy of their models by gathering and assimilating more diverse input data and by increasing model resolution and complexity. As a result, these models are ingesting and producing ever-increasing volumes of data. These weather and climate organizations often find themselves challenged by the sheer volume of data, trying to manage the various ways it may be used and simply trying to find the resources, ... [ Read More ]
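
To make “tiered storage” concrete, here is a deliberately simplified sketch of one possible tiering policy: move files that have gone untouched for 30 days from a fast scratch tier to an archive tier. The paths, the threshold, and the policy itself are hypothetical; production hierarchical storage management software applies far richer rules.

```python
import time
from pathlib import Path
from shutil import move

# Hypothetical tier locations and policy threshold; adjust to a site's layout.
FAST_TIER = Path("/scratch/forecasts")     # e.g., flash-based parallel file system
ARCHIVE_TIER = Path("/archive/forecasts")  # e.g., tape- or object-backed capacity tier
MAX_AGE_DAYS = 30

def migrate_cold_files(now=None):
    """Move files untouched for MAX_AGE_DAYS from the fast tier to the archive tier."""
    now = now or time.time()
    for path in FAST_TIER.rglob("*"):
        if path.is_file() and (now - path.stat().st_atime) > MAX_AGE_DAYS * 86_400:
            destination = ARCHIVE_TIER / path.relative_to(FAST_TIER)
            destination.parent.mkdir(parents=True, exist_ok=True)
            move(str(path), str(destination))

if __name__ == "__main__":
    migrate_cold_files()
```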