MSU Turns to Liquid-Cooled Cluster Supercomputer


Mississippi State University needed a powerful and efficient new primary supercomputing system for its High Performance Computing Collaboratory (HPC2) – a coalition focused on advancing the state of the art in computational science and engineering using high performance computing. The university chose the Cray® CS300™ liquid-cooled cluster supercomputer. Nicknamed “Shadow,” MSU’s new Cray CS300-LC cluster generates 316.1 teraflops of peak performance while using minimal energy. This efficiency comes in part from a hybrid architecture featuring Intel® Xeon® processors and Intel® Xeon Phi™ coprocessors, and in part from the system’s use of warm water for cooling. Almost four years ago, the University installed its first chilled ... [ Read More ]

What’s in a (Supercomputer) Name?


Think “discovery” and you can see why NERSC, the National Energy Research Scientific Computing Center, names its supercomputers after famous scientists and inventors. Operated by Lawrence Berkeley National Laboratory and serving the Department of Energy’s Office of Science, NERSC has a long history of using high-end Cray supercomputers to drive scientific discovery. Its latest Cray machine, “Edison,” a Cray® XC30™ supercomputer named after Thomas Edison, has been up and running for much of 2013, with a formal dedication ceremony held last week at Berkeley Lab in California. Supercomputer predecessors at NERSC include “Hopper,” a Cray® XE6™ named after pioneering computer scientist Grace Hopper, and “Franklin,” a Cray® XT4™, ... [ Read More ]

When Applications Go Exascale — The CRESTA Project


Seymour Cray, the pioneer of supercomputing, famously asked if you would rather plough a field with two strong oxen or 1024 chickens. The question has since been answered for us: power restrictions have driven CPU manufacturers away from “oxen” (powerful single-core devices) towards multi- and many-core “chickens.” An exascale supercomputer will take this a leap further, connecting tens of thousands of many-core nodes and leaving application programmers with the challenge of efficiently harnessing the computing power of tens of millions of threads. This challenge applies not just to the applications themselves, but to everything underneath — from the operating and runtime systems, through the communication and scientific libraries to the compilers ... [ Read More ]
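The oxen-versus-chickens shift can be sketched as a toy domain decomposition in Python (purely illustrative: real exascale codes use MPI across nodes plus node-level threading, and worker counts are in the millions, not eight):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each "chicken" independently handles one slice of the problem domain.
    return sum(x * x for x in chunk)

data = list(range(1_000_000))
n_workers = 8  # an exascale machine would have tens of millions of threads
size = len(data) // n_workers
slices = [data[i * size:(i + 1) * size] for i in range(n_workers)]

# Scatter the slices to the workers, then combine their partial results.
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    total = sum(pool.map(partial_sum, slices))
```

The hard part at exascale is exactly what this sketch hides: keeping millions of such workers busy while the cost of the final combine step, and of any communication between slices, stays negligible.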

Hadoop for Scientific Big Data – Maslow’s Hammer or Swiss Army Knife?


The Law of the Instrument (aka Maslow’s Hammer) is an over-reliance on a familiar tool — or, as my grandfather used to say, “when the only tool you have is a hammer, every problem begins to resemble a nail.” I’m sure this principle comes to mind as many professionals in the scientific computing community are asked to investigate the appropriateness of Hadoop® — today’s most familiar Big Data tool. Used inappropriately, with technologies not suited to scientific Big Data, Hadoop may indeed feel like wielding a cumbersome hammer. But used appropriately, with a technology stack specifically suited to the realities of scientific Big Data, Hadoop can feel like a Swiss Army knife — a ... [ Read More ]
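The MapReduce pattern at the heart of Hadoop can be sketched in a few lines of plain Python (an in-memory toy, not the distributed framework; the function names and the word-count job are illustrative):

```python
from collections import defaultdict

def map_phase(records, map_fn):
    # Apply the user's map function to every record, yielding (key, value) pairs.
    for record in records:
        yield from map_fn(record)

def shuffle(pairs):
    # Group intermediate values by key, as Hadoop's shuffle/sort stage does.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reduce_fn):
    # Apply the user's reduce function to each key's grouped values.
    return {key: reduce_fn(key, values) for key, values in groups.items()}

# The canonical word-count job expressed as map and reduce functions.
def wc_map(line):
    for word in line.split():
        yield word.lower(), 1

def wc_reduce(word, counts):
    return sum(counts)

lines = ["Hadoop is a hammer", "or Hadoop is a Swiss Army knife"]
counts = reduce_phase(shuffle(map_phase(lines, wc_map)), wc_reduce)
# counts["hadoop"] is 2: both input lines mention Hadoop once.
```

What makes the real framework interesting is everything this sketch omits — partitioning the shuffle across machines, moving computation to the data and tolerating node failures — and it is precisely that machinery whose fit for scientific data the community is being asked to judge.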

Infographic: The Cray Transformation


At Cray, we have been building high-performance computing systems for more than 40 years. In that time we have remained at the cutting edge of technology, providing our customers with the tools they need to solve some of their most demanding computational problems. This infographic illustrates our transformation and our broadened product portfolio, which now spans supercomputing, storage and big data solutions. ... [ Read More ]