Cray® CS300™ Cluster Supercomputers for Hadoop®

High performance clusters supporting hybrid workloads

Cray CS300-AC Cluster Supercomputer

Cray cluster supercomputers are used to address the most demanding computing needs in the scientific community. Increasingly, organizations are also turning to frameworks like Hadoop® to process and analyze the large volumes of scientific data generated by traditional HPC workloads, combining that data with external datasets and real-time sources such as sensor networks to aid advanced scientific discovery.

Using Hadoop to process big data requires optimal performance across a wide range of analytic workloads, combined with the ability to scale out infrastructure to handle ever-increasing volumes and types of data. In addition, scientific computing users need to maximize shared computing resources, so the ability to schedule and run analytics workloads on demand, side by side with traditional scientific computing jobs, is a key emerging need in supercomputing installations.


The Cray® CS300™ cluster supercomputer series provides the tools necessary to handle mixed workloads, enabling analytics for scientific computing. By combining job scheduling, resource management and Cray’s Advanced Cluster Engine (ACE™) system management software in a single foundation, Cray cluster supercomputers allow users to derive maximum value from Hadoop while retaining the ability to run traditional HPC workloads on the same cluster.
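To make the side-by-side workload model concrete, the sketch below shows one common way analytics jobs are submitted through a cluster's batch scheduler alongside HPC jobs. This is an illustrative example only, assuming a Slurm-managed cluster with Hadoop installed; the queue name, node count, and data paths are hypothetical, not Cray CS300 defaults.

```shell
#!/bin/bash
# Hypothetical sketch: submitting a Hadoop MapReduce job via a batch
# scheduler (Slurm assumed here) so it is scheduled on demand, side by
# side with traditional HPC jobs on the same cluster.
#SBATCH --job-name=hadoop-wordcount
#SBATCH --nodes=4                 # illustrative node count
#SBATCH --partition=analytics     # hypothetical queue for analytics work

# Run the stock Hadoop example job against data already staged in HDFS.
# Input and output paths are placeholders for this sketch.
hadoop jar "$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples.jar" \
    wordcount /data/input /data/output
```

In practice, the scheduler's resource limits decide when and where the Hadoop job runs, which is what lets a single cluster serve both analytics and simulation workloads without static partitioning.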

  • Flexible. HPC and Apache™ Hadoop® analytics workloads run on the same cluster supercomputer.
  • High performance. Validated and optimized technologies deliver performance for the most demanding hybrid workloads.
  • Reliable. Consistent service levels are built on proven technologies such as Cray’s ACE system management software.
  • Maintainable. Systems are easy to maintain and update with whole-solution management tools and services.


Apache Hadoop, Hadoop and the yellow elephant logo are either registered trademarks or trademarks of the Apache Software Foundation in the United States and/or other countries.