Top 5 Blog Posts From 2015


Happy New Year! Check out some of the top blog posts from 2015:

Why Do Better Forecasts Matter?

Meteorologists have been striving to increase the accuracy of their work since the first weather forecasts were issued in the mid-19th century by pioneers such as Robert FitzRoy. In the 1950s and 1960s, the development of computers and observational technologies enabled forecast generation using the numerical weather prediction theories of Vilhelm Bjerknes and Lewis Fry Richardson. Today the numerical models that support our weather and climate forecasts are true “grand challenge” problems, requiring on the order of 10,000 trillion (10^16) mathematical operations and the exchange of 150 trillion (1.5 x 10^14) bytes of information to generate a ... [ Read More ]

Alan Turing Institute Will Lead Research in Data Science


Cray is partnering with the Alan Turing Institute, the new U.K. data science research organization in London, to support the U.K. as it expands data science research for the benefit of academia and industry. Earlier this month Fiona Burgess, U.K. senior account manager, and I attended the launch of the institute. At the event, U.K. Minister for Science and Universities Jo Johnson paid tribute to Turing and his work. Institute director Professor Andrew Blake told the audience that the Turing Institute is about much more than just big data – it is about data science: analysing that data and gaining a new understanding that leads to decisions and actions. Alan Turing was a pioneering British computer scientist. He has become a household name ... [ Read More ]

Can We Make Petabytes Useful?


A recent article in Nature, “The Power of Petabytes,” by Michael Eisenstein reviews how exponentially growing life science data exceeds our present ability to process and make sense of it. Yet even as they grow without bound, data sets are often still not large enough to support convincing conclusions. Computation is one obvious problem. “The computation scales linearly with respect to the number of people,” says Marylyn Ritchie, a genomics researcher at Pennsylvania State University in State College. “But as you add more variables, it becomes exponential as you start to look at different combinations.” To efficiently harness growing computing resources, researchers will need to leverage scalable algorithmic approaches and ... [ Read More ]

From Grand Challenge Applications to Critical Workflows for Exascale | Part I


(This is the first in a series of three posts. The second will discuss critical workflows in oil and gas and the life sciences, while the last will speculate about machine learning techniques to optimize such workflows.) A bit of background: In the late ’80s, U.S. federal government agencies became convinced that a substantial effort to fund R&D in high performance computing (HPC) was required to address so-called “grand challenges” — fundamental problems in science and engineering that are ambitious (requiring some advances in science and technology) but achievable. In 1992, the Office of Science and Technology Policy (OSTP) released a recommendation proposing investments in HPC systems, technology and algorithms, a national ... [ Read More ]

[Infographic] The Emergence of Analytics & Big Data in Baseball


As the economics of baseball have changed, with higher salaries, increased player activity and growing public interest in game information, data collection has advanced from hand-coded historical models to the real-time capture of every movement and action. The use of advanced analytics has evolved from the part-time hobby of a few die-hard fans into a major business, one where advanced big data technology is becoming a necessity, not a rarity. Take a look at the infographic below for a stroll through the history of analytics and big data in baseball. ... [ Read More ]