Artificial Intelligence: Five Trends for 2018

As we start 2018 here at Cray, we believe artificial intelligence and, more specifically, deep learning will continue to dominate the emerging-technology conversation (I know some might argue blockchain technologies have already surpassed AI, but that's another conversation). Since we have several subject matter experts at Cray, I thought it might be a good idea to reach out to the Cray AI team and ask them, "What is a trend for AI or deep learning you expect in 2018?"

Rangan Sukumar, analytic architect in our CTO office:

Mathematical/algorithmic innovations will dictate hardware/system requirements: Algorithmic cleverness, mathematical approximations and sub-precision tweaks have enabled a 7-10x speed-up, mostly on convolutional neural networks. While another 10x may be hard to achieve in the coming year from software modifications alone, newer algorithms will provide similar speed-ups for GANs, LSTMs, RNNs and so forth, and together these advances will guide the requirements for hardware acceleration and scale-up into 2019.
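As a toy illustration of the sub-precision idea Rangan mentions (the tensor below is illustrative, not a Cray benchmark), here is a sketch of why halving numeric precision helps bandwidth-bound layers:

```python
import numpy as np

# A toy "activation" tensor such as a CNN layer might produce.
x32 = np.random.rand(64, 256, 256).astype(np.float32)
x16 = x32.astype(np.float16)  # sub-precision copy

# Half the bytes per element means half the memory traffic per pass,
# which is often the bottleneck in convolution-heavy workloads.
print(x32.nbytes // x16.nbytes)  # 2

# The precision cost is bounded: for values in [0, 1) the float16
# rounding error stays well under 1e-3.
print(float(np.max(np.abs(x32 - x16.astype(np.float32)))) < 1e-3)  # True
```

The same trade, fewer bits for more throughput, is what drives hardware support for half-precision arithmetic in modern accelerators.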

Aaron Vose, Cray AI R&D group:

AI will reach new domains as agriculture, government and others find new ways to generate value from data and AI. To quote Aaron, "This is the #1 most important thing for those in HPC to realize — machine learning is coming for traditional science domains outside of analytics and informatics. Machine learning with neural networks is coming to the physical sciences at scale, and this means high-performance computing. You need a net of size N^2 trained on M data items to find cats in 2D images, but you need a net of size N^3 with M^2 data items to find the binding of drug molecules to proteins. As ML goes into the physical sciences with 3D data, these nets are going to ‘blow up’ in terms of compute resources needed. These will be true HPC applications requiring true supercomputer scale."
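To get a feel for Aaron's scaling argument, here is a back-of-the-envelope sketch (the values of N and M below are hypothetical, not from the article):

```python
# Illustrative-only numbers for the scaling argument above: moving from
# 2D images to 3D physical-science data grows both the network and the
# dataset superlinearly.
N = 1_000        # hypothetical network "size" parameter
M = 1_000_000    # hypothetical number of training examples

cost_2d = N**2 * M       # net of size N^2 trained on M items (cats in 2D)
cost_3d = N**3 * M**2    # net of size N^3 trained on M^2 items (3D binding)

print(cost_3d // cost_2d)  # the 3D case is N * M = 1_000_000_000x bigger
```

Even with generous constants, a factor of N·M is what turns a workstation problem into a supercomputer problem.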

Mike Ringenburg, Cray AI R&D:

The rise of complex and hybrid workflows: The HPC community has been talking about the convergence of traditional simulation, big data analytics and artificial intelligence for several years. Until very recently, however, that convergence has mostly taken the form of single systems being used for both simulation and analytics, or occasionally combined workflows with a discrete simulation phase followed by a discrete analytics or machine learning phase (typically on data generated by the simulation). Now, however, we are starting to see the emergence of more complex workflow patterns, such as machine learning embedded within simulations (for example, neural networks replacing expensive portions of the simulation), machine learning guiding the progress of simulations, and ensembles of simulations used to generate training sets for machine learning algorithms. In 2018 we expect these more complex workflows to become increasingly important, and we expect new and interesting ways of combining traditional HPC techniques with AI and analytics to emerge.
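One of the patterns Mike describes, a learned model replacing an expensive portion of a simulation, can be sketched in miniature. In the sketch below the "expensive kernel" and the polynomial surrogate are hypothetical stand-ins; real workflows would use an actual physics kernel and a neural network:

```python
import numpy as np

# Stand-in for an expensive per-timestep simulation kernel
# (hypothetical; in practice this might be a chemistry or EOS solve).
def expensive_kernel(x):
    return np.sin(3 * x) * np.exp(-x)

# Offline: sample training data from the real kernel ...
xs = np.linspace(0.0, 2.0, 200)
ys = expensive_kernel(xs)

# ... and fit a cheap surrogate (a polynomial here; a neural network
# in the workflows described above).
coeffs = np.polyfit(xs, ys, deg=9)
surrogate = np.poly1d(coeffs)

# Online: the simulation calls the surrogate instead of the kernel.
x_query = np.array([0.3, 0.9, 1.5])
err = np.max(np.abs(surrogate(x_query) - expensive_kernel(x_query)))
print(err < 1e-2)  # True: the surrogate tracks the kernel on this range
```

The payoff in a real code is that the surrogate evaluates orders of magnitude faster than the kernel it replaces, at the cost of a bounded approximation error that has to be validated.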

Alessandro Rigazzi, in our EMEA Research Center:

Adaptation of deep learning algorithms to scientific needs. This includes, for example, uncertainty quantification (itself a growing field) of deep learning outputs. Uncertainty quantification has been the missing link between deep learning and the scientific community: a way to reason about AI results instead of treating them like a magic and bizarre black box.
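A minimal sketch of the uncertainty-quantification idea, using a bootstrap ensemble on a toy regression problem (the data below is synthetic; for deep networks the analogous techniques are deep ensembles or Monte Carlo dropout):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noisy data standing in for a learning task where we want a
# calibrated uncertainty on the prediction, not just a point estimate.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + rng.normal(0.0, 0.1, size=x.size)

# Bootstrap ensemble: each member is fit on a resampled dataset.
preds = []
for _ in range(100):
    idx = rng.integers(0, x.size, size=x.size)
    slope, intercept = np.polyfit(x[idx], y[idx], deg=1)
    preds.append(slope * 0.5 + intercept)   # predict at x = 0.5

mean = float(np.mean(preds))  # point prediction
std = float(np.std(preds))    # ensemble spread = uncertainty estimate

print(abs(mean - 1.0) < 0.1)  # True: close to the true value 2 * 0.5
print(std > 0.0)              # True: a nonzero error bar comes for free
```

The spread of the ensemble is exactly the kind of error bar that lets a scientist decide whether to trust a model's output.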

Geert Wenes, Solution Architect:

Non-parametric modeling – well established in machine learning – will go deeper and, in conjunction with Gaussian processes, will dramatically change the way neural network design and optimization is performed today. This will have architectural consequences: sparse approximations and approximate inference will herald a "regression to the processor mean" (i.e., x86/hyperscale rather than GPU-accelerated). In return, however, it will help demystify deep learning by bringing statistical rigor to it, and it will stimulate broad R&D and deployment of deep learning in the technical enterprise beyond the big five hyperscale companies (Facebook, Apple, Microsoft, Google and Amazon). 2018: DL meets Occam's razor!
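Geert's point about Gaussian processes can be made concrete with a minimal regression sketch (toy data; the RBF kernel and length scale below are arbitrary illustrative choices, not a production library):

```python
import numpy as np

# Minimal Gaussian-process regression in plain numpy. The data itself
# is the model: no parametric weights are trained -- the non-parametric
# part Geert refers to.
def rbf(a, b, ell=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

x_train = np.linspace(0.0, 2.0, 9)
y_train = np.sin(x_train)
x_test = np.array([0.875])          # a point between training samples

jitter = 1e-6                       # numerical stabilizer on the diagonal
K = rbf(x_train, x_train) + jitter * np.eye(x_train.size)
k_star = rbf(x_test, x_train)

# Closed-form posterior mean and variance -- the variance is the
# built-in uncertainty estimate that parametric nets lack by default.
alpha = np.linalg.solve(K, y_train)
mean = k_star @ alpha
var = rbf(x_test, x_test) - k_star @ np.linalg.solve(K, k_star.T)

print(abs(float(mean[0]) - np.sin(0.875)) < 0.05)  # True
```

The exact solve above is O(n^3) in the number of data points, which is precisely why the sparse approximations and approximate inference Geert mentions matter at scale.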

Thanks to everyone at Cray who responded to my call. It is certainly going to be an interesting year.



    John Thingstad says

    I believe the term artificial intelligence is outdated. It evolved in the 1960s, when Marvin Minsky was still trying to develop a chess computer. That is just a heuristic search: each move is assigned a value, and an alpha-beta cutoff is used to limit the number of moves evaluated.
    Marvin Minsky said, "Artificial intelligence is mimicking processes usually associated with cognition; it is not always or even usually simulating processes in the human brain."
    This has clearly changed. To my mind the correct term is ‘machine intelligence’.
    It may not be what we have yet, as we still struggle to emulate bees, but it is where we are headed.



    Informative article. One of the biggest challenges for artificial intelligence technologies has been understanding directives from the natural language human beings use.
