Learning About Learning: Artificial Intelligence, Machine Learning & Deep Learning

Artificial intelligence (AI), machine learning (ML) and deep learning (DL): these terms crop up often online and in the popular and technical media. Depending on whom you ask, they either represent a tidal wave that will overwhelm every industry and company that doesn’t adapt quickly, or a more incremental innovation in IT. They may usher in new capabilities like speech recognition and image analysis, but at the same time they may also be a little overhyped today.

Which of these scenarios is closer to reality? And while we’re at it, what — if anything — are the differences between AI, machine learning and deep learning? And why does any of this matter to someone who’s otherwise interested in high-performance computing and IT at the cutting edge?

To answer the final question first, it matters a great deal if you lend any credence to the industry reports that see AI as a revolution in computing. It’s arguably at least as important as parallel processing or even supercomputing. Some go further and compare AI to electricity — an utterly transformative technology that not only upends the way the world does business, it alters the very fabric of society.

In his new book, The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity, futurist Byron Reese takes a bullish view on AI and its cousins machine learning and deep learning (whose differentiating features we’ll come to in a moment). “A few centuries ago, we systematized science and created unimagined prosperity,” he writes. “A few decades ago, we began making mechanical brains [via AI], and just a few years ago, we learned dramatic new ways to make those brains more powerful [via machine learning and deep learning]. It feels as if this very moment is the great inflection point of history.”

However, says Paul Hahn, analytics and AI market manager at Cray, another approach might be to compare AI not to electricity but rather to the advent of the mobile phone.

“Over the long term, maybe ten to twenty years, AI is going to have a disruptive and transformative effect on work and cause displacement,” he says. “I’m hard-pressed to think of industries where it may not have some impact. But I think that in the nearer term, it’s going to change the tools people use for work in every industry — though maybe not the nature of work itself. I think it’s more akin to mobile networks. I think electricity changed the entire standard of living for people who had it. AI, I don’t think, will change standards of living.”

Of the three — artificial intelligence, machine learning and deep learning — AI has the longest history, says Arti Garg, emerging market and technology director at Cray.

“These terms get used really broadly,” she says of AI, ML and DL. But a productive way of visualizing the field, she says, is to imagine a small circle nested inside a slightly larger circle, which is itself nested within an even larger circle.

The largest of those circles is AI. “AI is anything using a machine to do something that human cognition might do,” she says.

“I don’t think people should spend much time worrying about the term AI in general,” says Hahn. “It’s such a broad field. And there are lots of folks focused on advancing the state of the art in AI. … Of more practical interest are two subsets of AI. One is machine learning — which picks up where statistics leaves off. It’s the notion of having a large body of data and using the machine to learn from the data to make predictions. The other is deep learning, which is a subset of machine learning … where you learn to make predictions based on experience.”

Think of machine learning, in other words, as a way of enabling a computer to see patterns in big data sets and then to apply those patterns to new data to draw inferences and act on those inferences.
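To make that concrete, here is a minimal sketch in Python of the idea: a tiny one-nearest-neighbor classifier that “learns” from a handful of made-up labeled examples and then labels a point it has never seen. The data and the method are purely illustrative stand-ins for the much larger models and data sets used in practice.

    # Learn a pattern from labeled examples, then apply it to new data.
    # Data and method are illustrative only.

    def nearest_neighbor_predict(training_data, new_point):
        """Return the label of the training example closest to new_point."""
        def distance(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        closest = min(training_data, key=lambda example: distance(example[0], new_point))
        return closest[1]

    # Past observations: (features, label) pairs the machine "learns" from.
    training_data = [
        ((1.0, 1.0), "cat"),
        ((9.0, 8.0), "dog"),
        ((1.5, 0.5), "cat"),
    ]

    # Inference: apply the learned pattern to a point we have not seen before.
    print(nearest_neighbor_predict(training_data, (8.5, 9.0)))  # -> dog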

Deep learning, in turn, relies on simulations of the way human brain cells work (a.k.a. neural networks) as one way of implementing machine learning.

In fact, says Aaron Vose, a Cray software engineer who’s previously written about machine learning and HPC, the term “deep learning” is a little misleading. Deep learning isn’t so much deep as it is neural-network based. “Just think ‘neural networks,’” Vose says, “as pretty much all neural networks worth mentioning are ‘deep.’”
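For readers who want a picture of what a neural network actually computes, the following toy sketch in plain Python stacks two simple layers, each taking a weighted sum of its inputs and passing it through a nonlinearity. The weights here are hard-coded placeholders rather than learned values; real networks differ mainly in scale, with many more layers, units and learned weights.

    import math

    # Toy network: each unit computes a weighted sum of its inputs, then a
    # nonlinearity. The weights are placeholders; a real network would
    # learn them from data during training.

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def layer(inputs, weights, biases):
        """One layer: each output is sigmoid(weighted sum of inputs + bias)."""
        return [
            sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)
        ]

    # Stacking layers is what makes a network "deep."
    hidden = layer([0.5, -1.2], weights=[[0.8, 0.2], [-0.4, 0.9]], biases=[0.1, -0.3])
    output = layer(hidden, weights=[[1.5, -0.7]], biases=[0.05])
    print(output)  # a single value between 0 and 1, e.g. a class probability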

At a practical level, Garg notes, deep learning is the most computationally intensive of the three, because it is often applied to extremely large data sets that people might not even think of as “data.” Images, video, sound and free-form text are typical inputs in a deep learning setting.

Autonomous cars are a now-classic deep learning use case. Recommender engines on e-commerce or online video and music platforms, by contrast, are more typically machine learning, because the input to the algorithm already exists as traditional data files (e.g., Person A bought items X, Y and Z; Person B bought…) and doesn’t need as much neural network smarts to be processed.
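As a rough illustration of that tabular flavor of machine learning, the sketch below, built on invented purchase histories, recommends items that people with overlapping purchases already own; no neural network is required.

    from collections import Counter

    # Invented purchase histories in the "Person A bought X, Y, Z" style above.
    purchases = {
        "Person A": {"X", "Y", "Z"},
        "Person B": {"X", "Z"},
        "Person C": {"Y", "W"},
    }

    def recommend(person, purchases):
        """Suggest items owned by people whose purchases overlap with this person's."""
        owned = purchases[person]
        counts = Counter()
        for other, items in purchases.items():
            if other != person and owned & items:  # any overlap in purchase history
                counts.update(items - owned)       # tally items this person lacks
        return [item for item, _ in counts.most_common()]

    print(recommend("Person B", purchases))  # -> ['Y']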

Deep learning systems, in particular, must be “trained.” That means running the neural network simulations on a broad range of input — for example, teaching an autonomous driving program about traffic cones by feeding it millions of images and videos of cars driving around traffic cones in typical driving situations.
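The loop below is a deliberately stripped-down illustration of that training process, fitting a single weight to three made-up number pairs rather than millions of images. The principle is the same: show the model examples, measure its error, and nudge its parameters to shrink that error.

    # Fit a single weight w so that prediction = w * x, by repeatedly
    # measuring the error on each example and nudging w to reduce it.
    # Real deep learning training does the same across millions of weights
    # and images rather than three number pairs.

    examples = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]  # (input, desired output)

    w = 0.0              # the model's single adjustable parameter
    learning_rate = 0.05

    for epoch in range(200):
        for x, target in examples:
            error = w * x - target
            w -= learning_rate * error * x  # gradient step for squared error

    print(round(w, 2))  # close to 2.0, the pattern the examples encode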

“Preparing data is often the biggest bottleneck for organizations,” Garg says. “So we would like to provide a system that has compute and storage that allows you to prepare your data and then send it to training without having to physically move the data from one system to another. That’s how a lot of organizations have to do deep learning today.”

From making weather forecasts by the minute to finding smarter ways of discovering oil and gas deposits to streamlining the car insurance adjustment process via deep learning assistants, expect much more from these three nested realms of AI in the years ahead.

Hahn says AI, machine learning and deep learning will only continue to fan out further and more broadly across industries and use cases.

“It’s hard to imagine any application that won’t have some form of AI embedded in it,” Hahn says. “It’s going to be everywhere.”

Learn more about artificial intelligence, machine learning and deep learning on the Cray website.
