I recently wrote about the conversation our own Brad Chamberlain — project lead for our Chapel effort — had with Brian Dolan, chief scientist and co-founder of an artificial intelligence startup, Deep 6 AI, to understand their interest in the Chapel parallel programming language. What got Brian and Brad talking was Brian’s interest in and use of Chapel for some experimental work in AI. In this second post, I want to share some of Brian’s insights on the value of Chapel as a language for AI.
Deep 6 AI is an artificial intelligence company whose primary mission is to find more patients faster for clinical trials — an important part of drug development and drug discovery. In 2016, Deep 6 AI was chosen by the Cedars-Sinai Accelerator powered by Techstars, and since that time has signed several hospitals and contract research organizations. As Brian tells it, the Deep 6 AI software will accelerate the time it takes to run trials, cutting drug development costs significantly and getting life-saving cures to people faster.
In a nutshell, Brian and Deep 6 AI are developing algorithms that leverage concepts and techniques from different scientific disciplines, such as quantum mechanics and thermodynamics, to bring a unique solution to a difficult problem. Along the way, he tried several popular machine learning languages, including Python, to express the complex mathematics involved in their approach.
In doing so, Brian reached the point where Python and the community of packages available in the Python ecosystem fell short in several areas: expressing the underlying mathematics, properly handling datatype mismatches, and managing the complexity of running in parallel on multiple CPUs.
The search for a solution led Brian to look at the Chapel parallel programming language, designed to improve the programmability of large-scale parallel computer systems. As I wrote before, the primary goal of Chapel is “to increase the productivity of high-end computer users by making it easier to read, modify, port, tune and maintain parallel programs.” It came as no surprise to us that Deep 6 AI could be more productive using Chapel. Cray and the Chapel community developed the language expressly to minimize the complexity of parallel programming on massively parallel systems.
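To give a flavor of what that productivity looks like, here is a minimal sketch of Chapel’s data-parallel style. This is an illustrative example of the language, not code from Deep 6 AI’s application; the array, loop, and problem size are hypothetical.

```chapel
config const n = 1_000_000;    // problem size, overridable at run time via --n

var A: [1..n] real;            // an array of n real values

forall i in 1..n do            // a forall loop runs in parallel across cores,
  A[i] = sqrt(i:real);         // with no explicit threads or work partitioning

writeln("sum = ", + reduce A); // a parallel reduction over the array
```

The `forall` loop and `+ reduce` expression express the parallelism declaratively; the compiler and runtime handle scheduling the work across the available processors, which is exactly the kind of complexity Brian found himself managing by hand in Python.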
What did surprise us, as Brian explained, was that Chapel was also a good fit for an AI application.
AI applications Brian had previously worked on were almost exclusively data-intensive “with a little bit of compute.” The area Deep 6 AI is currently researching involves “the notion of graph entropy or crystal entropy” borrowed from quantum mechanics. Using Chapel, Deep 6 AI was able to dramatically improve the quality of the experiment and reduce their “time to science” by being able to code quickly, iterate on ideas and “actually make qualitative improvements.”
By using Chapel, Brian could leverage a language that hid the complexities of an application that had become not just data-intensive, but also compute- and communication-intensive.