High-performance computing, data analytics and artificial intelligence are converging, and that’s good news for today’s enterprises.
When people talk about high-performance computing, data analytics and artificial intelligence, they tend to treat this trio of technologies as three separate entities, each living in its own world. While that’s true to some extent, this view of disparate technologies misses the digital-transformation forest for the technology trees. That’s because these three complementary technologies are rapidly converging, and today it’s hard to see where one ends and another begins.
If HPC, data analytics and AI are so different, then why were all of the HPC customers I talked to at SC18 already doing AI, in addition to data analytics? These organizations have provided, and will continue to provide, infrastructure and services for a wide variety of workloads in both research and industry.
HPC, data analytics and AI are all technologies designed to unlock the value of data. And they are all converging as enterprises come to understand that analytics and AI are essential tools for solving big-data problems that require the powerful, scalable compute, networking and storage provided by high-performance computing.
Formerly the domain of specialists using expensive, proprietary supercomputers, HPC has entered a new era. Thanks to amazing advances in compute, networking and storage technologies, high-performance computing capabilities — and by extension data analytics and AI — are available to organizations using small clusters, workstations and even cell phones. While this changes the game for some of the more traditional applications of HPC in academic and government institutions, it also puts AI within reach for a much wider range of use cases.
So what does HPC mean in this new era?
There was a time when many people thought HPC was synonymous with visualization, modeling and simulation. And in some ways, it still is. Don’t you want to visualize your data? In AI, we’re creating and training models so machines can simulate human behavior. Hmm… It seems like we’re speaking the same language, or at least focusing on some of the same things, when we talk about AI and HPC.
And then there’s the big data side of the story. Isn’t all of HPC dealing with big data? I mean who wants to visualize a small amount of data or model something that’s, well, statistically insignificant? Maybe for fun. By its very nature, HPC means big data.
And that brings us back to data analytics. When it comes to data, aren’t we all looking at and analyzing data to get the next insight, to make the next discovery? Insight is at the heart of data analytics, and discovery is at the heart of HPC. If you discover a new galaxy, do you gain insights? Of course you do. Discovery and insight go hand in hand, just like HPC, data analytics and AI.
Let’s get down to business.
The convergence of HPC, analytics and AI is a big step forward for today’s businesses — and by that, I mean enterprises of all sizes, institutions, labs and universities. Oh, you think universities aren’t businesses? Well, have you paid your kids’ tuition bill lately? You bet academic institutions are in business, and most of them run Microsoft® and VMware® environments alongside research computing clusters.
Universities around the world are partnering with industry and government to use leading-edge technology — including HPC, analytics and AI — to drive discovery and innovation. For example, the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign maintains many close industry partnerships. Boeing, Caterpillar, Deere, Dow, GE, P&G and Rolls-Royce are all members of NCSA’s Industry Program, which has served nearly 60 percent of the manufacturers in the Fortune 100.[1]
The University of Michigan supports a great deal of research focused on the automotive and transportation industries. This work includes support for research carried out at the Mcity initiative, a one-of-a-kind urban test facility. At Mcity, industry, government and academia come together to improve transportation safety, sustainability and accessibility for the benefit of society.
Meanwhile, the University of Cambridge is using Hadoop and OpenStack to make self-service AI capabilities available to research teams and private industry. And then, of course, enterprises like Mastercard are using AI and powerful HPC systems to protect customers from fraud, while startups like ZIFF Inc. are putting AI to work to gain insight from unstructured image, audio and video data.
In the end, they’re all providing services that leverage the convergence of HPC, data analytics and AI. Along the way, these businesses are contributing to human progress, helping to shape a future that will increasingly weave artificial intelligence, or at least machine learning, into our everyday lives.
[1] National Center for Supercomputing Applications, “NCSA Plays Key Role in Digital Manufacturing Lab,” accessed February 2, 2019.