Why NVIDIA's Acquisition of Run:ai Is a Great Move


By: Mary Jander


NVIDIA has closed its acquisition of Run:ai, a six-year-old startup that orchestrates GPU resources for AI workloads. While terms weren’t disclosed, Calcalist reports the price tag was approximately $700 million. As part of the deal, perhaps to appease regulators concerned about NVIDIA’s competitive profile, NVIDIA will make the Run:ai software open source.

The deal is significant on several fronts. First, it strengthens NVIDIA’s position as a GPU provider by ensuring customers can maximize the efficiency of its costly components. This should hold even though the Run:ai software will be open-sourced, since NVIDIA has had a head start by working with the startup since 2020. Run:ai technology is already incorporated in a range of NVIDIA products, including NVIDIA DGX and DGX SuperPOD, NVIDIA Base Command, NGC containers, and NVIDIA AI Enterprise.

By grabbing hold of technology that maximizes GPU efficiency, NVIDIA is also helping to quell concerns about its ability to meet demand for its chips, including Blackwell, as NVIDIA suppliers struggle to fulfill orders. In a somewhat counterintuitive twist, better utilization of each NVIDIA component could mean fewer GPUs are needed per workload, yet more are ordered overall as improved manageability drives broader adoption.

Another highlight of the deal is its emphasis on Kubernetes, the open-source management system for containerized applications that is at the foundation of cloud computing. Run:ai’s support for all variants of Kubernetes makes it possible for the technology to govern workload deployments by pooling resources, from fractions of GPUs to clusters of shared GPUs.
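For readers curious what this looks like in practice, here is a minimal Python sketch using the official Kubernetes client. It submits a pod that asks a GPU-aware scheduler for a fraction of a single GPU via an annotation, which is the general pattern Run:ai-style schedulers use to pool and share GPUs on top of standard Kubernetes. The annotation key, scheduler name, and container image here are illustrative assumptions, not confirmed details from NVIDIA or Run:ai.

```python
# Illustrative only: submit a pod requesting half of one GPU from a
# GPU-pooling scheduler. Annotation key, scheduler name, and image are
# assumptions for demonstration purposes.
from kubernetes import client, config


def submit_fractional_gpu_pod(namespace: str = "default") -> None:
    config.load_kube_config()  # uses your local kubeconfig credentials

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(
            name="demo-fractional-gpu",
            annotations={"gpu-fraction": "0.5"},  # hypothetical fractional-GPU request
        ),
        spec=client.V1PodSpec(
            scheduler_name="runai-scheduler",  # assumed name of the GPU-pooling scheduler
            restart_policy="Never",
            containers=[
                client.V1Container(
                    name="trainer",
                    image="nvcr.io/nvidia/pytorch:24.01-py3",  # example training image
                    command=["python", "-c", "print('training job placeholder')"],
                )
            ],
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)


if __name__ == "__main__":
    submit_fractional_gpu_pod()
```

The point of the pattern is that the workload itself stays a plain Kubernetes pod; the scheduler decides how to carve up or share physical GPUs behind the scenes.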

Run:ai Will Maintain Business As Usual

Run:ai was founded in 2018 by Omri Geller (now CEO) and Ronen Dar (now CTO), who met while pursuing advanced engineering degrees at Tel Aviv University. Both cofounders emphasized in a blog post that the company will continue to operate independently:

“As part of NVIDIA, we are eager to build on the achievements we’ve obtained until now, expand our talented team, and grow our product and market reach. While our colors will change from pink to green, our commitment to our customers, partners, and the broader AI ecosystem remains unwavering.”

Run:ai has 156 employees, according to LinkedIn, the majority of whom are based in Israel, though the company also has a New York City headquarters. The company has raised $118 million across three funding rounds since its founding. Key investors include Tiger Global Management, Insight Partners, TLV Partners, and S Capital VC.

Run:ai cofounders Omri Geller (left) and Ronen Dar. Source: Run:ai

Why Independence Matters

Run:ai’s ability to function independently, albeit with NVIDIA oversight, is at the heart of this deal. One aspect has to do with competition regulation: In the summer of 2024, NVIDIA faced inquiries by the U.S. Department of Justice regarding whether its acquisition of Run:ai would stifle competition from other companies. Apparently, the upshot has been the open-sourcing of Run:ai software, which levels the playing field a bit.

It's also significant that Run:ai will maintain its partnerships, which, beyond NVIDIA, include strategic alliances with AWS, VMware Tanzu, WEKA, VAST Data, HPE, Dell, and NetApp, among others. Notably absent are direct NVIDIA GPU competitors such as Google and AMD, though Amazon fields its own Trainium and Inferentia chips.

Perhaps most notable are the data management and storage vendors on Run:ai’s partnership roster. WEKA and VAST Data, for instance, compete in unstructured data management and storage, while HPE, Dell, and NetApp offer virtual and physical storage solutions. These partnerships point to the vital role data preparation and management plays in GPU-based AI factories.

NVIDIA shares rose on the news and were trading at $139.43, up $2.42 (1.77%), the afternoon following the announcement.

Futuriom Take: Run:ai already contributes essential GPU management to NVIDIA’s products, as well as to those of key partners of both companies. NVIDIA’s head start on implementing Run:ai solutions could alleviate any competition arising from open-sourcing Run:ai’s code.