AI Blockbuster: Nvidia Buys Run:ai For $700M

‘Run:ai has been a close collaborator with Nvidia since 2020 and we share a passion for helping our customers make the most of their infrastructure,’ says Run:ai CEO Omri Geller.

AI chip superstar Nvidia is doubling down on becoming the dominant force in the artificial intelligence hardware market by purchasing AI infrastructure management startup Run:ai for $700 million.

Run:ai, which was honored on CRN’s 2024 AI 100 list, provides Kubernetes-based workload management and orchestration software that Nvidia plans to leverage to drive its DGX Cloud business.

“Run:ai has been a close collaborator with Nvidia since 2020 and we share a passion for helping our customers make the most of their infrastructure,” said Run:ai co-founder and CEO Omri Geller in a blog post. “We’re thrilled to join Nvidia and look forward to continuing our journey together.”

[Related: Ex-Nvidia, Apple And Intel Engineers Launch AI Startup FlexAI With $30M Backing]

The Israel-based AI startup has built an open platform on Kubernetes, which is the orchestration layer for AI and cloud infrastructure. It supports all popular Kubernetes variants and integrates with third-party AI tools and frameworks, said Nvidia’s Alexis Bjorlin.
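For context on the Kubernetes layer Run:ai builds on: a GPU workload on a stock cluster is typically a pod that requests GPUs through the `nvidia.com/gpu` extended resource, which the scheduler uses to place it on a node with free accelerators. The sketch below uses the official Kubernetes Python client; the pod name, container image tag and training command are illustrative, and it shows plain Kubernetes rather than Run:ai’s own scheduler or APIs.

```python
from kubernetes import client, config

# Load cluster credentials from the local kubeconfig (~/.kube/config).
config.load_kube_config()

# Illustrative pod that asks the scheduler for one Nvidia GPU via the
# standard extended resource exposed by the NVIDIA device plugin.
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-training-job", labels={"team": "research"}),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="nvcr.io/nvidia/pytorch:24.03-py3",  # illustrative NGC image tag
                command=["python", "train.py"],            # hypothetical entry point
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

# Submit the pod; Kubernetes handles placement onto a GPU node.
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```

Orchestration layers such as Run:ai sit above this primitive, pooling GPUs across many such workloads instead of leaving each request tied to a single static allocation.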

“Run:ai enables enterprise customers to manage and optimize their compute infrastructure, whether on-premises, in the cloud or in hybrid environments,” said Bjorlin, vice president and general manager for DGX Cloud at Nvidia, in a statement Wednesday.

Nvidia’s DGX Plan For Run:ai

Nvidia will continue to offer Run:ai’s products under the same business model for the immediate future.

Santa Clara, Calif.-based Nvidia said it will continue to invest in Run:ai’s product road map as part of Nvidia DGX Cloud, which is an AI platform co-engineered with leading clouds, offering a full-stack service optimized for generative AI.

Nvidia plans to leverage Run:ai’s customer base, which includes some of the world’s largest enterprises across multiple industries that use its platform to manage data-center-scale GPU clusters.

Nvidia DGX and DGX Cloud customers will gain access to Run:ai’s capabilities for their AI workloads, particularly for large language model deployments. Run:ai’s offerings are already integrated with Nvidia DGX, Nvidia DGX SuperPod, Nvidia Base Command, NGC containers and Nvidia AI Enterprise software, among other products.

Run:ai’s AI-cluster management platform helps customers speed up development, scale AI infrastructure and lower compute costs, while its AI workflow management offering helps data teams work with the tools of their choice and quickly spin up a working environment.
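The utilization and cost claims ultimately come down to scheduling: pooling a cluster’s GPUs and packing jobs onto them under per-team policies rather than leaving cards idle behind static assignments. The toy Python sketch below illustrates that general idea with a first-fit allocator and per-team quotas; it is a simplification for illustration only, not Run:ai’s actual scheduler, and every name and number in it is made up.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    total_gpus: int
    used_gpus: int = 0

    @property
    def free_gpus(self) -> int:
        return self.total_gpus - self.used_gpus

@dataclass
class Job:
    name: str
    team: str
    gpus: int

def schedule(jobs: list[Job], nodes: list[Node], quotas: dict[str, int]) -> dict[str, str]:
    """First-fit placement with per-team GPU quotas (illustrative only)."""
    used_by_team: dict[str, int] = {}
    placements: dict[str, str] = {}
    for job in jobs:
        # Skip jobs that would push their team past its GPU quota.
        if used_by_team.get(job.team, 0) + job.gpus > quotas.get(job.team, 0):
            continue
        # Place the job on the first node with enough free GPUs.
        for node in nodes:
            if node.free_gpus >= job.gpus:
                node.used_gpus += job.gpus
                used_by_team[job.team] = used_by_team.get(job.team, 0) + job.gpus
                placements[job.name] = node.name
                break
    return placements

if __name__ == "__main__":
    nodes = [Node("dgx-a", 8), Node("dgx-b", 8)]
    jobs = [Job("llm-finetune", "research", 4), Job("eval", "research", 2), Job("etl", "data", 1)]
    print(schedule(jobs, nodes, quotas={"research": 6, "data": 2}))
```

Production schedulers add far more, such as preemption, fractional GPU sharing and fairness across teams, but the core trade-off they automate is the same one sketched here.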

“Together with Run:ai, Nvidia will enable customers to have a single fabric that accesses GPU solutions anywhere,” said Bjorlin. “Customers can expect to benefit from better GPU utilization, improved management of GPU infrastructure and greater flexibility from the open architecture.”

Nvidia did not disclose when it expects to finalize the purchase of Run:ai.