New Relic Looks To Speed AI Adoption With Nvidia NIM Integration

By linking New Relic's AI monitoring tools – part of the New Relic Observability Platform – with Nvidia NIM, organizations can gain visibility into the “black box” operations of an AI application stack and improve AI operational performance and ROI.

New Relic has integrated its observability software with Nvidia’s NIM inference microservices, making it possible to use New Relic’s AI monitoring tools to gain “comprehensive visibility” into generative AI applications built for the Nvidia AI stack.

New Relic says the newly completed integration will help organizations reduce the complexity and costs of developing, deploying and monitoring GenAI applications, helping them accelerate AI adoption and achieve faster ROI.

“It’s all about managing quality and performance in building these applications,” said Jemiah Sius, senior director of developer relations at New Relic, in an interview with CRN.


Nvidia NIM microservices, launched in March, provide a way for businesses to build and deploy generative AI copilots and custom applications on the Nvidia platform. The goal was to “lower the barrier to entry” for developing AI software, Sius said.
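For context, NIM microservices expose OpenAI-compatible inference endpoints for the LLMs they serve, so an application can query a self-hosted model with a few lines of code. The sketch below is illustrative only: the endpoint URL and model name are assumptions and will vary with the deployment.

```python
# Minimal sketch: querying a self-hosted NIM microservice through its
# OpenAI-compatible API. The endpoint URL and model name are assumptions
# for illustration; substitute the values of your own NIM deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local NIM endpoint
    api_key="not-used",                   # local NIM deployments typically don't require a key
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",      # example model served by the NIM container
    messages=[{"role": "user", "content": "Summarize our Q2 support tickets."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```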

But Sius noted that developers still face challenges in building AI software and the large language models that support it. Developers need to ensure that AI applications protect data security and adhere to privacy policies, for example, and meet customer performance and satisfaction requirements. And generative AI applications are still prone to providing incorrect results or “hallucinations” – a problem that can reduce trust in AI systems.

“It’s kind of a black box on performance. AI applications don’t operate in the same way that a traditional or microservices-based application does,” Sius said.

The New Relic-Nvidia NIM integration is the latest step in the evolution of the two companies’ relationship. In March, New Relic was among the first companies listed in Nvidia’s AIOps partner ecosystem.

The integration of Nvidia NIM with New Relic AI monitoring, part of the company’s observability platform, provides “a holistic, real-time view of the AI application stack across services, Nvidia GPU-based infrastructure and the AI layer,” New Relic said in a press release.

New Relic AI monitoring provides a comprehensive view of the AI stack, along with key metrics on throughput, latency and costs, while ensuring data privacy, according to the company. It also traces request flows across services and models to reveal the inner workings of AI apps.
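As a rough illustration of what that instrumentation looks like in practice, the sketch below wraps an LLM call in a New Relic transaction using the company's Python agent. The config file contents, application name, endpoint and model shown are assumptions for illustration; the exact AI-monitoring settings are documented by New Relic.

```python
# Rough sketch: wiring the New Relic Python agent around an LLM call so the
# request is captured as a transaction. Paths, names and the AI-monitoring
# toggle shown here are assumptions for illustration.
import newrelic.agent

# newrelic.ini would hold the license key, app name and (per New Relic's
# docs) an AI-monitoring switch such as: ai_monitoring.enabled = true
newrelic.agent.initialize("newrelic.ini")

from openai import OpenAI  # imported after the agent so calls can be instrumented

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")  # hypothetical NIM endpoint

@newrelic.agent.background_task(name="nim-chat-demo")
def ask(prompt: str) -> str:
    # Calls made inside the transaction surface in New Relic with
    # throughput, latency and usage metrics attached.
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Summarize our Q2 support tickets."))
```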

The integration of New Relic monitoring with Nvidia NIM supports a wide range of AI models including Databricks DBRX, Google's Gemma, Meta’s Llama 3, Microsoft's Phi-3, Mistral Large and Mixtral 8x22B, and Snowflake's Arctic, the company said.

“With this integration, we’re going to provide that mission-critical observability data that you need to do things like manage costs, to deploy [AI applications] effectively, to understand the performance of your models, to give you visibility into your full tech stack and show you what consumption usage looks like, what the entire trace in requests looks like, so you can understand how to better provide a customer experience that is of high quality,” Sius said.

Key features of the New Relic-Nvidia integration include full stack visibility, deep trace insights for every response, model inventory, model comparison, deep GPU insights and enhanced data security.

Sius said New Relic’s partners, including systems integrators and solution providers, will benefit from the New Relic-Nvidia NIM integration.

“This integration is going to help our partners and our customers get to faster adoption of AI, help them get out to the market quicker,” he said. “It provides this simplified setup with NIM. You’re helping to ensure data security. There are benefits for partners…the same benefit if it was a large-scale enterprise company with their internal engineers developing, let's say, an LLM [large language model] or a copilot. There's going to be benefits on both sides.”

“In today’s hypercompetitive market, organizations cannot afford to wait years for AI ROI,” New Relic CEO Ashan Willy said in a statement. “Observability solves this by providing visibility across the AI stack. We are pioneering AI observability by extending our platform to include AI apps built with Nvidia NIM. Combining Nvidia’s AI technology with our expertise in observability and APM [application performance management] gives enterprises a competitive edge in the AI race.”