NetApp, Intel Partner On AIPod Mini To ‘Democratize’ Enterprise AI Inferencing
‘We want to make it affordable, easy to deploy, and to certainly scale out on inferencing. The key design point I’d say is that it’s simple to deploy. It requires no specialized data science expertise. And it’s easy to set up. So if I was a customer, this means that I can focus on using AI instead of building it,’ says Jenni Flinders, senior vice president of NetApp’s worldwide partner organization.
NetApp Tuesday unveiled a new converged infrastructure offering developed in conjunction with Intel aimed at addressing the cost and complexity related to the enterprise adoption of AI inferencing.
The new technology, known as the NetApp AIPod Mini, addresses the unique challenges that businesses face when deploying AI, such as cost and complexity at the department and team level, said Jenni Flinders, senior vice president of NetApp’s worldwide partner organization.
This is a first for NetApp, Flinders told CRN.
“This is a new pre-defined, full-stack AI solution specifically tuned for inferencing,” she said. “Why inferencing? Inferencing is where the true AI value is created. And that’s where we have the potential to help enterprises turn a lot of the proprietary data that we see out there into a competitive advantage.”
The launch of the NetApp AIPod Mini will help enterprises enhance their efficiencies and drive true data-based decision making, Flinders said.
“Right now, if you think about it, the whole AI inferencing market is wide open for disruption and growth,” she said. “This market segment is really underserved. So for customers that are moving directly to this value phase of AI, which means they’re looking to buy an AI solution instead of building one from scratch, we’re looking at how we help customers really drive AI adoption beyond just core data scientists to include departmental and branch offices in any use case, including departmental RAG (retrieval-augmented generation) and enterprise knowledge management.”
Flinders cited a few example workloads.
“From a legal standpoint, the solution can support automation around document drafting and research, for example, to help that vertical become more efficient by producing better work faster,” she said. “From a retail standpoint, teams can use this solution to implement personalized shopping experiences. I love personalized shopping experiences. And it also offers dynamic pricing to create a better customer experience. And when it comes to manufacturing, if users can optimize predictive maintenance and supply chains, that will help reduce costs and increase production.”
The goal of the NetApp AIPod Mini is democratization of enterprise AI, Flinders said.
“We want to make it affordable, easy to deploy, and to certainly scale out on inferencing,” she said. “The key design point I’d say is that it’s simple to deploy. It requires no specialized data science expertise. And it’s easy to set up. So if I was a customer, this means that I can focus on using AI instead of building it.”
The NetApp AIPod Mini is going to enable businesses to interact directly with their business data, Flinders said.
“This is combining generative AI with proprietary information to deliver precise, context-aware insights, which is going to be really important,” she said.
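The pattern Flinders describes is retrieval-augmented generation: fetch the most relevant pieces of a company’s own documents, then hand them to a language model alongside the user’s question so the answer is grounded in proprietary data. The sketch below is a generic, self-contained illustration of that flow in Python; the toy word-overlap retriever and the stubbed generate() function are stand-ins, not NetApp’s or OPEA’s actual components.

```python
# Illustrative sketch of retrieval-augmented generation (RAG):
# retrieve the most relevant internal documents, then ground the
# model's answer in them. The toy scoring function and the stubbed
# model stand in for real embedding and inference services.
import math
from collections import Counter

DOCUMENTS = [
    "Maintenance log: conveyor belt 7 bearing replaced after vibration alert.",
    "Pricing policy: seasonal discounts may not exceed 15 percent without approval.",
    "Legal template: standard NDA clauses for supplier onboarding.",
]

def score(query: str, doc: str) -> float:
    """Cosine similarity over word counts -- a stand-in for a vector index."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    overlap = sum(q[w] * d[w] for w in q)
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(sum(v * v for v in d.values()))
    return overlap / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    return sorted(DOCUMENTS, key=lambda doc: score(query, doc), reverse=True)[:k]

def generate(prompt: str) -> str:
    """Stub for the inference service; a deployed stack would call an LLM here."""
    return f"[model answer grounded in]\n{prompt}"

question = "What is the maximum seasonal discount?"
context = "\n".join(retrieve(question))
print(generate(f"Context:\n{context}\n\nQuestion: {question}"))
```

In a real deployment, the retrieval and generation stages would be backed by the stack’s embedding and LLM services; only the overall flow carries over from this toy version.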
The NetApp AIPod Mini is a reference design that is tested and validated, and is customizable by NetApp channel partners, Flinders said. At the base is NetApp’s ONTAP storage software platform, which provides security, reliability, and manageability, she said.
It consists of four components: compute, storage, networking, and software running on the compute layer, Flinders said. The compute can come from any server vendor and is built on Intel Xeon 6 processors. On the storage side, it includes the NetApp AFF A20 all-flash array. For networking, customers have a choice of providers. The software layer is the Open Platform for Enterprise AI (OPEA), she said.
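For a sense of how an application might consume such a stack once it is deployed, here is a hedged client-side sketch. It assumes the inference service exposes an OpenAI-style chat endpoint over HTTP, a common convention for self-hosted inference servers; the hostname, port, path, and model name are all placeholders, not documented AIPod Mini values.

```python
# Hypothetical client call against a self-hosted inference endpoint.
# The URL, path, and model name are placeholders: many self-hosted
# stacks expose an OpenAI-compatible /v1/chat/completions route, but
# the actual AIPod Mini / OPEA deployment defines its own API.
import json
import urllib.request

ENDPOINT = "http://aipod-mini.internal:8000/v1/chat/completions"  # placeholder host and path

payload = {
    "model": "local-model",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Summarize yesterday's maintenance alerts."}
    ],
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
print(body["choices"][0]["message"]["content"])
```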
The components can be integrated by partners in the field, she said.
“The NetApp AIPod Mini with Intel is going to give our partners a clear and easy way to tap into the opportunity that’s been created with AI innovation,” she said. “This is a piece that’s untapped, and we’re giving them something to go after. NetApp is using a meet-in-the-channel model for this, so these products and solutions will be available through distribution and through our usual channel partners. But specifically for our partners, and here’s the game changer: it’s going to help them accelerate time to revenue. It’s going to provide them with a pre-integrated AI solution stack. And this is the part that I love about this: It’s going to help partners expand services and margin through attached consulting, integration, and managed services opportunities. So we’re really opening it up to partners to go beyond just the AIPod Mini itself to offer packaged solutions around it.”
