For Some Partners, AI Turbocharges Private Cloud Business

'There is a certain amount of cloud repatriation that is happening, and that's happening purely from a cost perspective,' Cognizant’s Naveen Sharma tells CRN in an interview.

Naveen Sharma compares the classic public cloud vendor business model to buying wine. The seller is happy for a customer to try a glass, and then maybe another, even though buying the bottle would probably have been the better deal for the connoisseur’s wallet.

But now that the world is about 20 years into the cloud era, some customers are turning up their noses at the consume-on-demand public cloud model in favor of private cloud and on-premises infrastructure, Sharma – senior vice president and global head of AI and analytics at Teaneck, N.J.-based Cognizant – told CRN in an interview.

And the generative artificial intelligence (GenAI) revolution has made customers even more mindful of which IT investments will make the most sense for performance and budget – especially as AI vendors continue to experiment with business models ranging from per-seat to consumption-based pricing.

“That cost pressure is absolutely real,” said Sharma, whose company ranks No. 8 on CRN’s 2024 Solution Provider 500. “There is a certain amount of cloud repatriation that is happening, and that's happening purely from a cost perspective.”

[RELATED: Nvidia CEO Jensen Huang: HPE-Nvidia Is A ‘Massive Partnership,’ Exits Sphere Stage With A ‘Go HPE!’]

AI Private Cloud Business

Flexera’s 2024 “State of the Cloud” report, which surveyed over 750 IT professionals and executive leaders mostly representing enterprise companies headquartered in the U.S., found that public cloud spend was over budget by an average of 15 percent.

But a growing interest in private cloud does not spell the death of public cloud by any means. In June, market research firm IDC reported that spending on shared, public cloud infrastructure reached $26.3 billion in the first quarter of 2024, increasing 44 percent year over year – compared to dedicated, private cloud infrastructure growth of 15.3 percent year over year to $6.7 billion.

The solution providers who spoke with CRN said that AI has boosted their overall cloud businesses – with innovations from hardware vendors, economics and other factors sometimes tipping customers toward making private cloud and on-premises data centers part of the IT environment instead of relying solely on public cloud.

“This renewed interest in cloud, this renewed interest in data center – it's really blossoming into a renewed interest in IT,” Neil Anderson, vice president of cloud, infrastructure and AI solutions at Maryland Heights, Mo.-based World Wide Technology – No. 7 on CRN’s 2024 Solution Provider 500 – told CRN in an interview.

Flexera’s report found that respondents nonetheless remain bullish on cloud, with 31 percent expecting spend to increase in the next 12 months. Estimated wasted cloud spend in infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) has also been trending down after reaching a high of 32 percent in 2022.

The Argument For Private Cloud

Anderson said that WWT has seen “a resurgence in private data center spend because customers are absolutely determined – if I'm going to run AI against my most sensitive intellectual property, I'm going to run it on prem in my own data center. I want to control it.”

Some customers have already spent a lot of money on legacy technologies and don’t feel that they have achieved full return on investment (ROI), Carm Taglienti, chief data officer and data and AI portfolio director at Chandler, Ariz.-based Insight – a member of CRN’s 2024 MSP 500 – told CRN in an interview.

“You have the buildings. You have the racks. You have the investments in software licensing that you've already done,” he said. “So it's like, do you just continue down that path? … Not everything is a blank canvas at the end of the day.”

Private cloud also gives customers more control over performance, security and service-level agreements (SLAs). And for an AI project, a customer may not want to – or may not be able to – move data from where it resides.

“Some organizations really look at it and say, ‘We're spending so much money on infrastructure-as-a-service with companies like Amazon that we want to basically just provide those services ourselves,’” he said. “I wouldn't say a lot of companies are doing it, but companies that are big enough and they can really see value, they can do that, too.”

Cognizant’s Sharma said that customers interested in using AI with sensitive data – life sciences firms trying to protect proprietary drug discovery processes and financial services clients working on fraud detection, for example – might need to pursue private cloud because of regulation or liability.

Customers have become cautious about scaling technology projects, thanks to surprise cloud bills in the past and the growing cost of cloud.

“One of the things that the cloud promised – if you go back 10 or 15 years ago – was, not only do you get unlimited scaling, but you only pay for what you use,” Sharma said. “Companies very quickly found out that, because there were no checks and balances, they ended up using a lot more than they thought they would. ... Now they're being very careful about engineering solutions that could potentially lead to a massive bill at the end of the month or the end of the quarter.”

Hybrid Setups

The Flexera report found that cloud users are growing more interested in multi-cloud and hybrid setups. The share of respondents using multiple public clouds and one private cloud increased from 19 percent to 23 percent year over year, with failover and intentional app siloing among the reasons why.

WWT’s Anderson said that clients could employ different cloud types depending on the scenario. A single client could leverage public cloud for an AI-powered contact center chatbot used by that client’s customers, for example. The same client might run an AI model that handles sensitive bank data on on-premises infrastructure.

Different AI use cases have their own latency needs as well. If a customer is employing computer vision, analysis needs to happen as close to the cameras as possible to avoid a massive data backhaul. For AI model tuning, users will want to feed data into GPUs as fast as possible. But a client might be fine with an employee chatbot taking a few seconds to answer a question or respond to a prompt, he said.

Insight’s Taglienti said that a lack of in-house staff to run and maintain data centers and on-premises equipment could be a factor against adopting private cloud. The hodgepodge of IT environments has also led to increased interest in containers and distributed apps to “run your workload wherever is most appropriate for your particular use case,” he said.

Factors outside of AI have also led customers to reevaluate their data centers, including changes with legacy virtualization vendors Citrix and VMware. “It reopens this whole conversation,” he said.

Hardware vendors have also made big strides in making on-premises infrastructure easier to use, which could lead more customers to consider private cloud and on-premises data centers, Taglienti said.

“The service orientation and the prescribed architectural approaches are starting to come out of the Dells and Lenovos and HPs of the world,” he said. “Nvidia has their AI development stack. And it'll run a GPU on a Dell or an HP or wherever the heck you want to run it. So that is an interesting way to think about the blurring of the lines. … I don't think the distinction between cloud and on-prem goes away. But 20 years from now, it might.”

Sharma said that a financial services firm, for example, might train a model on private cloud but want public cloud for deployment. He could see customers keeping some data in house for training before deploying models on public cloud for inferencing. He hasn’t seen customers seek multi-cloud setups for AI yet, but those setups are a possibility for avoiding vendor lock-in.

What’s Next

As AI comes to more mobile and edge devices, Taglienti predicts that some retailers, manufacturers and health care companies, for example, will take a mini-data center approach to get the most out of the technology. Hardware vendors need to keep striving for cloud parity with technology that is simpler to implement, he said, and offerings such as the Dell AI Factory are steps in the right direction.

“If you don't do that, you're going to lose every time,” he said. “You'll sell lots of equipment to the Facebooks of the world and the Metas of the world … but that's a different problem than the enterprise. The enterprise is not going to make those kinds of investments in data centers. They're not going to hire a bunch of staff to manage it. So if you're selling a solution to the enterprise, you need an easy button.”

The solution providers agreed that they should not come to customers with one vendor or one technology roadmap in mind for executing a project – an approach that requires partnerships with multiple vendors and even experimenting with AI internally to discover the problems it can solve and the ones it can’t.

“We want to make sure that when we show up, we are not biased,” Sharma said. “It'd be wrong for us to only understand one part of the answer and then show up and try to force that no matter what the question is.”