Snowflake CEO Ramaswamy: Partners ‘A Positive Force’ In Company Growth

‘We work with a large set of partners; some of them have expertise in specific areas. I’ve met partners, for example, that are very good at certain kinds of migrations, while some of the larger GSIs bring just enormous domain knowledge, say, in the finance sector or in taxes,’ says Snowflake CEO Sridhar Ramaswamy.

Snowflake CEO Sridhar Ramaswamy told CRN that he has met with the heads of Snowflake’s top 10 partners since his appointment as CEO in February and sees global systems integrators (GSIs) and systems integrators (SIs) as “a very positive force as our presence in large companies grows.”

“This is a year in which our SI partners—definitely the GSIs—are stepping up in a pretty big way,” Ramaswamy said in response to a CRN question during a question-and-answer session at Snowflake’s annual Data Cloud Summit event. “We work with a large set of partners; some of them have expertise in specific areas. I’ve met partners, for example, that are very good at certain kinds of migrations, while some of the larger GSIs bring just enormous domain knowledge, say, in the finance sector or in taxes.”

Although Bozeman, Mont.-based Snowflake already has strength in the financial services sector, Ramaswamy said, adoption and deployment of key applications like those for anti-money-laundering reporting are “often driven by the expertise that one or more SIs bring to the table.”

[RELATED: Snowflake Data Cloud Summit 2024: The Coolest Vendors]

Snowflake Data Cloud Summit 2024

Snowflake Data Cloud Summit 2024 runs through Thursday in San Francisco.

Ramaswamy told CRN that the vendor has “made substantial investments in our partnership team,” including the hiring of former Salesforce partner executives Tyler Prince and Amy Kodl as senior vice president of worldwide alliances and channels and vice president of services alliances, respectively.

The new CEO added that Snowflake is making investments in its team in India “because so much of the work on a lot of these projects is done” there.

On CRN’s Solution Provider 500 list, Tata Consultancy Services ranked No. 2 and Wipro Technologies ranked No. 5. Both are based in India.

“A lot of our customers are also moving not just a small part of the IT organization, but entire department decision-makers are … over [in] India,” Ramaswamy said. “And so we are pretty deeply invested in this space.”

During the event, Snowflake revealed deeper partnerships with some of the leading vendors in AI, including its adoption of Nvidia AI Enterprise software to integrate NeMo Retriever microservices into Snowflake Cortex AI, its fully managed large language model (LLM) and vector search service.

Snowflake Arctic was also upgraded to fully support Nvidia TensorRT-LLM software and is now available as an Nvidia NIM inference microservice, according to the vendor.

Here is more of what Ramaswamy had to say during his various appearances at Snowflake Data Cloud Summit 2024.

The Future Of AI Models

No one can predict what foundation models are going to be 10 years from now. But I don’t think they obviate the need for a real platform.

I think that having data that you can trust, bringing it into a system that understands the governance and making it available carefully within the enterprise—there’s not a magic answer that a language model is going to have for things like that. … It’s also important to emphasize that AI and language models are incapable right now of solving analytical problems.

In other words, if you want to, let’s say, come up with a fraud score, that’s not a language model’s job. You have to construct a model to do that. And yes, explainability [knowing how a model got to an answer] is an issue there.

Language models, absolutely, have reasoning abilities … where the model knows which of these underlying prediction models to invoke. So those problems will continue to exist. … At Snowflake, we are very focused on things like how do we make it really easy for people to create, let’s say, RAG [retrieval-augmented generation]-based chatbots. … Or how they can make existing data pipelines more efficient, just like we want to make software engineers more efficient. … I see the opportunity.

But, 10 years from now, it’s really hard to tell. And I’m not quite as fearful in terms of how you can use these models in a responsible way.

Is Cloud Too Expensive?

It’s possible to misuse every piece of technology. … That’s nothing new. We’ve all heard of massive investments in on-prem infrastructure.

Or, God forbid, I’m sure you've been part of companies that bought the wrong box. And you’re like, ‘Oh my God, we have this wrong box. And what do we do?’ Because you can’t change anything about this delightful box that you bought.

Part of Benoit’s [Dageville, Snowflake co-founder and president of the product division] vision was to make this much more fluid in terms of their [users’] investments.

Having said that, we absolutely pay attention to what value we are driving with customers. We understand that spend that is out of proportion with the value that we deliver is a very dangerous place for us to be because sooner or later, people look at it and they go, ‘That is not that efficient.’

In fact, the best of our customers, the ones spending tens of millions of dollars with us every year, actually have an internal governance process for how they approve new projects.

They make any new internal client go through a template that asks things like: What’s the size of the data? What’s the business function being solved? How much money are they spending currently? What is the projected spend going to be?

And, in fact, they force the finance person for these teams to sign off on the proposal before implementation even starts. … We have a pretty big value engineering team that focuses on nothing but this.

We also pay attention to signals like how often our customers have gone through and optimized their whole compute environment with Snowflake.

And what I can also tell you is we have customers that have lived in the on-prem world and have now migrated to Snowflake. And they tell us, ‘I can have this argument with my CFO all day long. I can explain to them what it costs to stand up a data center, put a bunch of boxes into it and then try to amortize that over five years.’

It all comes down to efficient implementation. And we want to be very active partners with our customers. And finally, we also continuously … make compute more efficient on Snowflake.

We have something called the SPI—the Snowflake Performance Index—where we basically measure the increased throughput for the same dollars that we deliver to our customers year on year. … If I recall correctly, it’s been something like a 27 percent improvement in how efficient we are.

This is absolutely a topic that we care about. But I think the one way that we can combat this with customers is to be an active partner with them in driving value.

Snowflake’s Expansion Into The Manufacturing Vertical

We get started with manufacturing customers in the context of powering things like their warehousing applications or their data engineering workloads. And now, of course, with AI. But we very much see a world in which people are writing applications on top of Snowflake. … Siemens, for example … talks about how we power the vast majority of their internal data pipelines.

But really, where the future is headed for them is being able to create connected apps where devices send telemetry data back, typically to a Snowflake instance.

And then they can create additional value-add applications for their customers. For example, for things like preventative maintenance. And so we are beginning to see trends like that where Snowflake is used as the data platform.

We also have a very large partnership with Blue Yonder … in the supply chain space. And there’s a similar model where they have functionality built on top of Snowflake.

And then we often begin to work with these customers on how these apps go to market. But our larger vision is to work with customers to help them create additional monetization opportunities.

This is largest in the finance, media and even retail sectors, where there’s a lot of data.

But we increasingly see ourselves as the platform on which these kinds of manufacturing applications get created as well.

And that’s exciting for us because we go from being an expense line of needing to do data analytics and data engineering for a company to more of a top-line position because we are directly driving revenue for them.

And in many of these cases, it’s pure margin for them because it is additional revenue that they didn’t have. It’s a natural evolution and part of the reason why we call ourselves the data cloud.

Product Innovations

The foundation layer for AI at Snowflake is something we call Cortex AI, which is a model garden [model library] as well as semantic search but built into every Snowflake deployment.

Part of the magic of Snowflake … is that we tightly integrate things like AI and Cortex AI into every front door … which means that if you know how to write SQL, you know how to call a language model.

If you want to do sentiment detection, summarization or translation, or you want to write a good custom prompt to extract some data from a piece of JSON [JavaScript Object Notation], all of that you can now do just by writing SQL. …
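The pattern Ramaswamy describes here, calling a language model with nothing more than SQL, might look roughly like the sketch below, run from Python via the snowflake-connector-python package. The connection details, tables and columns are hypothetical, and the SNOWFLAKE.CORTEX.* function names are shown as illustrations of the kind of SQL-level calls he is referring to, not lifted from his remarks.

```python
# Illustrative sketch only: connection details, tables and columns are hypothetical,
# and the SNOWFLAKE.CORTEX.* calls stand in for the SQL-level LLM functions described.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="analyst",
    password="...",
    warehouse="ANALYTICS_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Sentiment detection on a column of review text, expressed entirely in SQL.
cur.execute("""
    SELECT review_id,
           SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score
    FROM customer_reviews
    LIMIT 10
""")
print(cur.fetchall())

# Summarization and translation follow the same shape: one SQL function call per row.
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.SUMMARIZE(ticket_body) AS summary,
           SNOWFLAKE.CORTEX.TRANSLATE(ticket_body, 'de', 'en') AS english_text
    FROM support_tickets
    LIMIT 5
""")
print(cur.fetchall())

# A custom prompt against a semi-structured (JSON) column, again just SQL.
cur.execute("""
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
               'mistral-large',
               'Extract the invoice total from this JSON: ' || raw_payload::string
           ) AS extracted_total
    FROM invoices_raw
    LIMIT 5
""")
print(cur.fetchall())

cur.close()
conn.close()
```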

Similarly, right now, if you wanted to build a chatbot on a bunch of documents with access control respected, the way you would do this is you first get that data somewhere in cloud storage. You would then use one of the various vector databases out there to index it.

And then you would get yourself an API key and perhaps an OpenAI or Anthropic contract, and you would stitch together this application maybe using LangChain [an open-source framework for building applications based on LLMs].

That’s a lot of work. What we’re able to do is, with one command, you create the semantic index, say on an Iceberg table. And then you can take one of our built-in Streamlit [an open-source framework for machine learning and data science applications] apps and create a chatbot. We think of this as a five-minute operation.

And that’s the kind of power that we want to unlock. And, by the way, this app will respect the governance rules that you have … because whatever data access it performs already obeys the governance rules that are there in the platform.
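For a rough sense of the “five-minute” assembly described above, here is a minimal Streamlit chat sketch under stated assumptions: the retrieval step is a deliberate placeholder, since the remarks do not spell out the semantic-index interface, and the connection details, model name and prompt format are illustrative rather than Snowflake-documented specifics.

```python
# Minimal chatbot sketch: Streamlit front end plus a Cortex-style completion issued as SQL.
# retrieve_context() is a placeholder for querying the semantic index Ramaswamy mentions;
# in that setup, governance and access rules are enforced by the platform, not this app.
import streamlit as st
import snowflake.connector


@st.cache_resource
def get_connection():
    # Hypothetical connection details; in practice these would come from config or secrets.
    return snowflake.connector.connect(
        account="my_account", user="analyst", password="...",
        warehouse="ANALYTICS_WH", database="DEMO_DB", schema="PUBLIC",
    )


def retrieve_context(question: str) -> str:
    # Placeholder retrieval step standing in for the semantic index over, say, an Iceberg table.
    return "…passages returned by the semantic index for this question…"


st.title("Document chatbot (sketch)")

if "history" not in st.session_state:
    st.session_state.history = []

# Replay prior turns so the conversation persists across Streamlit reruns.
for role, text in st.session_state.history:
    with st.chat_message(role):
        st.write(text)

if question := st.chat_input("Ask about your documents"):
    context = retrieve_context(question)

    cur = get_connection().cursor()
    # Generation step as plain SQL; the model name is illustrative.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)",
        (f"Answer using only this context:\n{context}\n\nQuestion: {question}",),
    )
    answer = cur.fetchone()[0]

    st.session_state.history.append(("user", question))
    st.session_state.history.append(("assistant", answer))

    with st.chat_message("user"):
        st.write(question)
    with st.chat_message("assistant"):
        st.write(answer)
```

Such a sketch would be launched with streamlit run app.py; the point is the shape of the assembly, not a production implementation.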

This is part of what we mean by enabling all kinds of people within an enterprise to get and act on data.

Document AI, which you can think of as roughly a bridge between unstructured and structured data, is about things like extracting numbers from contracts and, perhaps, the total from restaurant receipts. An analyst who does not even know SQL, a business analyst, as long as they know where to point this application, can set up a Document AI project that extracts structured information from these kinds of unstructured data and puts it into a table so that somebody else can analyze that data.

This is part of what we do in terms of just making AI super easy, super efficient but also broadly accessible to everybody within the enterprise so that all kinds of people can legitimately build applications.

It is one thing for you and me to say we can use a chatbot. It is a different thing to be able to say in five minutes you can build a chatbot.

On Predicting ‘Early’ And ‘Often’

My thing when it comes to technology predictions is that you should predict early and predict often and generally one of them becomes true.

But kidding aside, I think it’s really important to acknowledge that we are pretty early when it comes to how language models are going to be used. … Financial analysts will ask me a lot, ‘Hey, what is margin going to look like for an inference business?’ … We want to have a large inference business so we can worry about margin. I’m the first person to say that.

But having said that, just look at the layers of changes that are going to happen if you’re talking a two- or three-year timeframe.

In the next two to three years—it’s an easy prediction, a can’t-miss prediction—there are going to be multiple providers of hardware, for example, to do inference.

Absolutely there is Nvidia. But there is Groq and there is Intel and there is AMD … obviously, DPUs [data processing units] … so competition at the hardware level.

Now let’s look at the next level: the cloud service providers. … The CSPs are carrying these various pieces of hardware, but there are also companies like CoreWeave that are going to be carrying it because there is such a shortage of chips that they’re seeing a business model. …

Obviously, there are people that are aiding CoreWeave. They raised a round. … Layer No. 3: Model innovation is happening at a very rapid pace.

And part of the magic of Snowflake is that we give you the right model for the job. Meaning if you just want to do sentiment detection, you do not need a 100 billion parameter model. A 2 [billion], 3 billion parameter model will do just fine. … We all tend to focus on the highest end, [Anthropic’s Claude 3 Opus] and [OpenAI’s] GPT-4.

But there is lots of innovation in the lower space. … Layer four is the deployment model for GPUs and is not yet fixed. … And my easy prediction is that there are multiple factors of change in each of these layers.

But having said that, there are also more capable models that are getting much, much bigger. I mean, let’s face it, GPT-4 and [Claude 3 Opus] are much better in any chain-of-thought [prompting, where AI is guided to break problems down into smaller parts] than most other models that have been made.

And we don’t know if GPT-5 is going to be 10 times the size and a whole lot better. ... I can point to five things and say all of these will imply a 2X to 5X drop in prices. But I can also point to future research and say maybe there will be much bigger models that can do many more interesting problems. Which is why I go back to predict early, predict often.

Exploding Snowflake AI Data Cloud Use

We have thousands of customers exchanging information every single day on AI Data Cloud. The number of data-sharing relationships and value created is enormous.

And we are serving 5 billion queries every single day. That is almost the scale of the number of daily Google searches that happen on the planet. … And we are handling massive workloads from the biggest companies.

There are 200 trillion rows in our largest customer table. Snowpark, which enables us to bring the power of Snowflake to Python users, is taking off. Fifty percent of our customers are now using Snowpark.

Ramaswamy’s Background

I’m relatively new in this job as the CEO of Snowflake. It’s been three months, even though it sometimes feels like three years in a good, fun kind of way.

And for the year before that, I led Snowflake’s AI strategy and innovation. And I’d been a big fan of Snowflake for many years prior to that.

I’ve worked in data all my life. I grew up in a … neighborhood in Bangalore. Science was taught as the true way forward.

Of course, I had slightly different aspirations. I was passionate about drumming. But my parents were like, ‘No, no, no, no, no. You need a better career than that.’

So instead of pursuing my musical career, I did the only other natural thing. I got a Ph.D. in databases from Brown University. And I worked … at Bell Labs for quite a few years before getting the startup bug and working at an analytics firm in the dot-com era. … I then spent a very long chapter building and leading the ads business at Google.

There I learned what it takes to combine technology, customer utility and amazing people to create an industry-leading business.

Now, there’s no company in the world that is more focused on combining great technology with customer utility than Snowflake.

Plans As CEO

As the new CEO, it is sometimes useful to start with what’s not going to change. First and foremost, thousands of Snowflakes around the world are going to continue to be obsessed with your success. We put customers first. We stand behind the product that we create.

Our Net Promoter Scores are through the roof. And that’s what we aspire for day in and day out. … We measure our success based on your success.

The second thing that is not going to change is our commitment to the hard work of building a single, unified platform. It's easy to deliver a disjointed set of services and tools for every job. Just keep shipping all manner of things and leave the hard job of integration to you.

At Snowflake, we don’t do that. Snowflake delivers a unified platform that makes the complex simple, sophisticated and cost-efficient. … Over the past several years, we have made major foundational investments to support much broader use cases that you have been asking us for.

But we have done the difficult job of engineering them to deliver them as part of the same core platform that just works. … At the core of AI Data Cloud is, of course, data. And we have expanded the universe of data that Snowflake can operate on.

Now, Snowflake’s governance and performance extends to external data lakes with open table formats like Apache Iceberg and to transactional data.

We have connected every silo of data that your business cares about, whether it is internal or external. And we have provided the ability to discover, access and collaborate across your ecosystem.

And to complement this data foundation, we have delivered Cortex AI, our fully managed service that puts generative AI into the hands of every Snowflake user.

And then leveraging this intelligent and connected core, you can build and run modern applications of the kind that are not even possible in the traditional SaaS model of building software on the cloud. … Every person in your organization from sales to marketing to finance can use apps that deliver information to them through conversation, not code.

The AI Data Cloud is lighting up every corner of the enterprise. And we are just getting started.

Faster Product Delivery

What is changing? We are accelerating product delivery. We know that while you appreciate previews, you want things in GA [general availability] so you can send them into production. … We are permanently accelerating our pace of delivery. You can expect rapid-fire delivery from us going forward.

And this matters, especially in AI where innovation cycles run in weeks, maybe months. Certainly not years. And Snowflake is now driving that pace across the entire platform.

Over the past year, I’ve met with over 200 customers. And the No. 1 topic that comes up in these meetings is how we, Snowflake, can help with AI. … AI is opening up enormous possibilities because for the first time every person in the organization can talk to their data in fluid natural language.

And in just a few years the norm is going to be that we are going to be able to tell every app what we want and it is going to understand us, accent or no accent, which I think is pretty amazing.

But here’s the issue. The bar for AI in the enterprise is much higher than that for consumer use. It has to be reliable. It has to be trustworthy. And while on one hand, it’s tremendously exciting to contemplate the possibilities that AI creates, it is also scary. Because consumer AI is not ready for business use cases.

It’s not good enough for your customers to get the right answer to their support question half the time. And yet, that’s where consumer AI is. … People look at demos, say it looks great, but they say, ‘I want to make sure that I’m getting my money’s worth before I go and invest millions, if not tens of millions, into AI.’

Every rational business leader should ask that question. Of course, I do. … We are pioneering the era of enterprise AI. It is easy. It is efficient. It is trusted.

In a technology revolution, true mass adoption happens once things become easy—when things go from the hands of the chosen few to the hands of many.

And with the AI Data Cloud, it has never been easier to understand, easier to build, easier to reinvent. We are doing that work for you. And the next 10 years are going to drive so much innovation that they will dwarf what has been done over the last 20, even 30 years.

We know data and we know enterprise requirements. We have an amazing engineering team of thousands of people across the globe that understand what it takes to build an enterprise-grade platform.

To complement that data foundation, we have assembled one of the leading AI research teams in the industry.

Our AI research team brings together hundreds of engineers into six key disciplines required to deliver enterprise AI. … This team is working at breakneck speed to deliver easy, efficient and trusted AI to all of you. And of course, it is built into our unified platform that powers the AI Data Cloud.