DataRobot’s Veeraraghavan Calls New AI Products ‘A Turning Point’ For Partner Engagements

‘It’s a huge, huge set of space for the channel to actually bring their unique capabilities,’ DataRobot Chief Product Officer Venky Veeraraghavan tells CRN.

DataRobot Chief Product Officer Venky Veeraraghavan sees the artificial intelligence applications and platform vendor’s recent spate of AI advancements as a way to “deepen” its relationship with partners and broaden the type of AI services they can provide to customers.

In an AI era of single-purpose point applications and hyperscaler platforms that don’t allow deep enough customization, the Boston-based vendor is structuring its platform so that partners and customers can layer subject matter expertise onto an AI app, Veeraraghavan told CRN in an interview.

“It’ll be a turning point for our engagement with partners,” Veeraraghavan said. “It’s a huge, huge set of space for the channel to actually bring their unique capabilities. They have deep knowledge of oil and gas or marketing or financial planning. And so they can take our general-purpose platform, specialize it and use it as an end solution for the customer that … has the DataRobot engine inside it. But it’s an application and services that the partner is providing to actually get it to solve the end problem.”

[RELATED: DataRobot Rolls Out New AI App Building Capabilities]

DataRobot Enterprise AI Suite

When asked to compare his company’s AI platform to products from AI leader Microsoft, Veeraraghavan likened Microsoft to a supermarket and DataRobot to a meal-kit service such as HelloFresh.

Veeraraghavan worked at Microsoft for about 20 years before leaving in 2021 to join DataRobot. He left Microsoft with the title of vice president of product management for Azure Cognitive Services, according to his LinkedIn account.

“They own GitHub. They have a big platform. But because their customers are so broad, they really focus on the ingredients,” he said. “They’re more of a grocery store, and they have all the ingredients for you to make any meal you want. They want to say no to almost no one. Our approach is much more … Here’s the recipe. Here are the key ingredients. If you want to [add] more spice, that’s great. If you don’t want onions, that’s great. But it really gets to a good starting point.”

As for easy AI wins DataRobot partners should explore, Veeraraghavan said that SAP partners should look to DataRobot’s framework for building AI applications for users of the Germany-based enterprise software giant’s portfolio.

The two vendors have a close relationship, he said. “How can we build these great apps for Ariba? Or how do you build great apps for Concur? How do you build great apps for SuccessFactors? Those are all things that we’d love to have partners help us with. And that’s just one example. It could be for any given industry.”

Here’s more of what Veeraraghavan had to say about his company’s platform and the future for enterprise AI.

What do you want partners to know about how DataRobot is approaching this AI era?

AI is not getting to the end user. … In the enterprise, how do you get the AI in front of the end user, the businessperson who will … make a decision? … The definition of AI has changed in terms of who works on it. It used to be ML [machine learning] equals data scientists, Ph.D.s. … In GenAI, it’s quite different.

So much of the work is software developers, services folks working with subject matter experts, product marketing managers trying to figure out what is the voice of the customer service bot? … What words should they use? … It really is a subject matter expert task … [plus] data scientists because you still want people to measure the correctness of the model, etc.

And so pulling all that together is what we are going for with this release. [I am] super excited about the enterprise AI suite. … Our take on getting AI to the end user is the idea of AI apps.

How do you solve concrete business problems with either a UI or an integration into an existing business workflow, something like [Microsoft] Teams or a Slack or an SAP. … There’s also regulation, compliance documentation, testing to hit the controls and then making sure that it’s all documented so people understand it’s correct. And then how do you get that out and do mitigation as that thing is rolling out?

And then finally, how do you do this in a modern way? So how do you use DevOps? … There’s a whole AI life cycle that has to merge with the app dev life cycle. And so that has to merge. … DataRobot has been sort of well-known for its platform, enterprise-ready, works everywhere.

Then there are these apps that are coming out. But then they all need customization. So our approach is unique. … They’re not point apps. They’re not just a single thing like Writer or Copilot. It is a code template that really is sort of a GitHub repo that contains all the parts of the life cycle. … It’s going to solve a business problem, but it requires people to help customize that for the customer.

Some customers will have all the skills required to actually do it themselves. Their internal IT desk can do it. Some customers will depend on us to do it. But it’s a huge, huge set of space for the channel to actually bring their unique capabilities.

They have deep knowledge of oil and gas or marketing or financial planning. And so they can take our general-purpose platform, specialize it and use it as an end solution for the customer that … has the DataRobot engine inside it. But it’s an application and services that the partner is providing to actually get it to solve the end problem.

That’s really what customers are looking for. They’re not looking for a vector database. They’re not looking for millions of parameters or billions of parameters. They’re looking for a solution to their problem.

How is DataRobot differentiating itself from the growing number of AI vendors?

Our view of apps is it’s not just the UI. Most companies today, when you think about the platforms, they talk about UI. … Our differentiation is that today, when we talk about an app in this new release, we really think about the entire life cycle.

Here’s the data connection. Here’s the ETL [extract, transform and load process]. Here’s the semantic model. Here are the settings for your modeling. … You want to put that together.

And then … there’s also the app logic, which is like, if you want to build a marketing mix app … how much I want to spend on which channel, and I want to do an optimization across it. So that’s app logic. That is not really a predictive model. … You really want to pull all those together and then encapsulate them with all the operational goodies, operational logic around guard models, monitors, mitigations. … And describe that whole thing in code that a customer or a partner can customize to the very specific needs of the business. … That is our special take on that.
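To make the “app logic” idea concrete, here is a minimal sketch of the kind of channel-spend optimization he describes, sitting apart from any predictive model. The allocate_budget function, greedy approach and numbers are illustrative assumptions, not DataRobot code:

```python
# Toy sketch of the "app logic" layer from the marketing-mix example above: given
# predicted returns per channel, decide how much to spend on each. All names and
# numbers are made up for illustration; this is not DataRobot code.
def allocate_budget(total_budget: float, return_per_dollar: dict[str, float]) -> dict[str, float]:
    """Greedy allocation: spend in fixed steps on whichever channel returns the most per dollar."""
    returns = dict(return_per_dollar)                 # copy so the caller's estimates are untouched
    spend = {channel: 0.0 for channel in returns}
    step, remaining = 1_000.0, total_budget
    while remaining >= step:
        best = max(returns, key=returns.get)          # channel with the best marginal return
        spend[best] += step
        remaining -= step
        returns[best] *= 0.95                         # assume diminishing returns per channel
    return spend

if __name__ == "__main__":
    # The per-channel return estimates would normally come from the predictive model.
    print(allocate_budget(10_000.0, {"search": 1.8, "social": 1.5, "email": 1.2}))
```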

On … confidence, we really are the only ones who are right now building out how you document for these new regulations that are coming up. … There is a regulation, there are a whole bunch of controls in the regulation, and each control has one or more tests.

And so we provide a bunch of tests, and also the customer or the partner can build the tests. … Take the results of it, and then you put it inside a document that becomes your compliance doc. And so you stick that with every solution, every version of the solution, so as you change it, you have a full understanding of what’s changed.
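As a rough illustration of the controls-to-tests-to-document flow he describes, here is a minimal sketch. The ControlTest class, test names and build_compliance_doc function are hypothetical stand-ins, not DataRobot’s API:

```python
# Hypothetical sketch of "regulation -> controls -> tests -> compliance doc" per version.
from dataclasses import dataclass
from datetime import date
from typing import Callable

@dataclass
class ControlTest:
    control_id: str          # a control from a regulatory or internal checklist (placeholder IDs)
    description: str
    run: Callable[[], bool]  # returns True if the control is satisfied

def build_compliance_doc(solution: str, version: str, tests: list[ControlTest]) -> str:
    """Run every control test and render the results as a per-version compliance document."""
    lines = [f"Compliance report: {solution} v{version} ({date.today()})", ""]
    for test in tests:
        status = "PASS" if test.run() else "FAIL"
        lines.append(f"- [{status}] {test.control_id}: {test.description}")
    return "\n".join(lines)

if __name__ == "__main__":
    # One vendor-supplied test and one partner-written test, both placeholders.
    tests = [
        ControlTest("PII-01", "Prompts are screened for personal data", lambda: True),
        ControlTest("ACC-03", "Accuracy meets the 90% threshold on the evaluation set", lambda: 0.93 >= 0.90),
    ]
    print(build_compliance_doc("support-assistant", "1.2.0", tests))
```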

That’s what I call ‘solve the problem at rest,’ when the ML is at rest. Then when it’s running, you also want to have—here are all the different kinds of models that are running to evaluate the inputs, evaluate the outputs, and then, when required, mitigate it in real time. … Often people build these AI bots that are based on the [Microsoft-backed, ChatGPT maker] OpenAI spec.

And so even if you have one, we can have a bolt-on governance that says, ‘Look, we’ll wrap it with two or three lines of code. And we’ll provide all this additional functionality so you can be more confident.’ … [And] it’s fully integrated with DevOps.
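The “wrap it with two or three lines of code” pattern could look roughly like the sketch below. The guard logic and the with_governance function are placeholders chosen for illustration, not the actual DataRobot SDK:

```python
# Illustrative sketch of bolt-on governance wrapping an existing AI bot.
from typing import Callable

BLOCKED_TERMS = ("account number", "social security")   # toy stand-in for real guard models

def _guard(text: str) -> bool:
    """Toy guard check standing in for the input/output guard models described above."""
    return not any(term in text.lower() for term in BLOCKED_TERMS)

def with_governance(complete: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap any prompt-in/answer-out completion function with guard checks and mitigation."""
    def governed(prompt: str) -> str:
        if not _guard(prompt):
            return "Request blocked by input guard."      # real-time input mitigation
        answer = complete(prompt)                          # call the existing bot as-is
        if not _guard(answer):
            return "Response withheld by output guard."   # real-time output mitigation
        return answer
    return governed

if __name__ == "__main__":
    # existing_bot stands in for whatever function already calls an OpenAI-spec endpoint.
    existing_bot = lambda prompt: "Here is a summary of our refund policy."
    governed_bot = with_governance(existing_bot)           # the "two or three lines" of wrapping
    print(governed_bot("What is your refund policy?"))
```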

So it really is a GitHub-centric world. So everything is in a repo. You clone it. You make it your own. And then it’s also all declarative. … You can say, here is an LLM [large language model]. Here’s the vector database. You can change that on GitHub, and when you do the pull request, we’ll see a version has changed. It’ll kick off a bunch of actions. It’ll tell DataRobot to update its state. And suddenly, now you have a new version of the app that is now updated.
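Here is a minimal sketch of that declarative, pull-request-driven flow. The config keys and the update_deployment placeholder are assumptions made for illustration, not DataRobot’s real schema or actions:

```python
# Hypothetical sketch: a declarative app spec lives in the cloned repo; a CI-style
# check diffs the old and new versions on a pull request and applies the change.
import json

old_spec = {"llm": "model-a", "vector_database": "store-a", "version": "1.0.0"}
new_spec = {"llm": "model-b", "vector_database": "store-a", "version": "1.1.0"}

def changed_fields(old: dict, new: dict) -> dict:
    """Diff two declarative specs, mimicking what an action would do on a pull request."""
    return {key: (old.get(key), value) for key, value in new.items() if old.get(key) != value}

def update_deployment(changes: dict) -> None:
    """Placeholder for the step that tells the platform to update its state."""
    print("Applying changes:", json.dumps(changes, indent=2))

if __name__ == "__main__":
    diff = changed_fields(old_spec, new_spec)
    if diff:                      # e.g. the LLM was swapped in the pull request
        update_deployment(diff)   # roll out the new version of the app
```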

So it’s deeply integrated inside how developers work today. But then in the UI, a subject matter expert can work on that thing. So it’s a collaboration.

What are some big business opportunities ahead for DataRobot partners?

In AI today, almost all the profits and all the investments are at the bottom of the stack. So Nvidia, the hyperscalers, data centers.

But in a well-developed technology stack, it is the other way around—all the profits and margins are on the applications of the technology. And then the infrastructure gets commoditized.

What I would say for partners in general is we’ve given a way in which you can really start solving for the end problem. … I want to save cash and keep my credit line low. Great. Then you want to build an app for it. You want to integrate it inside an FP&A [financial planning and analysis] solution. And you want to build your own app. … They can focus on building that thing knowing that the platform and everything underneath it is abstracted out. … That is the grand opportunity. … It’s a much, much better starting point than figuring it all out by yourself.

You’re taking all the expertise that we have had solving these problems for over a decade. And we ‘assetized’ them as these templates that really make it easy for you to … build a time series model, [for example]. … The last two years, we’ve been actively working with hundreds of customers on generative AI. … Here are ways to build a digital assistant that can be modified exactly to your needs. … In the end, Copilot is Microsoft’s UI.

You can do some tweaking, but you can’t build your own models in there. And so here … so much of the value is actually in the business-specific data and the business-specific apps, the business-specific integrations. And that really opens up with this model of doing AI.

What’s the future hold for your platform?

This is our first release of this app dev platform. So there’s a lot more work to do. We’re going to be massively working on what does the SDK [software development kit] look like? … There is a lot of finesse we can build on the app development framework. … There are a whole bunch of new problems around, how do you describe agents?

It’s an early stage thing. We have some support. But it’s a fast-moving space. How do you make sure that, instead of just having UI apps or integrations with existing business workflows, [you have] novel things? … How do you make sure it’s really easy to build and maintain and run over time? How do you do tracing across multiple tools and make sure that’s available to the customer?

So those are all problems people are going to run into. And that’s what we’re beginning to think about—how do we build that into the platform?

How is user adoption of AI going to look different in 2025 compared with 2024?

As we pass the peak of the hype cycle and come down the trough of disillusionment, we’re finding things are actually working. … There are use cases that are working. They are providing value.

What we are doing is taking those and we are amplifying them. So it’s much easier to talk to a customer … Start with these five [use cases]. Or start with these 10. These are well-known places where things can help.

And then what you can do as a channel partner, you can start saying, ‘What is it specific to you?’ … What I see is overall acceleration of adoption and less generic experimentation and more experimentation on how do I make it work for me?

And then getting into production because a lot of these confidence issues are addressed. I do think sales cycles will come down, conversion ratios will go up because we are now not generally talking about how do you do AI … but [moving] into more interesting [work].