NetApp Expands Hardware, Software, AI With Eye On Intelligent Data Infrastructure

‘We’re going to help customers unleash the power of AI with their data, helping overcome the challenges of AI and data gaps, and be able to do that without creating a net-new data silo. So we’ll be going much deeper on the AI front,’ says Sandeep Singh, NetApp’s senior vice president and general manager for enterprise storage.

NetApp Tuesday opened its annual NetApp Insight conference with an expansion of the storage and cloud technology company’s intelligent data infrastructure.

That intelligent data infrastructure brings together unified data storage, integrated data services and AI-powered CloudOps to manage data seamlessly across any infrastructure, said Sandeep Singh (pictured), senior vice president and general manager for enterprise storage for the San Jose, Calif.-based company.

NetApp is taking a four-pronged approach, starting with unified data storage that includes new technology to help customers modernize their block storage environments at every budget, Singh told CRN. Also included are improvements in cyber resiliency to help not only detect ransomware attacks in real time but also recover, as well as advances across NetApp’s public cloud storage offerings and BlueXP.

[Related: NetApp CEO: We Are ‘Solving The Problems For The Era Of Data Intelligence By Bringing AI To Your Data’]

“And we’re going to help customers unleash the power of AI with their data, helping overcome the challenges of AI and data gaps, and be able to do that without creating a net-new data silo,” he said. “So we’ll be going much deeper on the AI front.”

For unified data storage, NetApp has a solid portfolio of NAS and unified NAS-block systems across its high-performance flash, capacity-optimized flash and hybrid flash systems, Singh said. The company also offers block-optimized ASA systems specifically targeting customers with stand-alone block environments, he said.

To help simplify those environments, NetApp is expanding its ASA series with three new models: the ASA A70, A90 and A1K, he said.

“They’re simple so that anybody can manage it, starting off with being simple to deploy in minutes, provision in seconds, protect with one click,” he said. “They pack a punch so you can accelerate VMware database applications. They bring in all the intelligent data management capabilities and proven reliability and are affordable.”

The ASA A series’ simplicity comes from its architecture, which pairs a common, global storage pool on the back end with a front end where customers don’t have to manage any of the logical layers, Singh said.

Network configuration is automated for quick deployments, he said. “You just have to know the number of LUNs you want to present, the capacity and the host to which to present it, and that’s it,” he said. “It takes seconds to provision from a protection standpoint. With one click, customers can protect data using snapshot-based backups. … We’ve automated a lot of the upgrade prechecks to simplify the environment end to end as well.”
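
For readers who want a feel for the workflow Singh describes, here is a minimal, hypothetical sketch of presenting a LUN to a host modeled loosely on ONTAP’s REST API. The management address, credentials, endpoint paths and payload fields are assumptions for illustration only; on an ASA A-series system the automation described above hides most of this behind a single provisioning step.

```python
# Hypothetical sketch: provisioning a block LUN the "three inputs" way Singh
# describes -- name, capacity, and the host to present it to. Endpoint paths
# and payload fields are assumptions modeled on ONTAP's REST API and may
# differ on an actual ASA A-series system.
import requests

ARRAY = "https://asa-a70.example.com"   # hypothetical management address
AUTH = ("admin", "password")            # for illustration only

def provision_lun(name: str, size_gb: int, host_iqn: str) -> None:
    """Create a LUN, register the host, and map the two together."""
    # 1. Create the LUN -- capacity is the only sizing decision the admin makes.
    requests.post(f"{ARRAY}/api/storage/luns", auth=AUTH, verify=False, json={
        "name": f"/vol/{name}_vol/{name}",
        "space": {"size": size_gb * 1024**3},
        "os_type": "vmware",
    })
    # 2. Register the host (initiator group).
    requests.post(f"{ARRAY}/api/protocols/san/igroups", auth=AUTH, verify=False, json={
        "name": f"{name}_hosts",
        "os_type": "vmware",
        "initiators": [{"name": host_iqn}],
    })
    # 3. Present the LUN to that host.
    requests.post(f"{ARRAY}/api/protocols/san/lun-maps", auth=AUTH, verify=False, json={
        "lun": {"name": f"/vol/{name}_vol/{name}"},
        "igroup": {"name": f"{name}_hosts"},
    })

provision_lun("esx_datastore01", size_gb=500,
              host_iqn="iqn.1998-01.com.vmware:esx01")
```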

NetApp’s ASA family of block storage arrays is a great product line, but not just because of performance for performance’s sake, said Ned Engelke, CTO at Evotek, a San Diego-based solution provider and NetApp channel partner.

“When we engage with our NetApp customer environments that have any kind of scale and any kind of age on them, we often see opportunities to take a fresh look at how to architect it,” Engelke told CRN. “You know how tempting it was to create tons of silos, especially 20 years ago or so, when you had to really focus on performance. Well, some of that architecture has carried forward until today. So when we can find opportunities to rearchitect, to get customers to use just the right amount of the performance capability of these things, that affords them a better economic outcome.”

NetApp is also expanding its secondary storage capabilities for data backups and archiving with two new models in its FAS hybrid flash unified storage line, the FAS70 and FAS90, which Singh said have three primary customer use cases.

“The first is about data tiering to lower the cost of data over the data life cycle,” he said. “We’re finding 60 percent or more of the data in customers’ environments may be cold data. We have built-in automated and granular tiering so customers can seamlessly lower the cost of data. Second, it can be used as a backup target environment for fast systems and yet provide the performance for recovery workflows. And third, it provides ransomware protection.”
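
As a rough illustration of the economics behind that tiering pitch, the back-of-the-envelope sketch below applies the 60 percent cold-data figure to a sample data set. The dataset size and per-gigabyte prices are made up for illustration and are not NetApp or cloud list pricing.

```python
# Back-of-the-envelope illustration of tiering savings, assuming 60% of data
# is cold (per the article) and hypothetical per-GB monthly prices.
TOTAL_TB = 500                      # example dataset size
COLD_FRACTION = 0.60                # "60 percent or more ... may be cold data"
HOT_PRICE_GB = 0.10                 # hypothetical $/GB/month, performance tier
COLD_PRICE_GB = 0.02                # hypothetical $/GB/month, capacity tier

total_gb = TOTAL_TB * 1024
without_tiering = total_gb * HOT_PRICE_GB
with_tiering = (total_gb * (1 - COLD_FRACTION) * HOT_PRICE_GB
                + total_gb * COLD_FRACTION * COLD_PRICE_GB)

print(f"Without tiering: ${without_tiering:,.0f}/month")
print(f"With tiering:    ${with_tiering:,.0f}/month "
      f"({(1 - with_tiering / without_tiering):.0%} lower)")
```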

When it comes to cyber resiliency, NetApp has been very focused on providing integrated ransomware detection directly within its storage for the last several years, said Jeff Baxter, vice president of product marketing for the company.

For the past six months or so, NetApp has been previewing enhanced autonomous ransomware protection that uses AI and machine learning models to create the next generation of ransomware detection built directly into enterprise storage, Baxter told CRN.

“At NetApp Insight, we’re now making this technology generally available,” he said. “It’s been testing out very well in tech previews with customers, so we’ll be releasing this as a GA offering to customers. It will provide file-based, real-time, AI-powered ransomware detection. This will allow customers to auto-update it without Ontap operating system updates. It will just basically be constantly kept up to date. We’re constantly getting feeds of malware from different industry sources, training our machine learning model, and allowing customers to update that ML model to fight ransomware in real time.”
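
NetApp has not published the internals of its detection models, but the general class of technique that storage-side ransomware detectors build on can be sketched in a few lines: encrypted data looks statistically random, so a burst of high-entropy writes across many files is a warning sign. The example below is purely illustrative and is not NetApp’s autonomous ransomware protection.

```python
# Illustrative only: a toy file-entropy check of the kind ransomware detectors
# build on. Encrypted or compressed data approaches 8 bits of entropy per
# byte, so a sudden spike in high-entropy writes across many files is a
# warning sign. This is NOT NetApp's implementation.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte (low for plain text, ~8.0 for random/encrypted)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    return shannon_entropy(data) > threshold

# Example: repetitive plain text scores low, pseudo-random bytes score near 8.
print(looks_encrypted(b"quarterly sales figures, Q3 FY24..." * 100))  # False
print(looks_encrypted(os.urandom(4096)))                              # True
```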

Customers for now will have the option to accept the updates, but NetApp will be exploring whether to apply them automatically, Baxter said.

“We have an integrated ransomware protection dashboard in BlueXP, which is our unified hybrid control plane,” he said. “We can now take actual classification information about data that we’re able to gather to determine if data has PII [personally identifiable information] concerns, such as Social Security numbers, credit card numbers, things like that, and expose that through BlueXP as well as through services like ransomware protection service. Customers can both see the risks to their data sets and which data sets would cause the most issues if they were not protected properly.”

That ransomware protection also includes integration with SIEM (security information and event management) tools from such sources as Amazon Web Services Security Hub and Splunk via BlueXP to alert security administrators to potential issues and, via BlueXP, start follow-ups such as restoring impacted data, Baxter said.
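
As a sketch of what such a SIEM hand-off can look like, the example below posts a detection event to a Splunk HTTP Event Collector endpoint. The host, token, sourcetype and event fields shown are hypothetical; in practice, BlueXP’s SIEM integration handles this wiring for the customer.

```python
# Hypothetical sketch: forwarding a ransomware-detection alert to Splunk via
# the HTTP Event Collector (HEC). Host, token and event fields are made up;
# the BlueXP integration described above emits these events on its own.
import requests

SPLUNK_HEC = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"   # placeholder token

alert = {
    "sourcetype": "netapp:ransomware:alert",          # hypothetical sourcetype
    "event": {
        "severity": "high",
        "volume": "svm1:/finance_data",
        "detail": "Burst of high-entropy writes detected; snapshot taken",
    },
}

resp = requests.post(SPLUNK_HEC,
                     headers={"Authorization": f"Splunk {HEC_TOKEN}"},
                     json=alert, verify=False, timeout=10)
resp.raise_for_status()
```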

When Evotek sells NetApp technology to help a customer with ransomware protection, it includes all the vendor’s capabilities in a programmatic discussion, focusing not on the efficacy of any individual feature but on the larger conversation, Engelke said.

“What I love about NetApp is that they wrote their own file system, and they have the ability to deal with the metadata that comes out of it,” he said. “They use that metadata for ransomware protection. It’s not something we lead with because it’s like building a race car. Here’s a tire pressure gauge. Well, cool, that helps. I can’t do without it. But maybe there’s more, right?”

On the cloud storage front, NetApp already has first-party native cloud storage partnerships with Amazon Web Services, Microsoft Azure and Google Cloud, Baxter said.

With Google Cloud NetApp Volumes, the company is now making its Flex service level, its entry-level tier that scales down to small workloads on small storage pools, available across all 40 Google Cloud regions, he said.

“We’re also making it so that our premium and extreme service levels let customers scale up to a petabyte in capacity and up to 12.5 GB per second of throughput for extremely high performance on Google Cloud NetApp Volumes,” he said. “We’re also introducing that auto tiering capability we have with Azure on the Google Cloud NetApp Volumes for the premium and extreme service tiers there as well. We are doing some things in BlueXP specific to fleet management, with the ability to roll out end-to-end Ontap updates across your fleet.”

While NetApp remains a good VMware partner, it has customers looking for alternatives and will help them refactor their virtual machines to run directly within the cloud, Baxter said.

“We’ll be rolling out a migration assistant tool to help migrate from VMware to AWS EC2 running on top of NetApp or Amazon FSx for NetApp Ontap,” he said.

To help businesses prepare their data for use with AI and intelligent data services, NetApp will federate their information into a single global metadata namespace that doesn’t require them to deploy new software or catalog their data separately, Baxter said.

“We track all the metadata about customers’ data in their NetApp systems already, so we can stitch it together to provide a global metadata namespace for them they can use to explore the value in their data,” he said. “And we’ll be introducing things like a Data Explorer within BlueXP that will allow them, assuming they have the right access permissions, to ask questions of their data and conduct natural language searches to find data.”

Testing of NetApp Ontap for SuperPOD is currently underway with Nvidia, Baxter said.

“Today, we have SuperPOD configurations with our E-series for high-performance computing, and we have a BasePOD certification for our NetApp Ontap software,” he said. “A lot of customers love the E-series solution for high-performance computing but want a solution that provides the data management features of Ontap with the high performance necessary for SuperPOD.”