
NetApp's evolving AI GTM

Hoseb Dermanilian

We're entering another exciting fiscal year at NetApp, and I'm thrilled to continue our AI journey with renewed focus and responsibility. It's hard to believe that it's been six years since we embarked on this path, and I can still vividly recall our first AI customer, a large healthcare research center in the United Kingdom. Back then, we connected NetApp AFF systems running NetApp® ONTAP® software to NVIDIA DGX-1. Fast-forward six years, and Jensen Huang's mention of our partnership at NVIDIA GTC 2024 was a powerful testament to the progress we've made together.

Jensen Huang at NVIDIA GTC 2024

During our recent NetApp INSIGHT® event, we proudly showcased how Lockheed Martin built their AI Center of Excellence using NetApp technology and NVIDIA BasePOD. The question arises: Why did they choose NetApp? The answer is simple: Data is invaluable, and managing it requires enterprise-level capabilities. More important, you need a trusted partner to handle your data. Period. 

Announcing the AI Sales Specialist team

As the AI market evolves at lightning speed, we want to ensure that our go-to-market strategy aligns with NetApp's offerings and core value proposition. We aim to remain our customers' and partners' trusted advisor in this space, guiding them and actively listening to their needs and aspirations for our mutual businesses. With this goal in mind, we are expanding our Global AI Sales Specialist team. This team will work very closely with our esteemed partners and customers to position NetApp's best-in-class AI solutions, partnering with NVIDIA and other key industry players in the market.

Evolving our AI go-to-market strategy

In the era of predictive AI, our primary focus was model-training workloads, leveraging ONTAP-certified architectures with NVIDIA. However, with the rise of GenAI, we've seen the market shift toward customers building AI Centers of Excellence to meet diverse workload requirements. These requirements include training their own models, fine-tuning pretrained models, using techniques like retrieval-augmented generation (RAG) to make models more relevant to their businesses, and running inference on existing data or generating new data with GenAI models.
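
To make the RAG pattern mentioned above concrete, here is a minimal sketch of the retrieve-then-augment flow. The embed() function, the sample documents, and the final prompt handling are illustrative assumptions, not part of any NetApp or NVIDIA offering; a production pipeline would swap in a real embedding model, a vector store over data managed in ONTAP or the cloud, and an LLM endpoint.

```python
# Minimal RAG sketch: index proprietary documents, retrieve the most relevant
# ones for a question, and augment the prompt before calling a model.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding" (assumption); a real pipeline would use a
    # trained embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# 1. Index proprietary documents (in practice, data living on premises or in the cloud).
documents = [
    "Quarterly maintenance window for cluster A is the first Sunday of each month.",
    "Support tickets are triaged within four business hours.",
]
index = [(doc, embed(doc)) for doc in documents]

# 2. Retrieve the document most relevant to the user's question.
question = "When is the maintenance window for cluster A?"
q_vec = embed(question)
top = sorted(index, key=lambda d: cosine(q_vec, d[1]), reverse=True)[:1]

# 3. Augment the prompt with the retrieved context before calling the model.
prompt = "Answer using only this context:\n" + "\n".join(d[0] for d in top) + "\n\nQuestion: " + question
print(prompt)  # A real pipeline would send this prompt to a pretrained LLM.
```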

The key factor here is the enterprise-level maturity of the data management system. Because these workloads involve proprietary data, whether on premises or in the cloud at a large scale, customers prioritize security, performance, enterprise-level data management tools, and data mobility above all else. Thanks to NetApp ONTAP and our investments with AWS, Google Cloud, and Microsoft Azure, we have the strongest market positioning to help customers build these AI Centers of Excellence.  

Our solutions provide end-to-end support for the AI workflow within a single platform that's high performing, secure, efficient, and cloud connected. Our BasePOD reference architecture, developed in collaboration with NVIDIA, along with our first-party cloud offerings, has been at the forefront of our customer engagements.

Additionally, we’ve observed a need among customers for ultra-high-performance clusters to build and train foundational models. For these customers, performance is the most critical factor when they’re selecting AI infrastructures. To address this need, we’ve deployed multiple SuperPOD reference architectures with NVIDIA, NetApp E-Series systems, and the BeeGFS parallel file system, supporting InfiniBand networks at multiple customer sites. These architectures have been instrumental in supporting AI applications in industries such as oil & gas and genomics, and in various other use cases. 

Industry partnerships and our commitment

Furthermore, our industry partnerships and technology validations with NVIDIA, Domino Data Lab, Run:ai (acquired by NVIDIA), Vertex AI, Amazon SageMaker, Amazon Bedrock, and Microsoft Copilot empower our customers and partners to make informed decisions about their AI stacks, eliminating the need to navigate complex integration points. 

Finally, we remain closely attuned to how the market is evolving, engaging in ongoing discussions with our trusted enterprise customers and partners to ensure that our product roadmaps align with their AI data needs. By staying in sync with their requirements, we can continue to deliver solutions that propel their AI initiatives forward.  

For more information about NetApp AI solutions, visit the NetApp AI solutions page.

Hoseb Dermanilian

Hoseb joined NetApp in 2014. In his current role, he manages and develops NetApp's AI and digital transformation business globally. His focus is articulating NetApp's value-add in the AI and digital transformation space and helping customers build the right platform for their data-driven business strategies. As part of this business development work, Hoseb also grows NetApp's AI channel business by recruiting and enabling the right AI ecosystem partners and building go-to-market strategies with them. Hoseb comes from a technical background: in his previous role, he was the Consulting Systems Engineer for NetApp's video surveillance and big data analytics solutions. He holds a master's degree with distinction in Electrical and Computer Engineering from the American University of Beirut and has multiple globally recognized conference and journal publications in the field of IP security and cryptography.

