
Who cares about storage?

Chris “Gonzo” Gondek

As more workloads move toward infrastructure and platform as a service, and more applications are developed to leverage infrastructure as code, you may wonder, “Who cares about storage hardware or infrastructure anymore?” The reality is that most folks overlook storage infrastructure when thinking about their future in the cloud. That is, until they realize just how important storage is to everything digital, regardless of the environment. We quickly forget that all databases, applications, and workloads are made up of, and generate, data. And you can’t have data without storage, even in the cloud.

The storage infrastructure (or increasingly, the storage service that provides the data services) lays the foundation for multiple outcomes for your workloads and data. The storage infrastructure dictates what features and functions you get, how much performance you can achieve, how much resilience and durability are possible, how many efficiencies you can realize, how far you can scale with flexibility… and more.

Unfortunately, making the wrong decision about storage early on has significant ramifications, typically noticed only when it’s too late to make cost-effective changes. You may encounter scenarios like not being able to scale, requiring “forklift upgrades”; not having multiple connection mechanisms and protocols, requiring additional and different solutions with their own administrative overheads and costs; or not being able to meet performance, availability, and recoverability SLAs. And increasingly, not having a realistic cloud option.

It's also extremely important to have a flexible consumption cost model so that you don’t get locked in commercially and have the freedom to allocate your data resources across a diverse ecosystem of multiple infrastructure environments.

I want to point out what your “best foot forward” looks like when adopting any ecosystem where you need the best outcomes for your most mission-critical (and effectively infrastructure-agnostic) asset: your data.

In a previous blog post about the concept of data immortality, I described how we at NetApp are working toward that goal through omnipresent, multiprotocol, secure, efficient, immutable, resilient, and scalable data management, known as NetApp® ONTAP®.

Having these capabilities built into ONTAP has significant knock-on effects and gives you that best foot forward down the line, starting with the humble beginnings of provisioning some data storage for that first workload. These outcomes are best understood as the sum of their parts, which typically appear in the following sequence over the lifecycle of a workload’s data needs.

Omnipresence versus heterogeneity

We start with “omnipresence,” which is very different from a “heterogeneous” storage service. Heterogeneity means that a service can work in multiple environments. Omnipresence goes further: ONTAP is not merely deployable in multiple environments, it is available as a first-party service of the hyperscalers themselves (not in the marketplace, but in the cloud portal or console, just like all the other “cloud-native” services in that hyperscaler). This is particularly important for those who are looking to adopt VMware-in-cloud solutions. At the time of writing, the only independently scalable storage service for VMware in the cloud is NetApp NFS datastore support, provided by the ONTAP based first-party services just mentioned. This also helps to reduce risk by ensuring freedom of choice and no lock-in.

Multiprotocol support ensures a wide coverage of data services to multiple systems and use cases, from a smaller, leaner footprint, which means future-proofing as well as versatility and freedom of choice for application developers and users.

Security has arguably become the most important factor in technology decisions and adoption. ONTAP has built-in ransomware detection and prevention, protecting against cyberattacks in real time. And the highly efficient, immutable NetApp Snapshot™ technology provides a logical air gap for recovery assurance as well (yes, in the cloud too).

All of these benefits come without compromising performance. In fact, applying efficiencies like compaction, compression, deduplication, thin provisioning, and tiering creates a smaller footprint that is more cost effective (consumes less physical disk surface area, which is what cloud bills you for consuming) and is also inherently faster performing. These efficiencies are maintained when creating Snapshot copies or replicating between environments for migration or disaster recovery.
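To make the cost effect of those efficiencies concrete, here is a rough back-of-the-envelope sketch. The ratios below (1.5:1 deduplication, 2:1 compression, 80% of provisioned capacity actually written) are illustrative assumptions, not ONTAP guarantees; real results depend on the data.

```python
# Hypothetical illustration of how storage efficiencies compound into a
# smaller physical (billable) footprint. All ratios are assumptions.
logical_tb = 100.0          # data as applications see it
dedup_ratio = 1.5           # assumed 1.5:1 deduplication
compression_ratio = 2.0     # assumed 2:1 compression
thin_provision_used = 0.8   # assume only 80% of provisioned space is written

physical_tb = logical_tb * thin_provision_used / (dedup_ratio * compression_ratio)
savings_pct = (1 - physical_tb / logical_tb) * 100

print(f"Logical data: {logical_tb:.0f} TB")
print(f"Physical footprint: {physical_tb:.1f} TB")
print(f"Reduction: {savings_pct:.0f}%")
```

Under these assumptions, 100 TB of logical data lands on well under a third of that in physical capacity, which is the number the cloud bills you for.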

And for a flexible consumption cost model across all of the above, the NetApp Keystone® Flex subscription facilitates spend reallocation through a digital wallet, letting you scale and distribute your consumption on your terms: on premises, in the cloud, and across data services, all through the same single user experience.

Put your best foot forward from day one

So if you’re considering:

  • VMware in the cloud. Remember that storage scales radically differently than compute and memory.
  • Deploying in Kubernetes. Persistent data (the residual asset) still requires storage.
  • Operating in a hybrid model. This typically requires omnipresence of storage.
  • Migrating to the cloud. Most of the activity is data movement from storage to storage.
  • Backup and restore modernization. Backup or secondary storage typically constitutes about 66% of all storage (for every front-end terabyte, there are typically 2+ back-end terabytes).

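That last ratio is easy to sanity-check. Taking the stated assumption of 2+ back-end terabytes for every front-end terabyte:

```python
# Worked check of the ~66% figure: assume 2 back-end (backup/secondary)
# terabytes for every 1 front-end terabyte, per the rule of thumb above.
front_end_tb = 1.0
back_end_tb = 2.0
total_tb = front_end_tb + back_end_tb

backup_share = back_end_tb / total_tb  # fraction of all storage that is backup
print(f"Backup share of all storage: {backup_share:.0%}")
```

So backup and secondary copies account for roughly two-thirds of the total footprint, which is why modernizing that tier moves the cost needle so much.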
Put your best foot forward today, and care about your storage, because that’s where your data lives! And if you do it right, you can start realizing all the benefits from day one, and continue to leverage the combination of capabilities to get more out of your data, anywhere you want, for less cost and with less risk.

To learn more about the importance of the right storage in the cloud, check out my Newsworthy Minute episode with VMware on how NetApp is helping VMware Cloud customers with their storage needs.

Chris “Gonzo” Gondek

Data Driven Technology Evangelist, NetApp ANZ

Techie with Table Manners

My mission is to enable data champions everywhere. I have always been passionate about technology, with a career spanning over two decades specializing in data and information management, storage, and high-availability and disaster recovery solutions, including virtualization and cloud computing.

I have a long history with data solutions, having gained global experience in the United Kingdom and Australia, where I was involved in creating technology and business solutions for some of Europe’s and APAC’s largest and most complex IT environments.

An industry thought leader and passionate technology evangelist, I blog frequently about all things data and am active in the technology community, speaking at high-profile events such as Gartner Symposium, IDC events, AWS Summits, and Microsoft Ignite, to name a few. I translate business value from technology and demystify complex concepts into easy-to-consume solutions. A proven, highly skilled, and internationally experienced sales engineer and enterprise architect who has worked for leading technology vendors, I have developed skills across almost all enterprise platforms, operating systems, databases, and applications.
