Data protection has long been considered a “necessary evil” of IT investment because it serves a single purpose: protecting your applications and data from imminent threats. The rapid evolution of application development, big data analytics, the Internet of Things (IoT), virtualization, containerization, and related technologies has further increased the complexity of the primary production environment. This evolution also drives additional spending on secondary infrastructure toward the same fundamental goal: ensuring the reliability and availability of your business.
This paradigm resembles an endless “chicken and egg” loop, with double, if not triple, the IT spending rapidly increasing and no real solution in sight. Breaking it requires fundamental change, not just in data protection practices and technologies, but in the overall philosophy of IT evolution and strategy.
To break this dependency and reverse the trend, NetApp introduced the concept of the Data Fabric. A data fabric is an architecture and a set of data services that provide consistent capabilities across your choice of endpoints spanning on-premises and multiple cloud environments. This concept has enabled users to reimagine data protection services as a native extension of the primary infrastructure. You can now extend use cases beyond traditional data protection to multicloud and hybrid cloud data availability: newly evolving use cases that enable customers to further leverage their data protection investment.
Alex Goldblatt is a Sr. Product Manager with the NetApp Data Protection Partner Product Management Team. Alex has over 20 years of experience in the IT infrastructure industry, focused on hardware and software strategy and product development. Prior to joining NetApp, Alex worked for a variety of startups and large enterprise companies, including BMC Software, Dell EMC, and Veritas.