Progressive organizations want to be data driven. To fuel their insights and decision making, they need to rapidly locate, access, and process their data. Becoming data driven is a dynamic challenge, and it grows more difficult as an organization evolves.
Data growth remains the top storage challenge for organizations, and the problem will only get worse: many organizations are seeing their data under management grow by nearly 30% every 12 months. Rapid data growth is common among progressive organizations, which are constantly creating new applications and business models to expand their business. However, uncontrolled data growth can derail transformation efforts and make data management and data protection more difficult.
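That rate compounds quickly. As a hypothetical illustration, an organization managing 1PB today would be managing roughly 1.3PB after one year and about 3.7PB after five years (1.3^5 ≈ 3.7), before accounting for any new initiatives.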
Workload portability is a key enabler for data-driven organizations because it allows them to run a job in whichever execution venue has available resources, whether that is one of several public clouds or an on-premises data center. Data movement should therefore be a key consideration for any organization looking to leverage multicloud or hybrid cloud environments.
Many organizations struggle to move data from their on-premises environments to public cloud storage, especially for large unstructured workloads such as video files, images, artificial intelligence and machine learning data, and even large volumes of standard productivity documents. In our recent study, Voice of the Enterprise: Storage, Data Management and Disaster Recovery 2022, 48% of respondents were frequently (continually or daily) migrating data between public cloud storage and on-premises environments, which shows that migration is not an episodic issue limited to major events such as data center closures and infrastructure upgrades.
With datasets getting larger, more organizations are looking to provider-enabled physical transports to move data to a new site quickly; in our study, the use of physical transports increased from 18% of migrations two years ago to 34% today. Although network-based migration is still important, organizations are wary of starving production applications of precious bandwidth while moving large datasets. And because most organizations cannot tolerate significant downtime caused by a data migration, it's not surprising that more of them are looking to improve their data management operations.
To keep pace with these requirements, organizations must learn how to leverage cloud storage services, and they must optimize their consumption to prevent costs from spiraling out of control. Third-party cloud cost-optimization tools can help with critical data and workload placement decisions. Once a latency-critical workload is complete, its data should be moved to low-cost object storage, reducing costs and freeing up high-performance resources for other workloads. Organizations should also use monitoring and management tools to track the changing needs of their workloads.
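In practice, this kind of tiering can often be automated with a cloud provider's native lifecycle policies rather than handled manually. The sketch below is a minimal, hypothetical example using AWS S3 and the boto3 SDK; the bucket name, prefix, and tiering windows are illustrative assumptions rather than recommendations, and other providers offer equivalent mechanisms.

```python
import boto3

# Hypothetical scenario: completed workload output lands under the
# "results/" prefix of a bucket named "analytics-output".
s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="analytics-output",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-completed-results",
                "Filter": {"Prefix": "results/"},  # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [
                    # Move objects to infrequent-access storage after 30 days...
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # ...and to archival storage after 90 days.
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```

With a rule like this in place, data from completed workloads ages out of higher-cost storage tiers automatically instead of depending on manual cleanup.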
For more on this subject, download Optimized Cloud Storage Economics.
Learn more about NetApp® multicloud storage.
Henry Baltazar is a Research Director for the storage practice at 451 Research, a part of S&P Global Market Intelligence. Henry returned to 451 Research after spending nearly three years at Forrester Research as a senior analyst serving Infrastructure & Operations Professionals and advising Forrester clients on data center infrastructure technologies. Henry has evaluated and tested storage hardware and software offerings for more than 15 years as an industry analyst and as a journalist.