NetApp CSO 2019 Perspectives

Atish Gude

As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, AI is at the heart of trends in the development, data management, and delivery of applications and services at the edge, core, and cloud. Also essential are containerization as a critical enabling technology and the increasing intelligence of IoT devices at the edge. Navigating the tempests of transformation are developers, whose requirements are driving the rapid creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage.

1) AI projects must prove themselves first in the clouds

Still at an early stage of development, AI technologies will see action in an explosion of new projects, the majority of which will begin in public clouds.

A rapidly growing body of AI software and service tools – mostly in the cloud – will make early AI development, experimentation, and testing progressively easier. AI applications will need to deliver high performance and scalability both on and off premises, and to support multiple data access protocols and a variety of new data formats. Accordingly, the infrastructure supporting AI workloads will also have to be fast, resilient, and automated, and it must support the movement of workloads within and among multiple clouds as well as on and off premises. As AI becomes the next battleground for infrastructure vendors, most new development will use the cloud as a proving ground.

2) IoT: Don’t phone home. Figure it out.

Edge devices will get smarter and more capable of making processing and application decisions in real time.

Traditional Internet of Things (IoT) devices have been built around an inherent “phone home” paradigm: collect data, send it for processing, wait for instructions. But even with the advent of 5G networks, real-time decisions can’t wait for data to make the round trip to a cloud or data center and back, and data volumes keep growing. As a result, data processing will have to happen close to the consumer, which will intensify the demand for more processing capability at the edge. IoT devices and applications – with built-in services such as data analysis and data reduction – will get better, faster, and smarter about deciding which data requires immediate action, which data gets sent home to the core or to the cloud, and which data can simply be discarded.
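To make that edge-side triage concrete, here’s a minimal sketch of the decision an IoT device might make locally, with no round trip to the cloud. Everything here is hypothetical for illustration – the reading type, the thresholds, and the routing categories – not a description of any particular product.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Route(Enum):
    ACT_LOCALLY = auto()    # needs a real-time response on the device itself
    SEND_TO_CORE = auto()   # worth forwarding to the core or cloud for analysis
    DISCARD = auto()        # routine noise, not worth the bandwidth

@dataclass
class SensorReading:
    sensor_id: str
    value: float

# Hypothetical thresholds, chosen only to illustrate the decision logic.
ALERT_THRESHOLD = 90.0     # above this, act immediately at the edge
INTEREST_THRESHOLD = 70.0  # above this, send home for deeper processing

def triage(reading: SensorReading) -> Route:
    """Decide at the edge what to do with a reading."""
    if reading.value >= ALERT_THRESHOLD:
        return Route.ACT_LOCALLY
    if reading.value >= INTEREST_THRESHOLD:
        return Route.SEND_TO_CORE
    return Route.DISCARD

if __name__ == "__main__":
    for r in (SensorReading("pump-7", 95.2),
              SensorReading("pump-7", 74.0),
              SensorReading("pump-7", 21.3)):
        print(r.sensor_id, r.value, "->", triage(r).name)
```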

3) Automagically, please

The demand for highly simplified IT services will drive continued abstraction of IT resources and the commoditization of data services.

Remember when car ads began boasting that your first tune-up would be at 100,000 miles? (Well, it eventually became sort of true.) The point is, hardly anyone spends weekends changing their own oil or spark plugs or adjusting timing belts anymore. You turn on the car, it runs. You don’t have to think about it until you get a message saying something needs attention. Pretty simple. The same expectations are developing for IT infrastructure, starting with storage and data management: developers and practitioners don’t want to think about it, they just want it to work. “Automagically,” please.

Especially with containerization and serverless technologies, the trend toward abstraction of individual systems and services will drive IT architects to design for data and data processing, and to build hybrid, multi-cloud data fabrics rather than just data centers. With the application of predictive technologies and diagnostics, decision makers will rely more and more on extremely robust yet “invisible” data services that deliver data when and where it’s needed, wherever it lives. These new capabilities will also automate the brokerage of infrastructure services as dynamic commodities, shuttling containers and workloads to and from the most efficient service provider for the job.
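As a toy illustration of that brokerage idea, here’s a sketch of a placement decision that sends a workload to the cheapest provider still satisfying policy (say, data residency or latency). The provider names, prices, and the single `meets_policy` flag are all invented for the example; a real data fabric would evaluate far richer constraints.

```python
from dataclasses import dataclass

@dataclass
class ProviderQuote:
    name: str
    cost_per_hour: float  # hypothetical price for this workload's profile
    meets_policy: bool    # e.g. satisfies data-residency and latency rules

def place_workload(quotes: list[ProviderQuote]) -> ProviderQuote:
    """Broker the workload to the cheapest provider that satisfies policy."""
    eligible = [q for q in quotes if q.meets_policy]
    if not eligible:
        raise RuntimeError("no provider satisfies the placement policy")
    return min(eligible, key=lambda q: q.cost_per_hour)

quotes = [
    ProviderQuote("cloud-a", 0.42, meets_policy=True),
    ProviderQuote("cloud-b", 0.31, meets_policy=False),  # cheapest, but fails policy
    ProviderQuote("on-prem", 0.38, meets_policy=True),
]
print(place_workload(quotes).name)  # -> on-prem
```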

4) Building for multi-cloud will be a choice (and you know what choices come with…)

Hybrid, multi-cloud will be the default IT architecture for most larger organizations, while others will choose the simplicity and consistency of a single cloud provider.

Containers will make workloads extremely portable. But data itself can be far less portable than compute and application resources, and that affects the portability of runtime environments. Even if you solve for data gravity, data consistency, data protection, data security, and all the rest, you can still face platform lock-in: the cloud provider-specific services you’re writing against are not portable across clouds at all. As a result, smaller organizations will either develop in-house capabilities as an alternative to cloud service providers, or they’ll choose the simplicity, optimization, and hands-off management that come from buying into a single cloud provider. And you can count on service providers to develop new differentiators to reward those who choose lock-in. Larger organizations, on the other hand, will demand the flexibility, neutrality, and cost-effectiveness of being able to move applications between clouds. They’ll leverage containers and data fabrics to break lock-in, to ensure total portability, and to control their own destiny. Whatever path they choose, organizations of all sizes will need to develop policies and practices to get the most out of their choice.
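One widely used pattern for breaking that lock-in is to keep provider-specific calls behind a neutral interface, so that moving clouds means swapping one adapter rather than rewriting every call site. This is a minimal sketch of the idea with a stand-in in-memory backend; the interface and names are hypothetical, and a real deployment would wrap each provider’s SDK in its own adapter.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Provider-neutral interface the application codes against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; real adapters would wrap each cloud provider's SDK."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

def archive_report(store: ObjectStore, report: bytes) -> None:
    # Application logic sees only the neutral interface, so changing
    # providers means changing one adapter, not every call site.
    store.put("reports/latest", report)

store = InMemoryStore()
archive_report(store, b"q1 numbers")
print(store.get("reports/latest"))
```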

5) The container promise: really cool new stuff

Container-based cloud orchestration will enable true hybrid cloud application development.

Containers promise, among other things, freedom from vendor lock-in. While containerization technologies like Docker will continue to have relevance, the de facto standard for multi-cloud application development (at the risk of stating the obvious) will be Kubernetes. But here’s the cool stuff… New container-based cloud orchestration technologies will enable true hybrid cloud application development: new development will produce applications that serve both public cloud and on-premises use cases, with no more porting applications back and forth. This will make it progressively easier to move workloads to where data is being generated, rather than moving data to the workloads, as has traditionally been the case.
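To see why a single artifact can serve both environments, consider a minimal Kubernetes Deployment expressed as plain data. Kubernetes accepts JSON as well as YAML, so the same manifest applies unchanged to any conformant cluster, public or on-premises; the image and context names below are hypothetical.

```python
import json

# A minimal Deployment manifest expressed as a Python dict.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "demo-app"},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "demo-app"}},
        "template": {
            "metadata": {"labels": {"app": "demo-app"}},
            "spec": {
                "containers": [{
                    "name": "demo-app",
                    "image": "registry.example.com/demo-app:1.0",  # hypothetical image
                }]
            },
        },
    },
}

with open("deployment.json", "w") as f:
    json.dump(deployment, f, indent=2)

# The identical manifest deploys to either environment just by
# switching the kubeconfig context (context names are hypothetical):
#   kubectl --context public-cloud apply -f deployment.json
#   kubectl --context on-prem      apply -f deployment.json
```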

Atish Gude

Atish Gude is senior vice president and Chief Strategy Officer (CSO), leading NetApp’s Corporate Strategy Office and responsible for strategy development and implementation as well as corporate development. His office plays a crucial role in identifying and harnessing new trends and disruptive technologies to accelerate market momentum for NetApp. Before joining NetApp, Atish served as senior vice president of corporate strategy at Verizon Communications, where he led development of the company’s corporate strategy and the evaluation of technologies, trends, and competitive dynamics across strategic initiatives. Prior to that, he held senior leadership roles in marketing, strategy, corporate development, and operations at Verisign, Clearwire Communications, and Sprint Nextel. Atish earned a Master of Business Administration from the University of Chicago and a Bachelor of Science in computer engineering from Syracuse University.
