Containers are a form of operating system virtualization. A single container might be used to run anything from a small microservice or software process to a larger application. Inside a container are all the necessary executables, binary code, libraries, and configuration files. Compared to server or machine virtualization approaches, however, containers do not contain operating system images. This makes them more lightweight and portable, with significantly less overhead. In larger application deployments, multiple containers may be deployed as one or more container clusters. Such clusters might be managed by a container orchestrator such as Kubernetes.
Containers are a streamlined way to build, test, deploy, and redeploy applications across multiple environments, from a developer’s local laptop to an on-premises data center and even the cloud. Benefits of containers include lower overhead, greater portability, and more consistent operation across development, test, and production environments.
Organizations commonly use containers to migrate existing applications into modern environments, to refactor applications into microservices, and to support DevOps practices such as continuous integration and continuous delivery (CI/CD).
Users involved in container environments are likely to hear about two popular tools and platforms used to build and manage containers. These are Docker and Kubernetes.
Docker is a popular runtime environment used to build and run software inside containers. It uses Docker images (copy-on-write snapshots) to deploy containerized applications or software in multiple environments, from development to test and production. Docker was built on open standards and functions inside most common operating environments, including Linux, Microsoft Windows, and other on-premises or cloud-based infrastructures.
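A Docker image is typically defined in a Dockerfile that lists the base image, dependencies, and startup command. The sketch below is illustrative only; the base image, file names, and port are hypothetical and would vary with the application being containerized:

```dockerfile
# Hypothetical Node.js microservice; names and versions are illustrative.
FROM node:20-alpine          # base image providing the runtime the app needs
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev        # install only production dependencies
COPY . .
EXPOSE 8080                  # port the service listens on
CMD ["node", "server.js"]
```

From a directory containing this Dockerfile, `docker build -t hello-web:1.0 .` would produce the image, and `docker run -p 8080:8080 hello-web:1.0` would start a container from it.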
Containerized applications can get complicated, however. In production, a large application might require hundreds or even thousands of separate containers. This is where container runtime environments such as Docker benefit from additional tools that orchestrate and manage all the containers in operation.
One of the most popular tools for this purpose is Kubernetes, a container orchestrator that recognizes multiple container runtime environments, including Docker.
Kubernetes orchestrates the operation of multiple containers so they work in harmony. It manages the underlying infrastructure resources for containerized applications, such as the amount of compute, network, and storage they require. Orchestration tools like Kubernetes make it easier to automate and scale container-based workloads in live production environments.
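In Kubernetes, this orchestration is typically expressed declaratively. The manifest below is a minimal sketch of a Deployment that runs three replicas of a hypothetical container image and declares the compute and memory resources each one requests; the image name, labels, and resource figures are assumptions for illustration:

```yaml
# Illustrative Kubernetes Deployment; image and resource values are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-web
spec:
  replicas: 3                      # Kubernetes keeps three containers running
  selector:
    matchLabels:
      app: hello-web
  template:
    metadata:
      labels:
        app: hello-web
    spec:
      containers:
        - name: hello-web
          image: hello-web:1.0     # hypothetical image built earlier with Docker
          ports:
            - containerPort: 8080
          resources:
            requests:              # resources the scheduler reserves
              cpu: 100m
              memory: 128Mi
            limits:                # ceilings the container may not exceed
              cpu: 250m
              memory: 256Mi
```

Applying this manifest with `kubectl apply -f` tells Kubernetes the desired state; the orchestrator then schedules, restarts, and scales the containers to match it.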
People sometimes confuse container technology with virtual machines (VMs) or server virtualization technology. Although there are some basic similarities, containers are very different from VMs.
Virtual machines run in a hypervisor environment where each virtual machine includes its own guest operating system, along with its related binaries, libraries, and application files. This consumes a large amount of system resources and overhead, especially when multiple VMs run on the same physical server, each with its own guest OS.
In contrast, each container shares the host OS or system kernel and is much lighter, often only megabytes in size. As a result, a container often takes just seconds to start, versus the gigabytes of storage and minutes of boot time required for a typical VM.
At NetApp, we believe in container technology and are working on proven tools and innovations that deliver and manage persistent storage for any application, in any location. One key example of this work is the development of Trident. Trident makes it easier than ever for containerized applications to consume persistent storage on demand.
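In Kubernetes, an application typically requests persistent storage through a PersistentVolumeClaim, and Trident can dynamically provision the backing volume. The fragment below is a sketch under assumptions: the claim name, size, and the Trident-backed StorageClass name (`ontap-nas`) are hypothetical and depend on how the cluster is configured:

```yaml
# Illustrative PersistentVolumeClaim; the StorageClass name is hypothetical
# and would correspond to a backend configured in Trident.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
    - ReadWriteOnce          # mounted read-write by a single node
  resources:
    requests:
      storage: 10Gi          # capacity the application asks for
  storageClassName: ontap-nas
```

When a pod references this claim, the provisioner creates a matching volume on demand, so developers consume storage declaratively rather than filing infrastructure tickets.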
We are actively working on ways to accelerate DevOps by promoting even more speed and agility in software development. Consuming infrastructure resources such as storage should be easy. NetApp is dedicated to making it so, with container management and other solutions that help applications scale more easily and span a wide variety of platforms.
By integrating with the tools already in your DevOps pipeline, development, testing, QA, and operations teams can all consume infrastructure resources as code.
Accelerate your containerized workloads where you choose to deploy them. Whether on-prem or in the cloud, NetApp provides comprehensive data management solutions.
NetApp solutions for Continuous Integration and Continuous Delivery (CI/CD) provide a better experience for developers and allow you to test and release software more reliably at any time.
Success stories from the world's leader in data management and storage
Bandwidth, an API platform provider, delivers voice, messaging, and 911 services that touch millions of people every day. Bandwidth’s developers innovate relentlessly to evolve the company’s platform and bring new services to market faster.
With NetApp, Despegar's developers can quickly release features and updates that attract new visitors to the site and convert them into customers. Developers increased deployments of new applications and updates from 3 to 5 per week to more than 300 per day.
DevOps is the therapy that typical application development always needed. Instead of siloed, self-serving functions, software development (Dev) and IT operations (Ops) work together with end-to-end responsibility from concept through production.
NetApp helps you enable a consistent, seamless DevOps experience on your premises and in private and public clouds. That means operations can deliver automated infrastructure with less engineering, and developers can create in reliable and predictable environments with less friction and more speed.