
Large capacity volumes for Google Cloud NetApp Volumes preview

Apr 10, 2024

Google Cloud NetApp Volumes, a fully managed, secure, and performant file storage service that supports SAP, Microsoft, and Linux-based applications, has added support for large capacity volumes, which provide up to 1 PiB of storage capacity and up to 12.5 GiB/s of throughput in a single volume. With large volumes, you can quickly and easily deploy workloads that need to store and process petabyte-scale datasets with high throughput (single-digit GiB/s) and concurrent access, including HPC, EDA, AI/ML training, and content repositories.

Large volumes enable customers to provision a single volume starting at 15 TiB and scale it dynamically up to 1 PiB in 1 GiB increments. Volume performance scales linearly with provisioned capacity and the selected service level, up to a maximum of 12.5 GiB/s. Additionally, large capacity volumes running on the Extreme service level can use multiple endpoints (IP addresses) to load-balance traffic and achieve higher performance, which is ideal for workloads requiring highly concurrent access to a large dataset.

With large volumes, customers can bring petabyte-scale datasets used in EDA workloads, AI applications, and content repositories onto a single volume and benefit from data management features like snapshots, clones, and cross-region replication. Before today, customers with petabyte-scale datasets had to choose between performance-optimized and capacity-optimized options, or partition their data across multiple volumes.
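To make the linear-scaling model concrete, here is a minimal sketch of how throughput might be estimated from provisioned capacity and service level. The per-TiB rates and the service-level names below are illustrative assumptions, not published figures; only the 15 TiB–1 PiB capacity range and the 12.5 GiB/s ceiling come from this announcement. Consult the product documentation for actual values.

```python
# Assumed throughput per provisioned TiB for each service level (MiB/s).
# These rates are illustrative placeholders, not official numbers.
PER_TIB_MIBPS = {
    "standard": 16,
    "premium": 64,
    "extreme": 128,
}

MAX_THROUGHPUT_GIBPS = 12.5  # single-volume ceiling stated in the announcement


def volume_throughput_gibps(capacity_tib: float, service_level: str) -> float:
    """Estimate single-volume throughput in GiB/s, capped at the maximum."""
    if not 15 <= capacity_tib <= 1024:  # large volumes span 15 TiB to 1 PiB
        raise ValueError("large volumes range from 15 TiB to 1 PiB")
    gibps = capacity_tib * PER_TIB_MIBPS[service_level] / 1024
    return min(gibps, MAX_THROUGHPUT_GIBPS)


# Under these assumed rates, the Extreme level hits the cap at 100 TiB:
print(volume_throughput_gibps(15, "extreme"))    # 1.875
print(volume_throughput_gibps(100, "extreme"))   # 12.5
print(volume_throughput_gibps(1024, "extreme"))  # 12.5
```

The cap is what makes the scaling "linear up to a maximum": below the crossover capacity, doubling the provisioned size doubles estimated throughput; beyond it, additional capacity adds space but no further throughput.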

Large volumes are currently in preview, with general availability targeted for later in calendar year 2024. Additional details on large volumes can be found in the product documentation.
