Use our Performance Acceleration Modules (PAMs) to optimize the performance of random read-intensive workloads such as file services, messaging, virtual infrastructure, and OLTP databases without adding more high-performance disk drives. These intelligent read caches speed access to your data, reducing latency by a factor of 10 or more compared with disk drives. Faster response times can translate into higher throughput for random I/O workloads.
PAM family modules give you performance that is comparable to that of solid state disks (SSDs) without the complexity of another storage tier. You don’t need to move data from tier to tier for the best performance. It’s all automatic because every volume and LUN behind the storage controller is subject to caching.
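The latency benefit of a read cache can be reasoned about with the standard effective-access-time formula: average latency is the hit rate times the cache latency plus the miss rate times the disk latency. The sketch below uses hypothetical latencies and hit rates (not measured PAM figures) to illustrate why a high hit rate on a low-latency cache sharply cuts average read latency.

```python
# Illustrative only: the latencies and hit rate below are hypothetical,
# not measured PAM performance figures.

def effective_latency_ms(hit_rate: float, cache_ms: float, disk_ms: float) -> float:
    """Weighted-average read latency for a given cache hit rate."""
    return hit_rate * cache_ms + (1.0 - hit_rate) * disk_ms

# Assume a disk read takes ~10 ms and a cache hit ~0.1 ms (hypothetical values).
no_cache = effective_latency_ms(0.0, 0.1, 10.0)    # every read goes to disk
with_cache = effective_latency_ms(0.9, 0.1, 10.0)  # 90% of reads served from cache

print(f"without cache: {no_cache:.2f} ms, with cache: {with_cache:.2f} ms")
```

With these assumed numbers, a 90% hit rate drops average read latency from 10 ms to about 1.1 ms, roughly a 9x improvement, which is the kind of effect the factor-of-10 latency claim above describes.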
You can also reduce costs for storage, power, and rack space with our PAM family modules. We have demonstrated that the original PAM caching module can eliminate up to 50% of the disk drives in a storage system with no change to I/O throughput. Similar tests with a new PAM II module reduced the number of disk drives by about 75%.
Purchasing our PAM family modules doesn’t have to be a leap of faith, because you can simulate the results of adding cache to your current storage system. Predictive Cache Statistics, a feature of the NetApp® Data ONTAP® 7G operating system, shows whether caching modules will help your workload and how much additional cache is optimal.
Learn more about our Performance Acceleration Module Family (PDF).