I’ve been in the industry long enough to remember setting up my first few “clusters” built from two machines the size of industrial refrigerators. One of my most anxiety-producing memories is carefully laying SCSI cables as thick as my thumb between the machines and a box full of disks, measuring distances and cable flex, knowing that a miscalculation could result in data loss, and maybe even job loss.
Compared to that, the ease of installation and multihost connectivity that came with Fibre Channel SANs for open systems was almost miraculous. The Evaluator Group expressed the value I saw back then elegantly in the free SAN Storage Evaluation Guide:
"Storage Area Networks (SANs) allow storage systems to be shared by multiple computer servers over a network specifically designed to carry storage data. This flexibility provides storage administrators the ability to more easily adapt to changing business requirements."
Consolidation, flexibility, lower costs, and greater business agility remain key benefits of shared storage. Many customers still use this combination of shared storage with a purpose-built network as the foundation of their data management infrastructure.
Yet, after all this time, is this still the right approach?
The short answer is yes, SAN is still very relevant. The market is growing. New technologies like NVMe over Fabrics make it a better choice for more workloads than ever before. And the promise of reducing costs while increasing flexibility is as true today as it was 20 years ago.
But what about the different vendor offerings? For the most part, any reputable vendor can offer as much performance, reliability, and capacity as anyone could need. How can a customer pick the vendor that’s right for them?
One answer is to just treat them all as commodities and pick the cheapest one that meets the basic requirements. There’s nothing intrinsically wrong with that approach. But if you take a closer look, you’ll see that a modern SAN has much more to offer than just large amounts of fast, reliable block storage.
This is where resources like the Evaluator Group document come in. The way they’ve broken down the evaluation into 10 simple yet valuable criteria can help you sift through a lot of information quickly and make a better decision. Now, you might say I’m biased. But when you look at the Evaluator Group’s analysis of NetApp® AFF storage systems and compare it to other industry leaders, it stacks up remarkably well.
Let’s take Dell EMC’s three main SAN offerings: PowerMax, PowerStore, and Unity XT. The Evaluator Group reports that NetApp AFF technology surpasses Dell EMC PowerMax in 3 of 10 EvaluScale criteria; PowerStore in 6 of 10; and Unity XT in 5 of 10. Unlike the Dell EMC products, NetApp AFF systems didn’t receive an “area for development” rating for any of the 10 criteria.
Based on the Evaluator Group’s independent view, AFF is unsurpassed by anything Dell has to offer.
Now, going back to the idea of picking the one with the best pricing, you might expect that AFF technology would be the most expensive. After all, it’s based on NetApp ONTAP® data management software, chosen by Azure, IBM, and AWS as the technology behind their native cloud storage offerings. But no, price was one of the two areas where AFF beat every one of Dell’s SAN storage offerings.
There must be a catch, right? AFF has more features and, according to the Evaluator Group, the best price... maybe all that software value slows things down? No. Again, based on the Evaluator Group’s analysis, AFF is the only array from NetApp or Dell EMC that meets the performance requirements of a modern SAN. That’s a big claim, but one I think NetApp can stand behind.
If you’re looking for a new SAN, why keep looking at Dell, which forces you to choose among features, performance, and price, when you can get all three with NetApp? For more on this topic, read the white paper Why your SAN needs NetApp for virtualization and enterprise apps.
Ricky Martin leads NetApp’s global market strategy for its portfolio of hybrid cloud solutions, providing technology insights and market intelligence on trends that affect NetApp and its customers. With nearly 40 years of IT industry experience, Ricky joined NetApp as a systems engineer in 2006 and has served in various leadership roles across the NetApp APAC region, including developing and advocating NetApp’s solutions for artificial intelligence, machine learning, and large-scale data lakes.