A couple of months ago I discussed the rise of the IT generalist, who possesses both the knowledge and the soft skills to adapt to a wide array of work environments and technologies. This movement reflects the common refrain from IT leaders to “do more with less,” but it doesn’t stop with human resources. Streamlining staff efficiency depends on three core elements of IT infrastructure: cost, performance and automation.
As most IT professionals already know, the actual cost of any data storage solution isn’t just the initial purchase price; it’s the total of all the associated costs, including support and the additional human resources needed to maintain these systems. Matt Watts recently penned an excellent blog post that calls out the risks of costly support agreements that purport to deliver free controllers three years into a six-year commitment. That “free” controller could end up costing you anywhere from 50% to 300% more in support costs. These types of programs aren’t built for convenience; they are designed to trap you in high-margin maintenance programs.
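To see how a hardware giveaway can still raise total cost of ownership, consider a minimal sketch. All the figures below are hypothetical, invented purely for illustration; only the 50–300% support premium range comes from the post.

```python
# Hypothetical total-cost-of-ownership comparison. The purchase price,
# support rates, and controller credit are illustrative numbers only,
# not actual vendor pricing.

def total_cost(purchase, annual_support, years, credit=0.0):
    """Purchase price plus support over the full term, minus any hardware credit."""
    return purchase + annual_support * years - credit

# Plain deal: buy the array, pay standard support for six years.
plain = total_cost(purchase=200_000, annual_support=30_000, years=6)

# "Free controller" deal: a $50k controller credit mid-term, but support
# priced 100% higher (the middle of the 50-300% premium range cited above).
bundled = total_cost(purchase=200_000, annual_support=60_000, years=6,
                     credit=50_000)

print(plain)    # 380000
print(bundled)  # 510000
```

Even with the controller credited at full value, the inflated support fees leave the bundled deal costing more over the life of the contract, which is the trap the post describes.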
Doing more with less also means driving greater value in performance from your IT investment. Last month, John Martin provided a deep dive on the benefits of extending the NVMe protocol all the way to the host to drive efficiencies in application performance. In high transaction environments, where time is money, lower latency and higher speeds deliver immediate value. There’s no benefit to paying more for less performance.
The final piece of the trifecta is automation. In IT this is generally about protecting your data, predicting performance anomalies and preventing disruption without taxing the IT generalist with routine monitoring tasks. There are 5 key goals for automation of data storage:
For NetApp customers, the Active IQ engine uses machine learning, predictive analytics and community wisdom to create actionable intelligence that allows IT to prescriptively optimize their NetApp environment.
Many vendors provide some level of monitoring, reporting and alerting services, but Active IQ takes a broader approach, enabling customers to leverage the insights learned from the massive and diverse NetApp user base. Each day, Active IQ receives telemetry data from more than 300,000 assets around the globe, adding to a multi-petabyte data lake that processes over 10 trillion data points per month. By using predictive analytics and community wisdom, Active IQ provides customized insights and recommendations to protect and optimize your NetApp environment.
Active IQ also provides data center–wide insights and recommendations, while Active IQ Unified Manager lets you troubleshoot, automate and customize monitoring and management. With Active IQ Unified Manager, you can set up automated remediation actions and active management. You can also customize operational reporting in real time for critical infrastructure health, as well as schedule polling intervals for performance and capacity.
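The general pattern behind this kind of automation can be sketched generically: poll metrics on a schedule, compare them against thresholds, and trigger a remediation hook on a breach. The sketch below is a simplified illustration of that pattern only; the metric names, thresholds and remediation callback are hypothetical and do not represent the Unified Manager API.

```python
# Generic threshold-based monitoring loop with an automated remediation hook.
# Metric names, thresholds, and the remediate() callback are hypothetical.

import time
from typing import Callable, Dict

def poll_and_remediate(
    fetch_metrics: Callable[[], Dict[str, float]],
    thresholds: Dict[str, float],
    remediate: Callable[[str, float], None],
    interval_s: float = 300.0,
    cycles: int = 1,
) -> None:
    """Poll metrics at a fixed interval; invoke remediate() on any breach."""
    for cycle in range(cycles):
        metrics = fetch_metrics()
        for name, limit in thresholds.items():
            value = metrics.get(name, 0.0)
            if value > limit:
                remediate(name, value)
        if cycle < cycles - 1:
            time.sleep(interval_s)

# Example: a volume at 92% capacity breaches an 85% alert threshold,
# while latency stays within its limit.
breaches = []
poll_and_remediate(
    fetch_metrics=lambda: {"volume_capacity_pct": 92.0, "latency_ms": 0.4},
    thresholds={"volume_capacity_pct": 85.0, "latency_ms": 1.0},
    remediate=lambda name, value: breaches.append((name, value)),
)
print(breaches)  # [('volume_capacity_pct', 92.0)]
```

The value of wiring remediation to the poll, rather than to a human reading a dashboard, is exactly the point of the section above: routine anomalies get handled without taxing the generalist.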
Active IQ and Active IQ Unified Manager provide a comprehensive optimization of your data storage environment by delivering:
These tools provide truly unique capabilities that extend the value of a data storage infrastructure while automating tasks that would otherwise require specialized teams.
By managing costs, driving performance and delivering automation, IT can do more with less.
Greg Knieriemen is a NetApp Chief Technologist. He helps develop and drive the vision and application of NetApp products and solutions. Previously, Greg worked for Hitachi and was the founder of the Speaking in Tech Podcast. Greg has over 15 years of experience using, deploying and marketing enterprise IT solutions.