
NetAppU Learns the Secret to More Valuable Training: Data

Nancy Luscri

Call it bias, but I’ve always believed that NetApp University (NetAppU) does training and education better than many of our industry peers, and the feedback we’ve received so far has supported that. But how do you quantify the value of training? At NetAppU, our primary focus and mission is delivering training that is meaningful and valuable for our learners. While we have rudimentary metrics on the success of our training programs, things like attendance rates and test scores, we wanted more. So, in the spirit of being data-driven, we decided to dive deeper.

Successful training is determined by the impact the learning has, whether it’s on job performance, productivity, or the bottom line. For each audience, the definition of success is different. For example, sales reps might consider training valuable if it increases customer visits or leads to more bookings. For customer support engineers, effective training accelerates how quickly they can help customers and resolve issues, or reduces the number of cases that are opened. Systems engineers might say a learning module is valuable if it helps them better architect a solution for customers that effectively leverages NetApp technology. Whatever those key metrics are, there has to be a tangible, measurable impact from a particular piece of training, or it is not useful to anybody.

To get this data, NetAppU is leaning on a Software-as-a-Service survey platform called Metrics that Matter (MTM). It’s built into every piece of training and education we produce, and it enables us to get real feedback that we can use to track the performance and value of each of our courses. MTM gathers data from NetAppU-delivered classes as well as training provided by our Authorized Learning Partners (ALPs).

We designed two surveys with MTM that are part of every course in our learning portfolio. The first survey is administered right after a learner completes a course. It asks learners several questions about the training they received, the skills they learned, the performance of the instructor, and so on. The second survey is delivered 60 days after the learner completes the course. The purpose of this second survey is to collect data on how the training and knowledge gained are being applied in a day-to-day job environment.

As we started to roll out this new tool, the data began pouring in. And what we found was even better than what we hoped for.

The following metrics are from ~1,600 evaluations collected in Q3 FY19 (November 2018 – January 2019). Where applicable, we’ve noted the industry benchmark.

  • 3.7 average course performance score on a scale from 1 – 5, with 5 being excellent.
  • 91% of learners said the training was a worthwhile investment in career development (industry average is 71%).
  • 95% of learners said they learned new knowledge or skills (industry average is 93%).
It should be noted that these metrics were collected from the first survey distributed immediately following the completion of the course.

While these are some really good-looking stats, they are all centered around “effectiveness.” What we were most interested in were outcomes, and that’s where the second survey comes in.

To that end, here are some of the exciting metrics we found from the second survey:
  • 85% of learners applied knowledge and skills gained to their job (industry average is 62%).
  • 60% of learners said training increased their job productivity.
  • 45% of learners said they applied the training they received within one week of completing the course or taking the class (industry average is 34%).
  • 14% of learners said they experienced a performance improvement as a result of the training they received (industry average is 9%).
In addition to many of these metrics being better than the benchmark of corporate universities in the technology industry, they demonstrate that NetAppU is providing education that is truly valuable to our learners. When we take this one step further and calculate real bottom-line benefits, it’s impressive.

Using the data from our surveys, we were able to conclude that a majority of respondents reported a 20-30% increase in productivity as a result of the training they received (productivity in this case referring to an increase in job performance). Even with a conservative estimate of 10%, the ROI from that increased productivity is significant.

For example, if you worked a 40-hour week, a 10% productivity gain means you could do 4 more hours of work, or do the same amount of work in only 36 hours. Based on a $100K fully burdened salary (roughly $50 per hour), that 4-hour swing equates to a savings of $200 per week. Multiply that by the 1,000-2,000 people in that role who took the course, and you now have a major impact on ROI for a given organization. All of this data is extremely useful in demonstrating the real value of training, not just for those who take the course but for their managers as well. These are quantifiable benefits to training that shouldn’t be underestimated.
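The back-of-envelope calculation above can be sketched in a few lines of Python. All of the inputs (salary, hours, productivity gain, headcount) are illustrative assumptions from the example, not NetAppU survey data:

```python
def weekly_savings_per_person(fully_burdened_salary, hours_per_week=40,
                              weeks_per_year=50, productivity_gain=0.10):
    """Dollar value of the hours freed up each week by a productivity gain.

    Assumes the salary figure and work schedule are rough estimates,
    as in the example above.
    """
    hourly_rate = fully_burdened_salary / (weeks_per_year * hours_per_week)
    hours_saved = hours_per_week * productivity_gain
    return hourly_rate * hours_saved

# A $100K fully burdened salary at a 10% gain frees up ~$200/week per person.
per_person = weekly_savings_per_person(100_000)

# Scaled to a hypothetical 1,000 trained employees in that role.
org_total = per_person * 1_000
print(f"Per person: ${per_person:.0f}/week; org-wide: ${org_total:,.0f}/week")
```

Plugging in the 20-30% gains most respondents actually reported (rather than the conservative 10%) simply doubles or triples these figures.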

This data is helping us understand the needs of our learners and how we can adapt our training to give them real-world skills they can use on the job today. We’re excited to see what the future holds as we continue to make data-driven improvements to our learning portfolio. Follow us on Twitter (@NetAppU) to keep up with the latest developments!
