With the increasing complexity of the questions that businesses ask of data, an optimal analytics strategy must focus on hybrid, blended, or multi-tier analytics
Analytics maturation and adoption can be painful in pharma for many reasons, including the complexity of the questions being asked and the strategies in place to answer them. Many organizations continue to use a single tool or technique to distill an answer. Even complex techniques are often insufficient on their own to answer the questions being asked, which causes frustration for the business and for the analytics teams.
One approach that has been successful outside of pharma and life sciences is a hybrid, advanced analytics approach in which techniques are tiered and integrated into a stack to answer complex questions. These stacks are usually built deductively and then tuned and tweaked as discoveries are made, but sometimes there is no defined path and exploration is king. Adopting this approach could help pharma analytics teams accelerate insights.
Each area of the pharma value chain is vastly different from the others in goals, outcomes, approaches, tools, data, culture, and people. Because of that, their analytical approaches differ. That isn’t necessarily a bad thing, but it doesn’t always help in answering complicated questions about the business.
In discovery, analysis focuses on pattern identification and recognition and on the links between layers of the -omic stack (genomics to phenomics). In development, analytics are more varied, answering questions from trial selection to in-silico trials and manufacturing optimization. In deployment, analytics focus on forecasting and prediction as well as optimization to address questions about marketing, sales, supply chain, and so on.
As you have probably seen, this complexity means that there are lots of strategies and implementations at play in any pharma or life sciences organization. There may be six or more strategies and implementations for each portion of the value chain, and possibly six or more for each department involved in that portion. These range from Excel analysis to PowerPoint BI to dashboarding to machine learning to robotic process automation and everything in between. Typically, each implementation is intended to answer a single question or a very narrow range of questions about the business, hence the breadth of solutions and variety of strategies. Unfortunately, many of these techniques and strategies interfere with one another or duplicate work.
One of the biggest (and most difficult) parts of an effective strategy is to centralize as much of the command of analytics as possible while decentralizing their execution and management.
Let’s break down that last statement. Centralizing command means that the organization puts into play governance around how analytics are executed and presented. This governance includes enterprise standards around techniques, tools, symbology, color, general user experience, and so on. This does not mean centralizing the actual work of analytics, rather the rules around doing the work.
Decentralizing execution and management means that the people who know how to do the work and have the subject matter expertise perform the tasks to answer the questions and manage how they do that. However, they do it according to the standards set forth in the centralized command function.
Once this structure is formed and is being executed, it’s time to look at how to evolve the practice of analytics in the organization.
Hybrid, blended, and multi-tier analytics are largely synonymous terms, with some small differences among them. Conceptually, an effective analytics strategy centers on the ability to construct, execute, modify, and reconstruct an analytics stack that enables exploration, insight generation, decision making, and reflection, and that can be automated to some extent.
As an exemplar, this strategy can be accomplished by using unsupervised learning across multiple structured datasets to identify correlations or other patterns in the data and to help eliminate outliers and potentially unrelated data. The next step is to leverage supervised learning on the target data to generate a forecast or prediction of some sort. Then comes leveraging optimization or simulation techniques to explore possible outcomes, with a dash of automation to make decisions and implement them based on an established risk profile and outcomes of the simulation. The final step is to leverage uncertainty quantification techniques to assess the entire process and to reflect upon and determine any modifications that need to be made in the process or areas that are deficient, according to established metrics.
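The tiered stack just described can be sketched in a few lines of Python. This is a minimal illustration, not a prescription: the synthetic dataset, the choice of IsolationForest and linear regression, and the 5% risk threshold are all assumptions made for the example.

```python
# A minimal sketch of the multi-tier stack: unsupervised cleanup,
# supervised forecast, simulation, automated decision, and uncertainty.
# The data, models, and 5% risk threshold are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Synthetic "structured dataset": an outcome driven by two features plus noise.
X = rng.normal(size=(500, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=500)

# Tier 1 -- unsupervised learning: flag and drop likely outliers.
mask = IsolationForest(random_state=0).fit_predict(X) == 1
X_clean, y_clean = X[mask], y[mask]

# Tier 2 -- supervised learning: fit a forecast model on the cleaned data.
model = LinearRegression().fit(X_clean, y_clean)

# Tier 3 -- simulation: Monte Carlo over plausible future inputs.
scenarios = rng.normal(size=(1000, 2))
outcomes = model.predict(scenarios)

# Tier 4 -- automated decision against an established risk profile:
# act only if the chance of a negative outcome stays under 5% (assumed).
p_loss = float(np.mean(outcomes < 0))
decision = "act" if p_loss < 0.05 else "hold"

# Tier 5 -- uncertainty quantification: the spread of simulated outcomes
# feeds the reflection step, flagging where the process needs adjustment.
uncertainty = float(np.std(outcomes))
print(decision, round(p_loss, 3), round(uncertainty, 2))
```

In practice each tier would be a far richer component (and the risk profile would come from the business, not a hard-coded constant), but the shape of the stack, each tier consuming the previous tier's output, is the point.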
Although it’s complicated, this model, based on a multi-tier analytics strategy, enables streamlined, robust question answering and decision making grounded in the data. Such a solution might not have been possible before, or it might have taken far too long to execute using singleton analytics.
This loop, built around John Boyd’s OODA (observe, orient, decide, act) loop, should be the gold standard for analytics in the enterprise. The findings rest on a believable foundation and, after the first cycle, can be automated and deployed with little additional work. This approach leads to better, faster, and more efficient insights.
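The OODA framing can be expressed as a simple control loop. Every name and value below is a hypothetical placeholder standing in for the real analytics tiers; the point is the repeating observe-orient-decide-act structure, with each cycle logged for the reflection step.

```python
# A hypothetical OODA-style control loop; each stage function is a
# placeholder for the corresponding analytics tier described above.
def observe(raw):
    # Gather and clean the incoming data (stand-in for outlier removal).
    return [x for x in raw if x is not None]

def orient(data):
    # Generate a forecast; a trivial mean stands in for the real model.
    return sum(data) / len(data)

def decide(forecast, threshold=10.0):
    # Compare the forecast against an assumed risk threshold.
    return "act" if forecast > threshold else "hold"

def act(decision, log):
    # Implement (or defer) the decision and record it for reflection.
    log.append(decision)
    return log

log = []
for cycle_input in ([12, None, 14], [3, 4, None, 5]):
    data = observe(cycle_input)
    forecast = orient(data)
    log = act(decide(forecast), log)
```

Because each cycle writes to a shared log, the reflection step can review past decisions and tune thresholds or models before the next pass, which is what makes the loop improvable rather than a one-shot pipeline.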
So why isn’t everyone on this bandwagon? There are many reasons. First, data. As in other industries, data in pharma can be a siloed, disparate nightmare. To get to the point of a solid analytics strategy, it’s necessary to have a solid, mature data strategy. Second, culture. Most pharma cultures can’t yet handle the enormity of the governance change proposed in this blog post, and there is tremendous natural resistance. Third, knowledge. To develop these strategies, it’s necessary to have the knowledge, as well as a team that can execute on and in the strategy, and most organizations don’t have that knowledge in the right roles to win. And, finally, data again. Yes, it’s important enough to consider twice. The most beautiful analytics architecture is useless without the data to support it. So part of the analytics strategy should be a data strategy, even if it’s limited.
We hear a lot of talk about incremental success in analytics, especially in strategy. That’s because there is a wariness around analytics, especially AI, that makes people skeptical. The best way to demonstrate value is to deliver value, and a stacked, hybrid analytics strategy can do just that. It’s easier to tie metrics and indicators to customer needs and then deliver on them with a modular, platform-oriented, user-centric analytics strategy. Incremental delivery always wins (even when you fail, because you can fail and learn fast) and allows you to adjust your strategy on the fly.
Learn how NetApp® solutions enable every analytics strategy, from simple to complex, with cloud, consolidation, integration, data movement, orchestration, and analytics automation. With NetApp supporting your data and analytics strategies, you’ll join other industry leaders who are delivering transformation daily.
Chief Data Officer - Global Healthcare and Life Sciences at NetApp