Ten red flags signaling your analytics program will fail

Here is a brief excerpt from an article written by Oliver Fleming, Tim Fountaine, Nicolaus Henke, and Tamim Saleh for the McKinsey Quarterly, published by McKinsey & Company. To read the complete article, check out other resources, learn more about the firm, obtain subscription information, and register to receive email alerts, please click here.

To learn more about the McKinsey Quarterly, please click here.

*     *     *

Struggling to become analytics-driven? One or more of these issues is likely what’s holding your organization back.

How confident are you that your analytics initiative is delivering the value it’s supposed to?

These days, it’s the rare CEO who doesn’t know that businesses must become analytics-driven. Many business leaders have, to their credit, been charging ahead with bold investments in analytics resources and artificial intelligence (AI). Many CEOs have dedicated a lot of their own time to implementing analytics programs, appointed chief analytics officers (CAOs) or chief data officers (CDOs), and hired all sorts of data specialists.

However, too many executives have assumed that because they’ve made such big moves, the main challenges to becoming analytics-driven are behind them. But frustrations are beginning to surface; it’s starting to dawn on company executives that they’ve failed to convert their analytics pilots into scalable solutions. (A recent McKinsey survey found that only 8 percent of 1,000 respondents with analytics initiatives engaged in effective scaling practices.) More boards and shareholders are pressing for answers about the scant returns on many early and expensive analytics programs. Overall, McKinsey has observed that only a small fraction of the value that could be unlocked by advanced-analytics approaches has been unlocked—as little as 10 percent in some sectors. And McKinsey’s AI Index reveals that the gap between leaders and laggards in successful AI and analytics adoption, within as well as among industry sectors, is growing (See Exhibit 1).

Exhibit 1: Artificial-intelligence (AI) adoption is occurring faster in more digitized sectors and across the value chain.

That said, there’s one upside to the growing list of misfires and shortfalls in companies’ big bets on analytics and AI. Collectively, they begin to reveal the failure patterns across organizations of all types, industries, and sizes. We’ve detected what we consider to be the ten red flags that signal an analytics program is in danger of failure. In our experience, business leaders who act on these alerts will dramatically improve their companies’ chances of success in as little as two or three years.

1. The executive team doesn’t have a clear vision for its advanced-analytics programs

In our experience, this often stems from executives lacking a solid understanding of the difference between traditional analytics (that is, business intelligence and reporting) and advanced analytics (powerful predictive and prescriptive tools such as machine learning).

To illustrate, one organization had built a centralized capability in advanced analytics, with heavy investment in data scientists, data engineers, and other key digital roles. The CEO regularly mentioned that the company was using AI techniques, but never with any specificity.

In practice, the company ran a lot of pilot AI programs, but not a single one was adopted by the business at scale. The fundamental reason? Top management didn’t really grasp the concept of advanced analytics. They struggled to define valuable problems for the analytics team to solve, and they failed to invest in building the right skills. The analytics team they had assembled ended up working on the wrong problems and couldn’t apply the latest tools and techniques, so the pilots never gained traction. The company halted the initiative after a year as skepticism grew.

First response: The CEO, CAO, or CDO—or whoever is tasked with leading the company’s analytics initiatives—should set up a series of workshops for the executive team to coach its members in the key tenets of advanced analytics and to undo any lingering misconceptions. These workshops can form the foundation of in-house “academies” that can continually teach key analytics concepts to a broader management audience.

2. No one has determined the value that the initial use cases can deliver in the first year

Too often, the enthusiastic inclination is to apply analytics tools and methods like wallpaper—as something that hopefully will benefit every corner of the organization to which it is applied. But such imprecision leads only to large-scale waste, slower results (if any), and less confidence, from shareholders and employees alike, that analytics initiatives can add value.

That was the story at a large conglomerate. The company identified a handful of use cases and began to put analytics resources against them. But the company did not precisely assess the feasibility or calculate the business value that these use cases could generate, and, lo and behold, the ones it chose produced little value.

First response: Companies in the early stages of scaling analytics use cases must think through, in detail, the top three to five feasible use cases that can create the greatest value quickly—ideally within the first year. This will generate momentum and encourage buy-in for future analytics investments. These decisions should take into account impact, first and foremost. A helpful way to do this is to analyze the entire value chain of the business, from supplier to purchase to after-sales service, to pinpoint the highest-value use cases (See Exhibit 2).

To consider feasibility, think through the following:

- Is the data needed for the use case accessible and of sufficient quality and time horizon?

- What specific process steps would need to change for a particular use case?

- Would the team involved in that process have to change?

- What could be changed with minimal disruption, and what would require parallel processes until the new analytics approach was proven?

*     *     *


Oliver Fleming is a senior expert in McKinsey’s Sydney office and chief operating officer for QuantumBlack Australia; Tim Fountaine is a partner in the Sydney office and the leader of QuantumBlack Australia; and Nicolaus Henke and Tamim Saleh are senior partners in the London office.
