Too many big data initiatives fail because companies, top to bottom, aren’t committed to the truth in analytics. Let me explain.
In January 2015, the Economist Intelligence Unit (EIU) and Teradata (full disclosure: also my employer) released the results of a major study aimed at identifying how businesses that are successful at being data-driven differ from those that are not.
Among its many findings were some particularly troubling, “code red” results revealing that CEOs tend to have a rosier view of their company’s analytics efforts than directors, managers, analysts, and data scientists do. For example, the EIU found that CEOs are more likely to think employees extract relevant insights from the company’s data: 38 percent of them hold this belief, compared to 24 percent of all respondents and only 19 percent of senior vice presidents, vice presidents, and directors. Similarly, 43 percent of CEOs think relevant data are captured and made available in real time, compared to 29 percent of all respondents.
So why is there such a disconnect? It turns out the answer is far more human than the size of a company’s data coffers or the technology stockpiled to analyze it. Big data initiatives fall down at the feet of biases, bad assumptions, and the failure, or fear, to let the data speak for itself. As insights make their way up the corporate ladder, from the data scientist to the CEO, the truth in analytics is lost along the way, and the result is a cumulative effect of unintended consequences.
Communicate the Known-Unknowns to Your CEO
Take the idea of known risks, for example. In analytics, you always have to make some assumptions, because the data hardly ever paints a complete picture. So you have to identify and rank the risks behind those assumptions to understand what might happen when they turn out to be wrong. In some cases, the consequences are minor. In others, they can be devastating.
Look at the market crash of 2008. A whole host of people made a simple and seemingly logical assumption: that home prices would only go up. But most analysts never seriously tested what would happen if prices actually fell. Well, now we know. It was nearly a global calamity. The people investing in the pre-crisis housing bubble were working from an assumption that was deeply flawed on many levels, and very few of them considered, or even recognized, the risk until it was too late.
The same thing happens, generally at smaller scales, in businesses. The CEO doesn’t have a clear view of the risk. It is up to the data scientists, business analysts, and their managers to make the CEO well aware of the risks in their assumptions. The CEO needs to understand when an assumption carries a critical, level 1 risk: in the housing example, if prices go down, the whole thing falls apart. Even if that risk is unlikely, at least it is on the table. Many people are uncomfortable discussing such negatives with senior executives, and many senior executives don’t like to hear them. But to succeed, everyone must get past that hurdle.
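To see what putting that risk on the table can look like in practice, here is a minimal, hypothetical sketch: a simple stress test of a housing-backed position under a few price scenarios, rather than a single “prices only go up” assumption. The figures and the portfolio_value function are invented purely for illustration.

```python
# A minimal sketch of surfacing assumption risk: stress-test one key assumption
# (annual home-price change) instead of baking in "prices only go up".
# All figures below are hypothetical.

def portfolio_value(principal: float, annual_price_change: float, years: int = 5) -> float:
    """Value of a housing-backed position if prices move by annual_price_change per year."""
    return principal * (1 + annual_price_change) ** years

principal = 100_000_000  # hypothetical exposure

# Baseline assumption plus the downside scenarios many analysts never ran.
scenarios = {
    "prices keep rising (+5%/yr)": 0.05,
    "prices flat (0%/yr)": 0.00,
    "prices fall (-10%/yr)": -0.10,
    "severe crash (-20%/yr)": -0.20,
}

for label, change in scenarios.items():
    value = portfolio_value(principal, change)
    print(f"{label:32s} -> {value / 1e6:6.1f}M  ({(value - principal) / principal:+.0%})")
```

Even a table this crude makes the level 1 risk impossible to ignore: one branch of the scenario list wipes out most of the position.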
Get Past the Culture of Fear of the Truth
Then there is the fear of the truth, with a bit of cognitive bias thrown in. For example, salespeople asked for their forecast, even when armed with data on historical performance and the current pipeline, are often unsure whether they will hit their number. But, typically, they’ll tell the VP of sales they will hit it, unless a miss is glaringly apparent. They share the information they’re expected to share and withhold any acknowledgment that the numbers are malleable.
The problem arises in the aggregate: the VP gets a rosy picture from the five salespeople on her team, even though they all have serious doubts, so she passes that rosy assumption along and the data rolls up to the CEO or CFO. In reality, the metric is underpinned by a huge amount of doubt, buried under the fear of losing one’s job and the cultural expectation that the goal will be met. Failure is not an option. Yet while several of the salespeople will likely manage to hit their numbers, the chance that they all will is small, which makes the VP’s figures even more unrealistic than the initial estimates.
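To make the compounding effect concrete, here is a back-of-the-envelope sketch with made-up numbers: even if each of the five salespeople individually has a 70 percent chance of hitting their number, the chance that all five do, which is what the rolled-up forecast implicitly assumes, is far smaller.

```python
# Back-of-the-envelope sketch with hypothetical probabilities: individually likely
# forecasts do not add up to a likely team-wide forecast.

p_individual = 0.70   # assumed chance any one rep hits their forecast
team_size = 5

# Treating the reps' outcomes as independent, all five hitting the number
# requires every individual success to happen at once.
p_all_hit = p_individual ** team_size

print(f"Chance one rep hits the number:    {p_individual:.0%}")            # 70%
print(f"Chance all {team_size} reps hit their numbers: {p_all_hit:.0%}")   # about 17%
```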
So what happens? Everyone is shocked when the company misses its forecast. This is what happens when people sugarcoat a little at the low end and the cumulative effect leads the business to forecast company-wide results incorrectly.
Don’t Underestimate the Future of the Truth
Another common problem is underestimating, or simply not considering, the confidence level in the analytics results the CEO is being fed. Maybe we are comfortable with the data and the assumptions, we’ve asked the right questions, and we’ve taken the risks into consideration, but we haven’t assessed the confidence level of our predictions. This gets into classic model assessment techniques in analytics. Is the forecast plus or minus 1 percent, or plus or minus 20 percent? If it is critical to increase sales by 5 percent and the model predicts 10 percent sales growth plus or minus 5 percent, then we’re probably fine. But if the model predicts 10 percent sales growth plus or minus 15 percent, then we might be closing up shop at the end of the year if we aren’t careful.
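As a rough sketch of what that assessment can look like, the hypothetical example below turns a point forecast and its margin of error into an interval and checks it against the critical 5 percent growth target; all of the numbers are invented to mirror the scenario above.

```python
# Rough sketch (hypothetical numbers): report a forecast with its uncertainty,
# not just the point estimate, and check it against the critical threshold.

point_forecast = 0.10      # model predicts 10% sales growth
margin_of_error = 0.15     # plus or minus 15 percentage points
critical_growth = 0.05     # the business needs at least 5% growth

low, high = point_forecast - margin_of_error, point_forecast + margin_of_error

print(f"Forecast: {point_forecast:.0%} growth, interval [{low:.0%}, {high:.0%}]")

if low >= critical_growth:
    print("Even the pessimistic end clears the 5% target: probably fine.")
elif high < critical_growth:
    print("Even the optimistic end misses the 5% target: plan for a miss.")
else:
    print("The target sits inside the uncertainty band: a miss is a real possibility.")
```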
So, What Needs to Change?
The culture around how data is viewed and how data-driven decisions are made has to change. If a data scientist brings all the assumptions and risks to a boardroom conversation only to get chewed up and spat out, the next time he enters that boardroom he’ll be sure to hide the negative truths. But when the culture encourages curiosity and impartial acceptance of the story the data tells, the people who keep the data are free to share what they know and won’t be afraid to point to all the data, not just the rosy bits. The CEO must take responsibility for actively asking about risks and fostering a culture of transparency, but so must everyone else. Team members at all levels need to take responsibility for holding to the truth in the data and maintaining complete transparency when communicating up the corporate hierarchy.
Executives have to ask their people to do this due diligence as they pass the results up, and they have to ask questions back down the chain, so it becomes a conversation around data, not simply a one-sided dashboard or presentation. In some cases there may not be any material risk, but the fact that you reached that conclusion deliberately demonstrates you have the discipline to make the assessment. As a CEO or senior executive, you can’t assume everyone did a great job of validating all the potential risks and made all the right assumptions. You have to ask for the truth and be willing to handle it.
Bill Franks is Chief Analytics Officer at Teradata, where he provides insight on trends in the analytics and big data space. He is author of the book Taming The Big Data Tidal Wave and most recently published his second book, The Analytics Revolution. He is an active speaker and a faculty member of the International Institute for Analytics. Find him at www.bill-franks.com.