Commentary: Most marketing executives don’t seem to believe their analytics, according to recent Gartner survey data. Here’s one way to change that.

If your analytics software told you to jump off a cliff, would you do it? That classic parental slam on juvenile plans (at least in my home) probably does little to dissuade kids from doing what they want, and our analytics software seems equally ineffectual at convincing adults what to do. That is, unless the analytics simply confirm what we already want to do.
Recent Gartner survey data has marketing executives blaming "poor data quality, inactionable results, and a lack of clear recommendations" for why they don't trust marketing analytics. But it's just as likely the real problem is simpler: they don't agree with what the analytics tells them. Too often, "big data" just means "big confirmation bias," something I called out a few years ago.
So what can organizations do to move beyond data-driven confirmation biases to true data-driven change?
Blaming the messenger
Marketing analytics is a multibillion-dollar market, but that doesn't mean customers feel they're getting their money's worth. When Gartner asked marketing executives whether their analytics programs were delivering the expected benefits, 54% of senior executives and 37% of mid-level executives said they weren't, and 19% (of both groups) were neutral on the topic.
Though these marketing leaders tended to cite poor data quality and other factors for their lack of success, the Gartner report posited a deeper problem:
Confirmation bias plays a large role here. Marketing leaders often seek out data to help them make the case for a desired course of action or to show the value of their program. However, marketers must understand that having data that conflicts with a planned course of action is valuable and presents a unique opportunity to further challenge controversial findings through experimentation.
This finding would be easier to ignore if people (executives included) didn't have a long history of being data-driven… right up until the point that the data disagreed with gut instinct. On the other hand, we've also become adept at steering the data to reflect our biases, making even so-called "data-driven decisions" less a matter of objective evidence and more a matter of confirming desired outcomes.
While not a panacea, one thing that could help with both gaming and ignoring analytics would be to open source the tools used to collect the data. It's easier to accept the outcomes of analytics software if we have a better understanding of the algorithms and pipelines used to gather and process the data. The more we understand the process behind our analytics, the more we should be able to trust that data.
With that added trust, perhaps marketing executives should do what their teams are already doing with A/B testing (for websites, marketing campaigns and the like): If they don't fully trust the data because it conflicts with what they want to do, trust it just enough to run a smaller-scale test that follows what the data suggests is the right action. If it doesn't work, they'd have the means to inspect and tweak the open source tooling to figure out where things went wrong. If it does work, it offers a way to develop greater trust in their analytics tools. Everybody wins.
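To make that concrete, here is a minimal sketch of what evaluating such a smaller-scale test might look like, written in Python. It compares conversion rates between the current approach and the data-recommended change using a standard two-proportion z-test; the visitor and conversion numbers are hypothetical, purely for illustration, and the normal approximation assumes reasonably large samples.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the current approach (control).
    conv_b / n_b: conversions and visitors for the data-recommended change.
    Returns (z, p_value) under the normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical numbers: 4,000 visitors per variant during the pilot.
z, p = two_proportion_z_test(conv_a=180, n_a=4000, conv_b=230, n_b=4000)
print(f"z = {z:.2f}, p = {p:.3f}")
# A small p-value (say, below 0.05) suggests the recommended change
# really did move the needle; a large one means the pilot is inconclusive.
```

The point isn't the statistics so much as the habit: a cheap, bounded experiment turns "the analytics say so" into evidence the executive generated for themselves.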
Disclosure: I work for AWS, but the views expressed herein are mine.