Part 3 – 4 ways analytics can deceive you

Last week I shared how two organizations suffered from a lack of the right data and from incorrect data. This week I finish off the three-part series… (Part 1, Part 2)

The last two examples relate to analytics governance. Analytics adoption began as IT-led, coordinated projects, but as desktop and cloud applications became more accessible to individuals, their use spread across the enterprise. Michael Goul, associate dean for research and professor of information systems at Arizona State University’s W. P. Carey School of Business, has spent the last few decades studying artificial intelligence and business analytics. While he agrees that data science has the potential to revolutionize commerce, he also thinks too many companies are rushing headlong into the field without putting proper governance systems in place, and in some cases this has led to disaster.

In our third example, the lack of analytics governance provided an opportunity for fraudulent behavior. A media company was going through a rapid growth phase. They expected that their customer satisfaction, measured by Net Promoter Score (NPS), would likely dip during this time, so they were watching closely for an indication that they needed to ramp up their investment in customer service. However, as they grew, their NPS remained curiously constant – which seemed rather odd. After nine months, the CEO was so concerned that they might be missing something that he brought in an outside consulting firm to help them understand how they had delivered the same level of customer service while expanding so rapidly. It turned out that the person who created the dashboard for the executive team was using an ungoverned desktop BI solution and was being paid a bonus based on NPS performance. When the score started to slip, he faced losing his bonus, so he opened the spreadsheet that fed the dashboard and nudged the number up manually to keep getting paid. What started as a small adjustment grew to a 40% variance over the nine-month period. How many customers got frustrated with the declining level of service in that period and switched providers? Despite their attention to this metric, they were blindsided by not knowing. (misrepresented)

Our final example is the opposite side of the same coin – also related to analytics governance. Executives have become reliant on dashboards as single-screen “snapshots” of performance. But dashboards are not the magic view some managers treat them as. Although they can convey important measures, dashboards cannot always provide the nuance and context necessary for effective data-driven decisions. The data can be 100% correct while the visualizations are very misleading. Here is an example from Harvard Business Review. A large package delivery company wanted to reduce vehicle accidents by offering drivers the option to upgrade their GPS to a system that would help them avoid high-risk traffic areas. After monitoring drivers’ behaviors, a frontline manager checked the dashboard and found, to her surprise, that the accident rate was actually higher with the upgrade.

[Chart: fleet-wide accident rate, old GPS vs. upgraded GPS]

At first glance, it appears that drivers who upgraded their GPS were in more accidents, which might lead someone to suggest they downgrade back to what they had. In reality, the upgrade was quite effective, and the manager would have seen this had she compared accident rates for “safe” drivers versus “accident-prone” drivers.

[Chart: accident rates for safe and accident-prone drivers, old GPS vs. upgraded GPS]

For both groups, the upgrade made them safer. So why did the accident rate increase for the entire fleet of drivers while decreasing for each group? Because in this case almost all of the accident-prone drivers chose to use the upgraded device and almost all of the safe drivers kept the old device. Preexisting driver behavior was confused with the effectiveness of the upgrade.
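This is a textbook aggregation reversal (Simpson’s paradox). To make the arithmetic concrete, here is a minimal Python sketch with made-up numbers (none of these figures come from the HBR example) showing how each group’s accident rate can fall with the upgrade while the fleet-wide rate rises, simply because the riskier drivers self-selected into the new device.

```python
# Illustrative, made-up numbers: the upgrade lowers the accident rate within
# each driver group, yet the combined (fleet-wide) rate goes up because the
# accident-prone drivers overwhelmingly chose the upgraded GPS.

groups = {
    # group: (drivers on old GPS, accidents old, drivers on new GPS, accidents new)
    "safe":           (90, 9, 10, 0),    # 10% -> 0% accident rate
    "accident-prone": (10, 5, 90, 36),   # 50% -> 40% accident rate
}

def rate(accidents, drivers):
    return accidents / drivers

old_d = old_a = new_d = new_a = 0
for name, (d_old, a_old, d_new, a_new) in groups.items():
    print(f"{name:15s} old GPS: {rate(a_old, d_old):.0%}   new GPS: {rate(a_new, d_new):.0%}")
    old_d += d_old; old_a += a_old
    new_d += d_new; new_a += a_new

# Per group the upgrade helps, but the mix of drivers behind each device
# differs, so the aggregate comparison tells the opposite story.
print(f"{'fleet-wide':15s} old GPS: {rate(old_a, old_d):.0%}   new GPS: {rate(new_a, new_d):.0%}")
```

With these hypothetical figures, each group improves (10% to 0%, 50% to 40%), yet the fleet-wide rate jumps from 14% to 36% – exactly the pattern the frontline manager saw on her dashboard.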

THE REALLY INTERESTING THING in this case? The visualizations were accurate, as was the data displayed. They just did not show the whole picture when it came to all of the factors leading to accidents. Joel Shapiro, executive director of the data analytics program at Northwestern’s Kellogg School of Management, says, “Perhaps the greatest danger in using dashboards for decision making is in misattributing causality when comparing elements on the dashboard.”

These scenarios are real and they are happening every day. Might they be happening in your organization?
