Analytics - Save the Sinking Ship
Ever heard of the beer-and-diapers case study from Wal-Mart? Or the one where Target identified that a teen was pregnant from her shopping activity? If you have been reading about analytics on the internet, you would agree that its prospects seem nothing short of bliss. However, if you are a CEO, or someone in leadership, who has spent a great deal of money and time setting up this division in your organization, you will most likely agree that analytics has fallen short of its promises by quite a margin.
I have worked as a consultant for companies like eBay, Walmart, and General Motors, and I have set up analytics practices for various mid-sized organizations. Along the way, I have seen many happy clients. However, I have also seen organizations that were completely gripped by case studies end up shutting these verticals down, or going back to what they used to have, for lack of revolutionary results.
With your patience growing thin, I understand if you are feeling disheartened too. But let me tell you: analytics works. It just does. Let me break down why it doesn't work for some organizations, and what you can do to avoid those pitfalls. [If you are new to this field, read on to avoid making these mistakes right from the start.]
Organizations making deep investments in analytics sometimes want to replicate these case studies for themselves, yet most of the stories are closer to myth than method. I worked with one organization that quoted the beer-and-diapers case study and insisted that its big data team uncover similar trends through association rule mining. Only after five months did they realize that the team wasn't getting anywhere. This is where I came into the picture. I restructured their analytics thoroughly, chose to solve the real problems at hand, and the results were phenomenal. (I also turned the association rule results into something useful; I will discuss that in another article.)
One example of a problem I worked on was filter navigation. I simply took the data for the last 30 days and ranked the filters by usage and by the order in which they were used, combining the two in a weighted equation. I found that this organization had its filter sequence all wrong. We re-ordered the filters according to the calculated rank and ran an A/B test, and we were blown away. Engagement on catalog pages increased by 15%, and conversion was up drastically (confidence > 95%). These numbers were enormous for a website that gets millions of users. I couldn't wrap my head around it; it sounded like those case studies in which you change the color of your text and conversion dramatically improves. It seemed beyond logic, but was it?
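To make the idea concrete, here is a minimal sketch of what such a weighted ranking could look like. The data, the 70/30 weights, and the normalization are my own illustrative assumptions, not the actual equation used in the project:

```python
def rank_filters(stats, w_usage=0.7, w_order=0.3):
    """Rank filters by a weighted blend of usage and click order.

    stats: dict mapping filter name -> (usage_count, avg_click_order),
    where avg_click_order is 1.0 if the filter is typically used first.
    Usage is normalized to [0, 1]; click order is inverted so that
    earlier use scores higher. The weights are illustrative only.
    """
    max_usage = max(u for u, _ in stats.values())
    max_order = max(o for _, o in stats.values())
    scored = {
        name: w_usage * (usage / max_usage)
              + w_order * (1 - (order - 1) / max_order)
        for name, (usage, order) in stats.items()
    }
    return sorted(scored, key=scored.get, reverse=True)

# Hypothetical 30-day filter stats for a catalog page.
stats = {
    "brand": (90_000, 1.4),
    "price": (120_000, 2.1),
    "color": (30_000, 3.0),
    "size":  (75_000, 1.8),
}
print(rank_filters(stats))  # most valuable filters first
```

The output order then becomes the candidate filter sequence to validate with an A/B test, rather than shipping it blindly.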
While eCommerce catalogs keep getting bigger, product discovery has become one of the biggest struggles for e-tailers. eCommerce websites offer a plethora of options to ease this process, such as internal search. But do you know what the most used feature for product discovery is? You guessed it right: filters. People who don't use search use filters, and even people who do use search use filters as well. In fact, for most organizations, around 70% of visitors and more than 95% of customers use filters. What if we made their lives easier by putting the options they value on top? Would easier, quicker navigation be useful enough to nudge them toward a purchase? Well, the answer was an overwhelming "yes".
We accomplished phenomenal results just by looking inward at our own problems and UX. Google and Adobe agree with this approach: both prescribe a similar strategy for implementing analytics to maximize output in results-oriented organizations. Here is what they have to say:
If you study both charts, you will spot many similarities, but the thing I want you to note is step 1. For both organizations, the first step is "Define". This is all about defining your current problems and business objectives, and everything starts from here.
You may think that the best engineers are the solution to all analytics-related problems, but early on you also need an evangelist who will go to multiple business stakeholders and ask them what's on their minds. More often than not, the best problems to solve come from the people at desks on your own premises rather than from fancy consultancies overseas.
By putting all the areas where analytics can help down on paper, you can prioritize the objectives that analytics can solve quickly to gain momentum (step 2). This also gets those stakeholders on board with analytics as a framework. Often it is the internal culture that keeps analytics from prospering, as analytics is itself a culture of being data driven. Early successes and a bit of evangelism can help mitigate such issues. In fact, I have seen non-believers jump with joy when they see what analytics can do for them.
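One simple way to surface those quick wins is to score each candidate objective on impact and effort and sort by the ratio. The objectives and ratings below are entirely hypothetical, just to illustrate the mechanic:

```python
# Illustrative only: stakeholder-supplied 1-5 ratings for each objective.
objectives = [
    {"name": "fix filter ordering",     "impact": 4, "effort": 1},
    {"name": "personalized homepage",   "impact": 5, "effort": 5},
    {"name": "cart-abandonment emails", "impact": 3, "effort": 2},
]

# Highest impact-to-effort ratio first: cheap, visible wins build momentum.
ranked = sorted(objectives, key=lambda o: o["impact"] / o["effort"], reverse=True)
for o in ranked:
    print(o["name"])
```

Even a rough scoring like this forces the conversation that matters: which problems are worth solving first.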
Once the objectives are prioritized, an analyst has to create a data collection strategy that your tech team can implement (step 3). Much can be said about data collection strategies, as this is a crucial step, but once it is done, it is time to Analyze and Optimize (steps 4 and 5).
Just like a software development cycle, analytics may go through multiple iterations, where you keep analyzing and investigating further as you find more patterns. Clear objectives from stakeholders, however, keep the target in sight.
Another point of failure for organizations is execution based on analysis. I have seen, umpteen times, analysts sitting on a great recommendation only because a category manager does not want to touch it, or the marketing team is not ready to roll out such a campaign. An organization must find a way to A/B test the recommendations its analysts make. This lets the organization find the best results iteratively, and it keeps the momentum going, with teams feeling energized as they see definite outcomes from their efforts. There is nothing more discouraging than a team working on a problem for three months with no one in the org willing to implement the result. Did you know that Google ran its first A/B test on February 27, 2000, and that in 2011 it ran 7,000 A/B tests on Google Search alone?
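For readers wondering what "confidence > 95%" means in practice: a standard way to check whether a conversion lift in an A/B test is statistically significant is a two-proportion z-test. The sketch below uses only the Python standard library; the traffic and conversion numbers are made up for illustration:

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test (a common textbook approach).

    conv_a / conv_b: conversion counts for control and variant.
    n_a / n_b: visitor counts for control and variant.
    Returns (z, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF (expressed with erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 2.0% vs 2.2% conversion on 50,000 visitors per arm.
z, p = ab_test_significance(1000, 50_000, 1100, 50_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at 95% confidence if p < 0.05
```

At large eCommerce traffic volumes, even small lifts clear the significance bar quickly, which is exactly why routine A/B testing of analyst recommendations is feasible.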
Lastly, Rome wasn't built in a day, and neither will your prescriptive analytics platform be. An organization must find a way to graduate to the next levels of analytics over time. Failure to do so may cause frustration and become a glass ceiling. [I will cover some strategies to overcome this in upcoming articles.]
Conclusion: Analytics definitely works, but many organizations need to rethink their strategy. While case studies are great to draw motivation from, starting out from them can do more harm than good. I will write about advanced data mining in upcoming articles, but solutions are most impactful when we focus on our own problems. We need to constantly ask ourselves where analytics can help, and move toward those objectives. Gradually, the objectives themselves will mature and become more complex, and this is an indicator that you are headed in the right direction. In the meantime, we must stay focused on the problems at hand, so that we solve the important ones quickly and gain momentum. Organizations must also focus on execution as solutions start pouring in from the analytics team.
It’s easier said than done—I know. However, there is light at the end of this tunnel. I have seen it, and I am sure you will too.
Hope to read your case study soon.
-Prateek Mehta