The backdrop to this week’s post is a paper by Germann et al. (2013): Performance implications of deploying marketing analytics. The paper gives a sobering account of the use (or lack of use) of analytics in marketing, and the impact this can have on the overall decision-making process, not just for marketing practitioners but also for customers and audiences.
A couple of standouts for me from the article: in a study of 587 executives from large international companies, only 10% of the companies regularly deployed marketing analytics. The biggest pushback on using analytics was that it slowed down the business and caused “analysis paralysis”, the inability to make decisions while waiting for data (Peters and Waterman 1982).
The term “analysis paralysis” takes on a completely different form for me. It is not so much about being unable to make a decision while waiting on data; rather, similar to consumer choice paralysis, it is about being unable to make a decision because of having too much information. The bigger issue, therefore, is not what to report on, but finding the balance between having the right data to tell the story and management’s ability to actually process that information and make decisions.
One of my key deliverables at work is to compile reports on what is happening across our social media landscape. High-level results (vanity metrics) provide the management team with a quick snapshot of how we are travelling. But what do these figures actually mean, and are they even useful?
Understanding what to report, and how to report it, is key here. All communication, including Social Media campaigns, should be based on some form of strategy. As Avinash Kaushik notes in his article Digital Marketing and Measurement Model, the strategy should contain clear goals and objectives, and the reporting should reflect and align with these. Some reporting you can cover with vanity metrics, but in most cases you have to dig deeper, much deeper, to fully understand and convey the results.
Simple charts, high-level results and key metrics will get the message across much better than complicated spreadsheets full of data, and often a simple infographic can serve as a summary. Adding a narrative to explain the results is very important, and if your report contains terms that people may not understand (such as impressions, mentions etc.), a simple glossary can work wonders.
Social Media Analytics
To help explain how Social Media analytics and metrics work, I have broken them down into three specific levels of information.
- High-Level Reporting (what the result was)
- Analysis (establishing what is happening in the data)
- Interpretation (determining what the results actually mean and any implications)
High-Level Reporting (Vanity metrics)
This is high-level information provided in the form of metrics for specific attributes such as impressions, engagement, number of likes, shares, retweets, video views and so on. Management love vanity metrics. They are easy to understand and they provide some indication that “something” is happening. However, without additional information they can be very misleading. For example: “the results for Facebook comments this week are 200% better than last week”. What a great result, right? Well, that really depends on whether everyone is commenting because they think your brand sucks, or perhaps the week before was just the worst ever and this one isn’t much better, it’s just better than last week. It’s easy to get caught up in big numbers and positive results without really understanding what they mean. That’s where analysis comes in.
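To make the point concrete, here is a minimal Python sketch (the numbers are made up for illustration) of how an impressive week-over-week percentage can sit on top of a weak result:

```python
def pct_change(previous: int, current: int) -> float:
    """Week-over-week change as a percentage of the previous week."""
    return (current - previous) / previous * 100

# "Facebook comments are up 200%!" ...
last_week, this_week = 10, 30
print(f"Change: {pct_change(last_week, this_week):+.0f}%")  # +200%

# ...but 30 comments may still be well below a typical week,
# and the percentage says nothing about the sentiment of those comments.
typical_week = 150
print(this_week < typical_week)  # True: still far below normal
```

A 200% lift off a tiny base is a very different story from a 200% lift off a normal week, which is exactly why the raw metric needs context.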
Analysis (examining the data)
Analysis means breaking down the metrics to look at things such as trends over time, performance against benchmarks, performance against competitors, platform comparisons and so on. In simple terms, this is looking at the data at a deeper level to identify what is happening within it. Analysis is an important step in identifying “what” is happening with your Social Media activity, but it often lacks the “why”, and this is where the interpretation of the results comes in.
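As a simple illustration of what “trends over time” and “performance against benchmarks” can look like in practice, here is a short Python sketch using hypothetical weekly engagement counts:

```python
# Hypothetical weekly engagement counts for a channel.
weekly_engagement = [120, 135, 128, 150, 160, 155, 170, 180]

def moving_average(values, window=4):
    """Smooth weekly noise so the underlying trend is easier to see."""
    return [sum(values[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(values))]

trend = moving_average(weekly_engagement)
print(trend)  # a rising trend, week on week

# Compare the latest week against an internal benchmark,
# e.g. the average of all preceding weeks.
benchmark = sum(weekly_engagement[:-1]) / len(weekly_engagement[:-1])
latest = weekly_engagement[-1]
print(f"Latest week vs benchmark: {latest / benchmark:.2f}x")
```

In a real report the benchmark might be a competitor’s figures or last quarter’s average; the mechanics are the same.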
Interpretation (deep level analysis of data and narrative explaining the results)
Data interpretation is a specific skill set that is seldom used or sufficiently resourced, yet it is one of the most critical stages of the whole data analysis process. It is much more than just looking at the data and picking out trends or movements; it is about understanding the results and their implications. It is about identifying not only what we know from the data but also what we don’t know, then filling that gap with factual data, knowledge and experience to identify opportunities, and finally communicating these findings in the form of a narrative to aid the decision-making process.
In the section where I work, we no longer have a dedicated resource for marketing analysis and interpretation, as it is not seen as a priority. Yet the interpretation stage is where a good analyst will start questioning the results, and will prove (or disprove) theories or assumptions to provide an evidence-based narrative. Without interpretation of the results by someone who understands the data and the situation it represents, it is left to the audience to work out what’s happening, and that can be a recipe for disaster.
For example, during a recent campaign one of our employees announced that a video we had released on social media was our most successful yet. This statement was based entirely on vanity stats (the number of times the video was viewed). When the data was analysed further, the trend did in fact show that the number of video views was higher than for any other video in the previous six-month period. High-five everyone, break out the champagne!
When it came to interpreting the results, however, this wasn’t the case. The objective of the campaign was to encourage viewers to watch the video to the end. The previous campaign had run on YouTube, while the most recent one was an embedded video on Facebook. It also transpired that the view count included Facebook auto-plays, and a deeper dig into Facebook Insights showed that 95% of those who viewed the video did so for less than 10 seconds. In terms of the objective, this was actually one of the most underperforming campaigns to date, which highlights why interpretation of the results is such a critical step. Without it, the next campaign would have just followed the same course.
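The gap between the raw view count and the actual objective can be shown in a few lines of Python. The 95% short-view share comes from the story above; the total view count here is a hypothetical round number:

```python
# Hypothetical raw count; the 95% short-view share is from the
# Facebook Insights finding described above.
total_views = 100_000
under_10_sec = int(total_views * 0.95)   # views of less than 10 seconds
meaningful_views = total_views - under_10_sec

print(f"Raw views:        {total_views:,}")                       # the vanity headline
print(f"Views >= 10 sec:  {meaningful_views:,}")                  # what the objective cares about
print(f"Meaningful share: {meaningful_views / total_views:.0%}")  # 5%
```

Measured against the objective, the “most successful video yet” delivered meaningful viewing from only a small fraction of its headline number.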