In marketing, there are countless myths: half-truths and statements that have been repeated so many times they start to sound true despite all evidence to the contrary. In particular, there are five myths I’ve heard so often in this industry that I’m setting the record straight by, well, busting them wide open. These beliefs are common, but I think that as marketing professionals, we should feel empowered to challenge ideas and ask whether they’re accurate, or whether they should be repeated at all. Here we go!
Myth #1: “If I’ve got an analyst and I’ve got Tableau, then I can check the box. I’ve got marketing analytics covered.”
Having the tools is a good start. If you have Tableau, you are indeed set up to create reports. Nearly every marketing organization has an analyst and a data visualization tool, and Tableau is often that tool. Sure, it could be any of a host of other software products, but ultimately, Tableau is nothing without someone who can actually use its data-visualization features.
Without that pairing—an analyst with Tableau—consider where all the data would have to sit: in Excel files, pulled out of PowerPoint decks and other scattered systems. That poor person is suffering through an excruciating, recurring process—daily or weekly—of munging and crunching disorganized data, especially in the marketing space, where data has to be pulled from so many different systems.
So, to really do analytics well, think bigger than Tableau and a staff of one. Think about your institutional objectives. Is that just media reporting, or more than that? If you only have a couple of sources of data, you might be able to get by with Tableau reporting. But if you’re trying to do real analytics, then data visualization has to be just one component of a bigger picture. You also need someone who can manage all that data and think about it critically. The bottom line is that, yes, analytics can mean a lot of different things, but Tableau and a single reporting person probably aren’t cutting it for you.
Myth #2: “Display media click-through rate is a metric that matters. It’s a metric that I should be concerned about.”
This one comes up often. In reality, most of the time, click-through doesn’t matter. Not to say it never matters, but it typically doesn’t when you really boil it down. Twenty years ago, click-through mattered. It was a big deal to be able to measure it at all in online advertising. You could look at CTR as a sign that people were engaging with your ads and caring about the message—an indication that people liked what you had.
Now, it doesn’t matter as much. It really only matters when clicking on a banner is the single most critical action you’re trying to drive in a campaign—and these days, we rarely run campaigns that hinge entirely on whether someone clicks a banner. Looking only at click-throughs is a false front. If we’re trying to drive store traffic, we should be measuring store traffic, not click-throughs. The same goes for brand affinity, likability and awareness—all more meaningful metrics that, 20 years ago, we were unable to quantify or measure reliably.
There’s another fundamental issue with looking at click-through rates. If you look at industry averages, CTRs come in at less than 1%. So, what we’re trying to optimize and benchmark is something that, out of the gate, has a 99% failure rate!
And that doesn’t even consider the click fraud that we simultaneously try to uncover and *hope* isn’t there. There are so many issues at the core of clicks—their nature, and whether they’re even important. Knowing what we know now, in most cases, CTR is not a metric we want to track or rely on.
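To make that failure-rate framing concrete, here’s a quick sketch. The impression and click counts below are made-up illustrative numbers, not industry data; the point is simply what a sub-1% CTR means for the other 99%+ of impressions:

```python
def ctr(clicks, impressions):
    """Click-through rate: fraction of impressions that produced a click."""
    return clicks / impressions

# Hypothetical campaign: one million impressions, 8,000 clicks.
impressions = 1_000_000
clicks = 8_000

rate = ctr(clicks, impressions)
print(f"CTR: {rate:.2%}")             # CTR: 0.80%
print(f"Non-clicks: {1 - rate:.2%}")  # Non-clicks: 99.20%
```

In other words, even a campaign beating many reported averages leaves more than 99% of impressions with no click at all—which is why the article argues for measuring outcomes like store traffic or awareness instead.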
Myth #3: “I should spend 1% of my media budget on analytics.”
This subject is also everywhere, and it’s tough because it’s partly true! If we’re spending between $5 million and $10 million from a media or marketing perspective, 1% of that spend will probably put us in the range of what we need to spend—or think about spending—on analytics. The problem comes into play on the low end. If we’re spending well under $5 million, say between $500,000 and $1 million, we’re not going to spend just $5,000 or $10,000 on analytics. We need substantially more budget than the 1% rule would suggest.
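One way to read the math above is as a percentage rule with a minimum floor. This is just a sketch of that idea—the 1% figure and the dollar ranges come from the discussion here, but the specific floor value is an illustrative assumption, not a prescription:

```python
def analytics_budget(media_spend, pct=0.01, floor=50_000):
    """Suggest an analytics budget as a share of media spend,
    but never below a minimum viable floor.

    pct and floor are illustrative assumptions, not benchmarks.
    """
    return max(media_spend * pct, floor)

# At $10M in media spend, the 1% rule holds on its own.
print(analytics_budget(10_000_000))  # 100000.0

# At $750k, 1% would be only $7,500—far too little—so the floor applies.
print(analytics_budget(750_000))     # 50000
```

The design point is simply that a flat percentage breaks down at small budgets; any real number should come from the business outcomes you’re measuring, as the next paragraphs argue.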
The best way to go about it is to think about what we’re trying to accomplish: Are we just worried about making the media as efficient as possible? If that’s the only concern, then spend might be really low.
If we focus on specific business outcomes, we get two points of data to measure against. One is the raw cost of accomplishing those objectives. The other is the full business impact: taking things like pricing and distribution into account requires a broader look at the business, and that warrants a higher spend—it’s not just marketing we’re looking at anymore.
This lets us gauge the cost against how much money is actually being produced. It’s easier to swallow a $150,000 or $200,000 analytics spend for a $2 billion business when you’re looking at the whole picture, versus a $1 million budget just for media. Look at the whole puzzle rather than just one piece. Setting an analytics budget as a percentage of media spend is less relevant, because there will be huge variance there.
Myth #4: “Industry benchmarks are a good way to measure our performance.”
Well, the short answer here is no. Industry benchmarks are so generic that they’re meaningless without specific context. Frankly, they’re often used by people looking to cover their rear—to give some kind of projection or point of reference that makes them look good. But they can be really misleading, because these benchmarks are often just an average of the status quo. A benchmark cannot answer the question “where should we be?” without our also understanding where the benchmark came from.
There’s another area where industry benchmarks fall short. If we’re trying to innovate, or do something different, we can’t necessarily compare ourselves to an industry benchmark that, at the very least, is based on some idea of how everyone else is already doing it.
It’s also incredibly easy to cherry-pick a benchmark that serves the work. How many reports have featured a benchmark reference conveniently chosen after the fact to flatter the campaign numbers? To be fair, benchmarks don’t really hurt anything, but if we’re spending hours trying to find the right one to pair with our actual campaign data, it’s a terrible use of time and resources. Making decisions based on that kind of comparison is a dangerous place from which to plan and execute.
Myth #5: “I’ve got big data and I’m doing good analytics that will provide certainty in predicting outcomes of my business, for my business.”
With this one, certainty is the key word. We’re all in this marketing world dealing with humans, and humans typically make irrational decisions, right? It’s hard to accept, but that’s marketing. We’re trying to influence decisions by helping people understand that our product is a good product that meets their needs, is easy to buy and is priced right.
So, with all that said, it’s difficult to put certainty into an equation. It’s impossible, given the unpredictability of human nature! We should think about how big data, analytics and all the tools we focus on drive confidence—confidence in the decisions that we need to make or should be making—not certainty.
Forecasting is a really good example, because we take attributes of what we think is going to happen across a business or marketing campaign, then put them into a quantitative algorithm to spit out some kind of number—the effect of the media, the growth in business, whatever the objective is. And even then, it’s an estimate, and that’s where working with good partners really helps.
Where we should be mindful is with the big data providers, tool sellers and vendors that sell certainty. Unfortunately, that certainty just doesn’t exist. In areas where we’re not reliant on human decisions, yes, there is certainty—but people make decisions very differently. Business decisions are, in many ways, bets. Guesses. While good analytics reduce risk, they never, ever guarantee a winning bet. But that’s what makes this industry exciting, right?