Letting go of yesterday

In advertising and marketing, campaign performance – and by extension, success – is almost universally measured in the context of comparison to previous campaigns.

The answer to “how did we do?” is invariably phrased in terms of “this campaign delivered X% more than that campaign”, or “we had Y% less traffic this week than this week last year”. We use monthly reporting, dashboards and trend modelling around past performance to judge how well we’re doing in the now.

This approach can be dangerously short-sighted.

Let’s say that your campaign did 20% better than your previous one. Or maybe it only delivered half as much traffic. Or it resulted in fewer conversions than last time. Is that good, or bad? Who decides?

The answer is, anyone with an opinion. Which isn’t a good way to plan or run a strategy.

The Report

The only thing which matters about how well your campaign or marketing activity performed is whether or not it performed as well as you wanted or expected it to. How well your previous campaign performed, or how much traffic your website received on the same day the previous year, is completely inappropriate as a singular yardstick for success.

In a world where monthly reports, campaign wash-ups, dashboards and constant wrangling over have-the-numbers-gone-up-or-down-today – whether it’s an agency delivering campaign summaries to a client, or an internal employee producing status overviews – ‘the report’ is an institution, an expected deliverable, and the numbers it holds are the trigger for reward and promotion, or haranguing and thinly veiled threats.

Our entire worldview is shaped by yesterday’s numbers.

Yesterday’s numbers

Yesterday’s numbers aren’t a reliable measure of relative performance. Yesterday’s numbers aren’t a signpost to whether today’s numbers are good, bad, or indifferent.

Sure, last month you did some similar marketing, so it’s not unreasonable to expect that this month’s similar marketing might produce similar results. You learned some stuff, too, and applying those findings should make things work harder this time around. But your marketing didn’t control the weather. Or the stock market. Or the phase of the moon. Your relationship to your competitors, the rest of your vertical, and the wider market is (likely radically) different from what it was. Your channel activity – and that of your competitors – has likely changed; it’s closer to pay-day; and, oh, lots of people were probably on holiday around some awkwardly timed bank holidays last month.

Comparing your performance this month to last month in terms of the numbers you saw then assumes that you’re operating in a marketing vacuum. Nothing could be further from the truth.

Historical context can be useful

I’m not saying that historical performance is completely irrelevant, or that it doesn’t have a role in setting targets (and, in some cases, it might be perfectly valid to set a target that is based on an uplift from historical performance). 

What I’m suggesting is that when planning a campaign or running ongoing marketing, you really need to have some clear definitions of what success looks like.

For KPI setting, we should be working back from the highest level of the business to identify how much value needs to be generated to cover costs, maintain profit margins, and deliver required uplifts. Then we might choose to get a little ambitious, whilst considering and accounting for as many influencing factors as possible.
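As a rough illustration of what that working-back looks like, here’s a minimal sketch in Python. Every figure in it is a hypothetical placeholder rather than a benchmark; the point is only the direction of travel, from business-level costs and margins down to the traffic and conversion targets a campaign is accountable for.

```python
# Minimal sketch: working back from business-level numbers to campaign targets.
# All figures and names are hypothetical placeholders, not real benchmarks.

fixed_costs = 40_000          # costs the activity needs to cover this period
required_margin = 0.25        # profit margin the business wants to maintain
required_uplift = 0.10        # growth the business has committed to deliver
average_order_value = 80      # revenue per conversion
conversion_rate = 0.02        # share of visits that convert

# Revenue needed to cover costs at the required margin, plus the uplift.
required_revenue = (fixed_costs / (1 - required_margin)) * (1 + required_uplift)

# Translate that into the marketing KPIs the campaign is accountable for.
required_conversions = required_revenue / average_order_value
required_visits = required_conversions / conversion_rate

print(f"Revenue target:    {required_revenue:,.0f}")
print(f"Conversion target: {required_conversions:,.0f}")
print(f"Traffic target:    {required_visits:,.0f} visits")
```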

With those thresholds defined, previous performance should absolutely be considered when setting appropriate targets against KPIs, but only as one of many factors: for example, as an estimate of market size, and therefore of where you might expect to see diminishing returns on bought media. It’s criminal to simply set targets as a percentage uplift on previous performance.
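To illustrate that “one of many factors” point, here’s a small, hedged sketch of how previous campaigns might inform a rough diminishing-returns estimate for bought media. The response curve and its parameters are made up for illustration; a real exercise would fit them to your own historical spend and conversion data.

```python
# Minimal sketch: previous performance as one input into a rough estimate of
# where bought media starts to hit diminishing returns. The curve shape and
# parameters are hypothetical, not fitted to any real channel.

def expected_conversions(spend, ceiling=1_200, half_spend=15_000):
    """Saturating response: approaches `ceiling` conversions as spend grows;
    `half_spend` is the spend at which half that ceiling is reached."""
    return ceiling * spend / (spend + half_spend)

previous_budget = 20_000
for budget in (previous_budget, previous_budget * 1.5, previous_budget * 2):
    conversions = expected_conversions(budget)
    # Marginal conversions from the next 1,000 of spend at this budget level.
    marginal = expected_conversions(budget + 1_000) - conversions
    print(f"{budget:>8,.0f} spend -> ~{conversions:,.0f} conversions "
          f"(+{marginal:.0f} per extra 1,000)")
```

The useful output isn’t the absolute numbers, which are invented, but the shrinking marginal return, which is the kind of signal previous performance can legitimately contribute to target-setting.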

Organisations need to learn how to set KPIs

Typically, this kind of thinking doesn’t happen because organisations haven’t learned how to set effective goals and KPIs. When you don’t have a clear definition of what success looks like, with proper KPIs, targets and success thresholds, it’s easy to fall back onto “well, how did we do last time?” and to apply an arbitrary uplift. Marketing managers, stakeholders and budget-holders often fall back to this kind of thinking because they’re not educated or empowered enough to do serious, proper modelling and forecasting; and because we have huge amounts of data, reports and spreadsheets, it’s very easy to just point at a series of numbers and demand more.

I’m going to go so far as to state that any regular reporting (note my usual distinction between reporting and analysis) which compares this month or this year against similar previous periods is a waste of time, except when it’s used as part of a wider, considered decision-making process.

Agencies are tied to pointless monthly reports

In the agency world, the whole concept of monthly reports makes me deeply frustrated. They exist to provide accountability and to chart the impact of work done, but in reality they do little towards this, and often cause more harm than good in the long run. Spreadsheets stating that this KPI has gone up a little bit compared to last month, or that KPI has gone down a little bit compared to last year, do nothing but breed mistrust.

The people producing the reports take no value from them because structured reporting of vanilla data doesn’t generate or allow for insight and analysis. 

The recipients get nothing but unanswered questions because the report doesn’t say why things have changed – and it’s often simply because stuff is different from before.

And that’s not the kind of narrative that engenders trust. So we collectively spend more time constructing theories that satisfy all the parties involved about what happened a year ago, around a campaign that was influenced by a vast number of immeasurable external factors, than we do actually influencing the numbers.

I’m sure that people in-house, working within even the most data-driven organisations, have the same challenge.

Reports should be framed with expectations

Monthly reports should state what we hoped and expected to achieve now, whether we did so or not, and what needs doing next. Comparing this month to last month’s data as a regular or decision-making exercise is a poor use of time. Of course, we want to achieve more than we did last year, but using that as an arbitrary measure of success is a risky alternative to setting deliberate, considered targets for the now.
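By way of illustration, here’s a minimal sketch of that framing: each KPI is reported against the target we set for the period, with a status and a next action, rather than a comparison to last month. The KPI names, targets and tolerances are hypothetical placeholders.

```python
# Minimal sketch of report framing: expectation, outcome, and what to do next.
# The KPIs, targets and tolerances are hypothetical placeholders.

kpis = {
    # name: (target for the period, actual result, acceptable shortfall)
    "revenue":     (58_700, 61_200, 0.05),
    "conversions": (730,    640,    0.05),
    "visits":      (36_700, 35_900, 0.10),
}

for name, (target, actual, tolerance) in kpis.items():
    shortfall = (target - actual) / target
    if actual >= target:
        status, action = "hit", "maintain current activity"
    elif shortfall <= tolerance:
        status, action = "near miss", "monitor; no change yet"
    else:
        status, action = "missed", "investigate and adjust the plan"
    print(f"{name:<12} target {target:>8,} | actual {actual:>8,} | "
          f"{status:<9} -> {action}")
```

The column that matters here is the action, not the deltas; the numbers only exist to trigger a decision against the targets we committed to.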

So let’s let go of yesterday. Let’s look at where we are now, and whether or not we’re where we want to get to. I’d imagine that most of the time, the answer will be “no, we’re not”.

So go do something about it; use the time you would otherwise have spent trawling through your analytics package to see how many visitors you got this time last month.
