If you are a brand marketer or work at an agency, time is a limited resource that’s always in demand. If you work on digital campaigns, that’s only amplified, as the abundance of data sources and metrics means you can easily dedicate 20 or 200 hours to the same campaign.
It’s important to recognize the point of diminishing returns and when some common digital practices are simply not worth the time investment. Here are three of the biggest offenders.
Overly Granular UTM Tracking

UTM or site tracking codes are parameters appended to a landing page's URL that make it easier to organize campaign traffic neatly in a website's analytics. They can include several parameters that differentiate between things like source (e.g. Facebook, YouTube), medium (e.g. banner, video), or creative concept. By default, we set up UTMs for all of our campaigns to ensure that tracking is present, since it can't be applied retroactively, but we aim to use only the broader parameters.
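As a rough sketch of how this works in practice, here is a small Python helper that appends the standard `utm_source`, `utm_medium`, and `utm_campaign` parameters to a landing-page URL. The function name and the example values are hypothetical; the `utm_*` parameter names themselves are the standard ones analytics tools recognize.

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append standard UTM parameters to a landing-page URL,
    preserving any query string already present."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # e.g. facebook, youtube
        "utm_medium": medium,      # e.g. banner, video
        "utm_campaign": campaign,  # the broad campaign name
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical campaign values
tagged = add_utm("https://example.com/landing", "facebook", "banner", "spring_sale")
print(tagged)
# → https://example.com/landing?utm_source=facebook&utm_medium=banner&utm_campaign=spring_sale
```

Sticking to these three broad parameters, rather than generating a unique URL per ad size or creative variant, keeps the build manageable.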
It’s tempting to go too deep on tracking paid traffic. Depending on the number of tactics, creatives, and ad sizes, some campaigns can have hundreds of UTMs. Building those out takes a significant amount of time, delays campaign launch, and increases the chances of error.
More importantly, it doesn’t produce meaningful data, because it only measures ad clicks. Outside of paid search, a large portion of those clicks will be accidental, and the remainder are mostly out-of-audience users, bots, or people who don’t go on to take the actions you want anyway.
If you’re working on a performance campaign aiming for sales or some other measurable action, often over 90% of those conversions will come from people who were exposed to the ad, did not click, and ended up back on the site later (view-through conversions). So beyond the added time to get the campaign live, the extra effort to over-analyze a <10% sample of campaign activity is unlikely to unearth many meaningful insights.
Unrealistic Journey Mapping
The most successful campaigns use smart creative strategies to keep messaging relevant based on the actions someone has taken. A simple example is customized retargeting creative since someone who has visited your website is likely more qualified to become a customer than someone who has not. More comprehensive strategies will map out retargeting creative based on specific page visits, creating an even stronger connection with the audience.
But some creative strategies are unrealistic in how they represent the user journey. We’ve seen campaigns use incentives like contests and coupon codes to nudge people over the finish line, but then link to isolated landing pages that aren’t connected to the main website and aren’t indexed in search. That means if you see the ad and want to take advantage of the offer, the only way to do so is by clicking the ad. You’re asking someone to do something they inherently do not want to do – and often will not do – rather than organizing the campaign so they could find the same offer later, on their own terms.
Even if you get an ad in front of one of those rare unicorns that will click and go on to convert, how frustrating is it if you aren’t ready to take action when you first see the ad, forget the code, and are then stuck waiting for the ad to surface again?
Subtle A/B Testing
We’ve talked in detail about some of the challenges with A/B testing, and there are very few scenarios where it doesn’t make sense to do it. Digital is a test-and-learn medium, and it’s imperative to always be pushing for stronger results.
It’s important to recognize, though, that proper A/B creative testing means higher production costs, more complicated campaign setups, and, notably, extra effort in isolating and reporting results. That time is wasted if your A/B test is very subtle, such as a minor copy tweak or a call-to-action change.
Why? Because the truth is, people just aren’t that into your ads. They are often scrolling past in milliseconds, catching them only in their peripheral vision, and not paying attention to the details. Repeated exposures will help fill in some gaps, but when it comes to testing, go big or go home. Compare larger concepts – photos vs. illustrations, people vs. product, blue vs. red – then measure correctly, and you can be more confident that the results are worth basing future decisions on.
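As one illustration of what "measure correctly" can mean, here is a minimal sketch of a two-proportion z-test in Python, a common way to check whether the difference in conversion rate between two concepts is likely real rather than noise. The conversion counts below are hypothetical, and a real analysis would also plan sample sizes up front.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the conversion rates of variants A and B.
    Returns the z statistic and an approximate two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: photo concept (A) vs. illustration concept (B)
z, p = two_proportion_z(120, 10_000, 150, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With subtle variants, the underlying difference is small, so a test like this rarely reaches significance at realistic sample sizes – which is exactly why bigger creative swings give you results you can act on.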
Everyone wants smart, data-informed insights from their digital campaigns, but figuring out which numbers are actually meaningful, and focusing on the things that really move the needle, will result in better-performing campaigns and a lot less wasted work.