If you look at analytics and see “facebook”, “face book”, and “FB” in the utm_source, that company is not going to make it. Every analysis just got 10x harder because you have to account for all the misspellings in every filter. Garbage in, garbage out.
In order to record the source of traffic, analytics relies on UTM tracking codes in the URL...
I understand things have been busy, and people make mistakes
But just the other week I got a report where the numbers were all wrong because someone forgot to add “FB” to the filter in analytics
We missed out on 20% of the actual Facebook traffic
This is just one of a number of incidents, and the analytics team is complaining that analysis is taking too long because of it
I need you to make a concerted effort to standardise your work
Let’s create a template spreadsheet, a URL builder, where we can ensure everything is standard
Then if we use that religiously going forward, there’s no way errors can creep back in
Get this done by the end of the week and send it my way
This course is a work of fiction. Unless otherwise indicated, all the names, characters, businesses, data, places, events and incidents in this course are either the product of the author's imagination or used in a fictitious manner. Any resemblance to actual persons, living or dead, or actual events is purely coincidental.
In analytics it really is a case of garbage in, garbage out. Most analytics platforms record the source of a session once, and that value cannot be changed after the fact. That means if you ever misspell a tracking parameter, or use a non-standard name, your analytics team will live with that mistake forever. For example, “facebook”, “face book”, and “FB” are all treated as different sources, so every time you want to report on the performance of Facebook ads, you have to remember to filter for all three names.
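To see why fragmented source names are so painful, here is a minimal sketch of the reporting problem in Python. The session counts are made up for illustration; the point is that every report has to remember every variant ever used.

```python
# Hypothetical session counts: the same channel recorded under three names.
sessions_by_source = {"facebook": 1200, "face book": 90, "FB": 310}

# Any "Facebook" report must account for every variant, or it undercounts.
# Here we normalize by lowercasing and stripping spaces before matching.
facebook_total = sum(
    count for source, count in sessions_by_source.items()
    if source.lower().replace(" ", "") in {"facebook", "fb"}
)
print(facebook_total)  # 1600
```

Forget one variant in the filter, and the report silently drops that slice of traffic, which is exactly the 20% miss described above.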
The best teams adopt a strict naming policy to protect against this. By standardizing the names you use and recording them in a spreadsheet, you minimize the chance of a mistake, and you also make it easier for analysts to later unearth what names were used and what they mean, which speeds up their analysis considerably. When enforcing a naming protocol in a spreadsheet, you can even make constructing the URL easier, because data validation and dropdown menus ensure that only valid values can be selected.
The spreadsheet works in two steps. First, you make a lookup sheet with all valid values for source, medium, and campaign. Second, you use the CONCATENATE formula to glue everything together into a valid URL at the end, for the ops team to copy and paste into their campaigns. The system I’ve found most useful for adding granularity to the codes is an audience, creative, and message ID system. Each utm_campaign is a combination of these IDs; for example, a1-c1-m1 refers to audience 1, creative 1, message 1. This maximises flexibility while keeping the URLs relatively short and uncomplicated.
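The two-step spreadsheet logic can be sketched in Python: a lookup of valid values standing in for the dropdown validation, and string assembly standing in for CONCATENATE. The base URL and the sets of valid sources and mediums below are hypothetical examples, not a prescribed list.

```python
from urllib.parse import urlencode

# Stand-ins for the lookup sheet (example values, not a canonical list).
VALID_SOURCES = {"facebook", "google", "newsletter"}
VALID_MEDIUMS = {"cpc", "email", "social"}

def build_utm_url(base_url, source, medium, audience, creative, message):
    # Reject anything not on the lookup sheet, as a dropdown would.
    if source not in VALID_SOURCES:
        raise ValueError(f"invalid source: {source}")
    if medium not in VALID_MEDIUMS:
        raise ValueError(f"invalid medium: {medium}")
    # utm_campaign follows the audience-creative-message ID system.
    campaign = f"a{audience}-c{creative}-m{message}"
    # The CONCATENATE step: glue base URL and parameters together.
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

print(build_utm_url("https://example.com/landing", "facebook", "cpc", 1, 1, 1))
# https://example.com/landing?utm_source=facebook&utm_medium=cpc&utm_campaign=a1-c1-m1
```

Raising an error on an unknown value is the code equivalent of the spreadsheet refusing a non-dropdown entry: typos fail loudly at build time instead of silently polluting analytics forever.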
Hey, I'm going to teach you how to build a URL builder. This is a way to keep all of your tracking consistent and make sure you're not making any mistakes or reusing tracking codes when you're adding UTM parameters to your URLs. So for example, if you're running ads on Facebook, you want to put the base URL in here, then choose the source, which is Facebook, and choose a medium.