Picture a house built on the side of a cliff. As the cliff weathers over the years, if no action is taken to shore up the foundation, eventually the house will fall off the cliff. The lesson? Do a little maintenance. Ward off a catastrophe.
What’s this got to do with data?
Without attention, data quality can erode.
Data quality erodes when the processes that support it deteriorate. If left unattended, your digital analytics data quality (and your organization’s faith in analytics) might suffer a slow death by a thousand cuts. The catastrophe? Possibly getting fired.
Here are some examples of processes breaking down, weakening data quality.
- A form is changed on a website. The work was outsourced, and the developer fails to include the tag container on the new form pages. Not only is the form not measured; visits fracture, and analytics now shows self-referring traffic. Visits go up, but average visit duration declines, as does the fraction of new visitors.
- The media centre is redesigned. The method by which embedded YouTube videos are deployed is “tweaked”. The integration with YouTube no longer works properly, and video completions drop to zero.
- Organic search traffic jumps. Content has not changed, and the backlink profile does not appear to have changed either. Because of “not provided”, it is difficult to judge which search queries are driving the increase. Cheers result, because clearly something has been improved! However, Webmaster Tools does not show an increase. This continues for a few weeks, until it is discovered that an agency thought to be running only display ads is also running search ads, which are untagged. Rather than as paid search, this traffic is therefore showing up as unpaid, organic traffic.
If this continues, eventually nobody will rely on the data at all, let alone for insight (i.e. the house falls off the cliff). Digital analytics will be viewed as a failure and a waste of time.
To prevent this from happening, somebody needs to be assigned ownership. This owner:
- Must put systems and processes in place to proactively prevent situations that cause an implementation gap.
- Cannot delegate the process to a tool. Expecting a tool subscription to automatically audit data and implementation quality is an abdication of responsibility.
To assess your risk of data quality failure, we suggest asking yourself a few questions about data quality governance:
- Whose neck is on the line when a problem such as the situations above arises? If nobody’s neck is on the line, you don’t have an owner who is accountable and responsible for data quality.
- Which role/position is responsible for preventing problem situations from occurring? All too often, responsibility for digital data quality resides with a knowledgeable person but is not written into the role/position description; i.e. this person is both the process and the owner. If that person moves to another job, governance and maintenance stop, gaps appear, and a data quality cleanup project ensues 6-12 months later.
- Do projects contain checkpoints to assess the impact on data quality and/or check that the measurement can actually occur? Is measurement and analysis a required deliverable of digital projects? Who is accountable for ensuring these checkpoints occur or that measurement of success outcomes does occur? Is data collection quality checked while the new site/microsite/app is in the test/staging stage?
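Part of that last checkpoint can be automated. As a minimal sketch (the container marker and page paths are hypothetical placeholders, not a prescription for any particular tool), a pre-release check might scan the HTML of staged pages and flag any that lack the tag container:

```python
# Hypothetical marker identifying the tag container in page HTML.
# Substitute whatever snippet your own implementation uses.
TAG_SNIPPET = "googletagmanager.com/gtm.js"

def missing_tag(pages):
    """Return the paths of pages whose HTML lacks the tag container."""
    return [path for path, html in pages.items() if TAG_SNIPPET not in html]

# Example: two staged pages, one of which omits the container.
staged = {
    "/form/step1": "<html><head><script src='https://www.googletagmanager.com/gtm.js?id=GTM-XXXX'></script></head></html>",
    "/form/step2": "<html><head></head><body>no tags here</body></html>",
}
print(missing_tag(staged))  # -> ['/form/step2']
```

A check like this does not replace an accountable owner; it is one of the systems and processes that owner would put in place, run against staging before each release.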
Because every organization is slightly different, boilerplate governance frameworks will not work. Governance needs to make sense for your organization, or it will not take root or be effective; every organization needs to think this through for itself. What is absolutely necessary, however, is an owner (neck on the line) who has the resources (budget) to integrate and maintain a process that will ensure useful data quality, making your analysts and managers smile.
What do you think? Is this enough to get started, and stop the erosion?
* Image credit: The New Yorker Anti-Caption Contest 277