Tracking: Organizational Challenges

There are plenty of technical guides online about tracking user behaviour with Google Tag Manager (GTM), but I haven't found much about dealing with the organizational challenges that may arise when making changes to tracking.

One of my main projects at Carmudi was improving our tracking. The key challenge was that I was not building tracking entirely from scratch. We already had a buggy tracking implementation that was feeding data into some of the most important reports in the organization. Stakeholders get nervous when you propose changes to tracking, even if tracking currently sucks.

As a product manager, my primary interest in tracking is to feed higher-quality data into the product decisions my team makes. Being “data-driven” is chic, but having reliable and relevant data is not a given. It requires some strategic forethought to track the right things and track them properly.

The first thing I did was consolidate all the country-specific containers into a single global container in GTM. Our application is nearly identical between countries, so this was easy from a technical perspective: we removed outdated tags, replaced country-specific IDs with lookup table variables, and updated triggers to match. The second major change was to rename our events so that they communicate user behaviour more transparently.
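Lookup table variables are configured in the GTM web UI rather than in code, but the logic they replace is easy to sketch. Here is a hypothetical example in TypeScript: a single mapping from hostname to GA property ID stands in for a set of hard-coded country containers. The hostnames and IDs are placeholders, not our real ones.

```typescript
// The mapping a single GTM Lookup Table variable encodes: resolve the
// page's hostname to the right country-specific GA property ID, so one
// global GA tag can serve every country. All values are placeholders.
const GA_PROPERTY_BY_HOST: Record<string, string> = {
  "www.example.com.ph": "UA-0000001-1",
  "www.example.co.id": "UA-0000002-1",
  "www.example.com.mx": "UA-0000003-1",
};

function gaPropertyFor(hostname: string): string | undefined {
  // No default value set: returns undefined for unknown hostnames,
  // just like a lookup table without a fallback row.
  return GA_PROPERTY_BY_HOST[hostname];
}

gaPropertyFor("www.example.co.id"); // -> "UA-0000002-1"
```

The benefit is that adding a new country becomes a one-row change to a single variable, rather than cloning and maintaining an entire container.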

A few lessons learned from the process:

Reports are fragile

Tracking data feeds into many teams' reports, some of which you may not be aware of. These reports can be quite fragile to changes in the tracking layer. Even worse than breaking a report outright is subtly violating one of its underlying assumptions, quietly reducing its accuracy and usefulness without anyone realizing it.

The best way to mitigate this risk is to coordinate tightly with the BI team. Sit down and trace all the "customers" of tracking data to get a better sense of how changes will impact various teams and reports. It is especially important to be aware of which reports are consumed by external stakeholders such as investors. These reports often boil the data down to a single number in a spreadsheet cell, without any context around it. For example, inserting a GA event on a page can silently change its bounce rate, because event hits count as interactions by default.
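If you do need to fire an event on a landing page without touching its bounce rate, Universal Analytics lets you flag the hit as non-interactive. A minimal sketch using analytics.js; the category and action names are illustrative:

```typescript
// Tell TypeScript about the global `ga` function that analytics.js defines.
declare function ga(...args: unknown[]): void;

// Default behaviour: an event hit counts as an interaction, so any
// session containing it is no longer counted as a bounce.
ga("send", "event", "Video", "play");

// Passive events (e.g. scroll tracking, auto-play) should be flagged as
// non-interaction hits so they leave the bounce-rate calculation intact.
ga("send", "event", "Video", "play", { nonInteraction: true });
```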

People are overly confident in their data

Making decisions on real-world data is not as clean-cut as a case study in business school, and it is always good practice to question the source and validity of the data behind a decision. Unfortunately, some decision-makers lose sight of this. Prepare for push-back against your proposed fixes or improvements to tracking, since they imply that prior decisions were made with flawed data. No data is infallible, but this can be an uncomfortable reality for some managers.

Decouple tracking from KPI definition

The ideal tracking event crisply describes the nature of the user interaction without commenting on its value to the business. Event names such as "Unique Lead" or "Customer Intent" are opaque: they give no visibility into what those actions actually are, or why they matter to the business. It is better to push the task of KPI definition "up the stack" to management, so that the people ultimately consuming the tracking data are better equipped to make decisions on it.
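For illustration, here is a sketch of the difference as dataLayer pushes. The event and field names are hypothetical, not our actual schema:

```typescript
// Tell TypeScript about the global dataLayer array that GTM defines.
declare const dataLayer: Array<Record<string, unknown>>;

// Opaque: bakes a business judgment ("this is a lead") into the
// tracking layer, where nobody downstream can see what happened.
dataLayer.push({ event: "unique_lead" });

// Transparent: describes exactly what the user did. Management or BI
// can then define the KPI on top of it, e.g. "a lead is the first
// contact_form_submit per listing per user".
dataLayer.push({
  event: "contact_form_submit", // hypothetical event name
  listingId: "12345",           // hypothetical field
  contactMethod: "email",       // hypothetical field
});
```

With the transparent version, changing the definition of a "lead" is a reporting change rather than a tracking change, so it never risks breaking the raw data.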
