How to explain the discrepancies in analytics data across platforms

By leveraging various digital channels, companies are trying to reach more customers across larger geographies and grow their businesses, using multi-device analytics to track the results. The effectiveness of these digital campaigns is measured with metrics such as AdWords clicks, downloads, and app-install numbers. From this array of metrics, marketers and advertisers get a clear picture of a campaign's effectiveness.

Most organizations are moving towards a culture of data-driven decision making. The data sets they lean on are spread across multiple platforms and sometimes similar attributes need to be collected and analyzed to become the inputs for strategic decisions. In such an environment, any error, discrepancy, or gap could derail the business strategy.

But what causes these data discrepancies?

Tracking code:

This is by far the most common reason for data mismatches across platforms. The tracking code can be set up on the product directly or through a tool like Google Tag Manager. Whatever the approach, marketers must ensure that the code is set up correctly to avoid this problem. Various tag assistant plugins are available that alert you when a tracking tag is duplicated or placed incorrectly. SDKs are also available for almost all operating systems and can be leveraged to place tracking codes correctly.
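A duplicate-tag check like the ones those plugins perform can be sketched in a few lines. This is a minimal illustration, not a real plugin: the measurement ID and page HTML below are made up, and a production check would parse the DOM rather than scan raw text.

```python
import re

def count_tracking_tags(html: str, measurement_id: str) -> int:
    """Count how many times a tracking ID appears in a page's HTML.

    More occurrences than expected usually means the tag was installed
    twice (e.g. once in the template and once via a tag manager), which
    double-counts pageviews.
    """
    return len(re.findall(re.escape(measurement_id), html))

# Hypothetical page with the snippet pasted twice:
page = """
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX111"></script>
<script>gtag('config', 'G-XXXX111');</script>
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX111"></script>
<script>gtag('config', 'G-XXXX111');</script>
"""

hits = count_tracking_tags(page, "G-XXXX111")
if hits > 2:  # one loader URL + one config call is the expected pair
    print(f"Warning: tracking ID found {hits} times; possible duplicate tag")
```

Running this kind of check in a deployment pipeline catches duplicated tags before they ever inflate the numbers.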

Time Zone settings:

Different analytics tools use different time zone settings by default: some use local time, while others default to US Eastern time. Needless to say, this can cause differences in data point counts. An e-commerce organization checking the quantity of a product sold in the last 24 hours will see different numbers in different tools. It is therefore of primary importance that the default time zone be made consistent across tools organization-wide.
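The effect is easy to demonstrate. In this sketch (the timestamps are invented), the same event reported by two tools in two zones lands on different calendar days until both are normalized to one organization-wide zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Two tools report the "same" event, one in US Eastern, one in UTC.
eastern_event = datetime(2024, 3, 1, 23, 30, tzinfo=ZoneInfo("America/New_York"))
utc_event = datetime(2024, 3, 2, 4, 30, tzinfo=timezone.utc)

# A naive date comparison puts them on different days...
assert eastern_event.date() != utc_event.date()

# ...but normalizing both to one organization-wide zone shows they match.
org_tz = ZoneInfo("UTC")
assert eastern_event.astimezone(org_tz) == utc_event.astimezone(org_tz)
print("Same event once both reports use the same time zone")
```

A daily-sales report bucketed by `.date()` would count this sale on March 1 in one tool and March 2 in the other, which is exactly the 24-hour discrepancy described above.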

Traffic filtration:

Platforms like Google use a few underlying principles to filter traffic. Organizations pay Google based on the number of clicks, and attempts by a single user to click a page repeatedly to inflate that number are not unknown. This, of course, distorts the data sets. Some tools can weed out such clicks. Beyond that, it is also not uncommon for exclusion and inclusion filters to be applied with inconsistent logic, which adds to the problem. In the interest of data sanctity, the same filtering logic must always be applied to avoid differences across platforms.
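One common filtering principle is to drop repeated clicks from the same user within a short window. The sketch below is an assumption about how such a filter might work, not any platform's actual logic; the window length and click data are invented:

```python
from datetime import datetime, timedelta

def dedupe_clicks(clicks, window_seconds=30):
    """Keep only the first click per user within a rolling window.

    `clicks` is a list of (user_id, timestamp) tuples, assumed sorted by
    time. Repeated clicks from the same user inside the window are
    treated as inflation and dropped.
    """
    last_seen = {}
    kept = []
    window = timedelta(seconds=window_seconds)
    for user, ts in clicks:
        if user not in last_seen or ts - last_seen[user] > window:
            kept.append((user, ts))
        last_seen[user] = ts
    return kept

t0 = datetime(2024, 1, 1, 12, 0, 0)
clicks = [
    ("u1", t0),
    ("u1", t0 + timedelta(seconds=5)),    # same user, 5s later: dropped
    ("u2", t0 + timedelta(seconds=10)),   # different user: kept
    ("u1", t0 + timedelta(seconds=120)),  # outside the window: kept
]
print(len(dedupe_clicks(clicks)))  # 3
```

The point of the example is the consistency requirement: if one platform uses a 30-second window and another uses none, their click counts will never reconcile.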

Data Sampling:

Erroneous data sampling can result from anything from time-zone settings and tracking code allocation errors to the use of different tools for sampling. If the sample size is small, this is not much of an issue, but with a large sample the analysis can suffer. To avoid such errors, adjust the relevant parameters. For instance, Google applies sampling when a report covers more than 500K sessions. If discrepancies are spotted, adjusting the precision/speed ratio or changing the time duration can help address the issue.
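Changing the time duration works because shorter date ranges keep each query under the sampling threshold. The helper below is a rough sketch of that idea, assuming the 500K figure from above and uniform traffic; the session counts are hypothetical:

```python
import math

SAMPLING_THRESHOLD = 500_000  # session count at which sampling kicks in (per the article)

def split_date_range(total_sessions: int, days: int,
                     threshold: int = SAMPLING_THRESHOLD) -> int:
    """Suggest how many sub-ranges a report should be split into so that
    each stays under the sampling threshold, assuming uniform traffic."""
    chunks = math.ceil(total_sessions / threshold)
    return max(1, min(chunks, days))  # can't split finer than one day

# 2.3M sessions over a 30-day report -> query in 5 smaller chunks
print(split_date_range(2_300_000, 30))  # 5
```

Querying five ~6-day windows and summing the unsampled results yields exact counts where a single 30-day query would have been sampled.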

Attribution Models:

Organizations can’t do much here. Differences in attribution models across platforms account for anywhere from 10 to 15% deviation, caused by gaps in the metrics used to measure conversions and their associated attributes. Cross-device conversion questions add to the issue. Even though not much can be done about it, marketers need to understand why this happens so they can allow for it in their own campaign analyses.
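To see why different models disagree, compare two simple ones on a single journey. This is a toy illustration with an invented journey; real platforms use more elaborate models:

```python
def last_click(journey):
    """Credit the conversion entirely to the final touchpoint."""
    return {journey[-1]: 1.0}

def linear(journey):
    """Spread credit evenly across all touchpoints."""
    share = 1.0 / len(journey)
    return {ch: sum(share for c in journey if c == ch) for ch in set(journey)}

# One hypothetical user journey ending in a conversion:
journey = ["paid_search", "social", "email", "paid_search"]

print(last_click(journey))  # {'paid_search': 1.0}
print(linear(journey))      # paid_search: 0.5, social: 0.25, email: 0.25
```

If one platform reports last-click numbers and another reports linear numbers, the social channel shows zero conversions in the first report and a quarter of a conversion in the second, from identical underlying data.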

Cross-device conversions:

In our multi-device age, data discrepancies arise across platforms when customers click an ad on one device, say a smartphone, and then complete the conversion on another, say a desktop. Traditionally, platforms like Facebook have been quite accurate in reporting such conversions because they tie the calculation to the user journey rather than depending on cookies for tracking, which is not that robust a methodology. Google, too, has come up with a system for tracking cross-device conversions accurately. That said, the complexity adds its own issues.
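The difference between cookie-based and journey-based matching can be sketched as a simple join. The records below are invented, and this is only a schematic of the idea, not any platform's pipeline:

```python
# Clicks and conversions keyed two ways: by device cookie and by logged-in user.
clicks = [
    {"cookie": "c-123", "user": "alice", "device": "phone"},
    {"cookie": "c-456", "user": None, "device": "tablet"},
]
conversions = [
    {"cookie": "c-999", "user": "alice", "device": "desktop"},  # new device, new cookie
]

def attributed(clicks, conversions, key):
    """Count conversions whose `key` matches some prior click."""
    clicked = {c[key] for c in clicks if c[key] is not None}
    return sum(1 for conv in conversions if conv[key] in clicked)

# Cookie-based matching misses the phone->desktop journey entirely;
# user-based (journey) matching catches it.
print(attributed(clicks, conversions, "cookie"))  # 0
print(attributed(clicks, conversions, "user"))    # 1
```

A cookie-keyed platform would report this campaign as having zero conversions while a user-keyed platform reports one, which is exactly the cross-device discrepancy at issue.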

Clicks and Devices:

This is a coming together of the errors thrown up by attribution models and cross-device conversions. It is important to understand how clicks by the same user on multiple devices are counted; a robust tracking methodology can help arrest the problem.

Single source of truth:

It’s time for marketers to ask, “Are we looking at the same report?” to understand and arrest these problems. Different reporting tools collate data differently, and discrepancies arise from problems like inconsistent geolocation or time zone settings. Attribution models, cross-device conversions, and sampling issues further widen the gap. Different reports also have different purposes, and the same data might be represented differently, so it is important for marketers to know whether they are looking at the right report for the right purpose.

We cannot emphasize enough the importance of making data consistent across platforms. In essence, data can make or break an organization’s strategy. Being mindful of the factors outlined here lets marketers build in tolerances for them, preventing some of the detrimental effects of the disparities and helping them arrive at more conscious, grounded digital strategies.
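Building in tolerances can be as simple as an automated reconciliation that flags only metrics whose cross-platform deviation exceeds an expected band. A minimal sketch, with hypothetical numbers and a 15% tolerance roughly matching the attribution-model deviation cited earlier:

```python
def flag_discrepancies(report_a, report_b, tolerance=0.15):
    """Compare the same metrics from two platforms and flag any whose
    relative difference exceeds the tolerance."""
    flagged = {}
    for metric in report_a.keys() & report_b.keys():
        a, b = report_a[metric], report_b[metric]
        diff = abs(a - b) / max(a, b)
        if diff > tolerance:
            flagged[metric] = round(diff, 3)
    return flagged

# Hypothetical daily numbers from two analytics tools:
platform_a = {"clicks": 10_000, "conversions": 480, "sessions": 9_200}
platform_b = {"clicks": 10_400, "conversions": 610, "sessions": 9_000}

print(flag_discrepancies(platform_a, platform_b))  # {'conversions': 0.213}
```

Deviations inside the tolerance band are treated as expected cross-platform noise; only the conversions gap, which exceeds it, warrants investigation.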
