As the digital advertising industry increases in complexity, the most successful publishers leverage data as a strategic advantage. Publishers who use data most effectively generate more revenue from their inventory, use fewer resources to manage their ad operations, and make their customers happier.
However, harnessing the power of data grows more challenging every year. New channels keep emerging, and changing customer behaviors create additional points of contact with customers. To keep up, publishers are investing more resources in aggregating and analyzing data. And as the volume of data grows, maintaining data quality becomes an enormous challenge.
There are three key areas to consider when assessing data quality in your organization:
Without data that satisfies these three criteria, users lose trust in the data. For publishers, this is a serious problem, since many teams rely on data to do their jobs effectively. Here are just a few examples, on both the programmatic and direct sides of the business, where data quality is key:
When users don’t trust the data, they spend enormous resources manually identifying and correcting erroneous information. Publishers aren’t alone in this: more than half of businesses spend more time cleaning data than using it.
The difficulty of maintaining quality data presents a big opportunity for those organizations who choose to tackle it. By understanding the challenge and developing a process for maintaining data quality, publishers benefit from more accurate insights, more efficient teams and a significant competitive edge:
Advertising data is fragmented, and it is getting more so every day. We’ve seen the average number of data sources in our customers’ ad tech stacks grow from 16 in 2015 to 29 in 2018.
Even if you streamline your ad tech stack, teams will inevitably continue to swap systems in and out, introducing more complexity every year. Again, drawing on our own experience, we’ve seen not only the number of sources grow but also the turnover of data sources in our customers’ ad tech stacks increase significantly in recent years.
The result is that the analytical landscape for publishers is becoming more complex. Publishers collect huge volumes of data from different sources, with no consistent data structures. To make the data useful, teams must aggregate information from differently structured data silos and merge it into a consolidated repository with a consistent data structure.
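As a rough illustration of that aggregation step (source names, field names, and unit conventions here are hypothetical, not any particular vendor's actual feed), normalizing records from two differently structured sources into one shared schema might look like:

```python
# Hypothetical example: two ad-revenue sources report the same facts
# under different field names and units. Map each into a shared schema
# before loading into a consolidated repository.

COMMON_FIELDS = ("date", "ad_unit", "impressions", "revenue_usd")

def from_source_a(row):
    # Source A reports revenue in whole dollars.
    return {
        "date": row["report_date"],
        "ad_unit": row["placement"],
        "impressions": row["imps"],
        "revenue_usd": row["revenue"],
    }

def from_source_b(row):
    # Source B reports earnings in micros (millionths of a dollar).
    return {
        "date": row["day"],
        "ad_unit": row["unit_name"],
        "impressions": row["impr"],
        "revenue_usd": row["earnings_micros"] / 1_000_000,
    }

def consolidate(rows_a, rows_b):
    """Merge both feeds into one list with a consistent structure."""
    return [from_source_a(r) for r in rows_a] + [from_source_b(r) for r in rows_b]
```

The hard part in practice is not the mapping itself but keeping dozens of such mappings correct as each vendor's schema drifts over time.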
That’s easier said than done since different vendors supply data in different ways.
For example, each SSP’s reporting system is complex, and there’s no guarantee it aligns with the reporting formats of other sources. Some platforms are limited in their capabilities, while others are extremely sophisticated. Pooling data often lowers the standard to the level of the most limited platform or data source.
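One concrete consequence of that "lowest common denominator" effect (the source names and dimension lists below are hypothetical): a pooled report can only break out by the dimensions every source supports.

```python
# Hypothetical capability lists: each platform exposes a different set
# of reporting dimensions. A pooled, cross-source report is limited to
# the dimensions all of its sources share.
SOURCE_DIMENSIONS = {
    "ssp_a": {"date", "ad_unit", "geo", "device", "deal_id"},
    "ssp_b": {"date", "ad_unit", "device"},
    "ssp_c": {"date", "ad_unit", "geo", "device"},
}

def shared_dimensions(sources):
    """Dimensions available when pooling data across all given sources."""
    return set.intersection(*(SOURCE_DIMENSIONS[s] for s in sources))
```

Here, adding the limited "ssp_b" to a pooled report silently removes "geo" and "deal_id" as reporting options, even though the richer sources support them.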
Operational reporting requires constant data refreshes since vendors change their numbers all the time. We’ve observed that the leveling period (time until data stops updating) for AdX is about 10 days, and for DFP it’s around 21 days.
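Those leveling periods imply that a recent report date cannot be fetched once and trusted; it has to be re-pulled until its window closes. A minimal sketch of that refresh logic, using the leveling periods mentioned above (the function and structure are illustrative, not a real API):

```python
from datetime import date, timedelta

# Leveling periods from the observations above: data for a given report
# date keeps updating for roughly this many days afterward.
LEVELING_DAYS = {"adx": 10, "dfp": 21}

def dates_to_refresh(source, today):
    """Report dates that may still change and should be re-pulled."""
    window = LEVELING_DAYS[source]
    return [today - timedelta(days=d) for d in range(1, window + 1)]
```

A daily pipeline built this way re-fetches the trailing 10 days for AdX and the trailing 21 days for DFP, so numbers in the consolidated repository converge to the vendors' final figures instead of freezing at their first, provisional values.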
In addition, vendors are constantly tweaking their platforms and reporting structures. Yes, they’re improving their technology, but that also means the data (and how it’s presented) continues to change. Multiply those changes across many vendors, and teams must constantly adapt in real time.
No business relationship is static. Publishers not only change the way they work with their third-party vendors, but they also change vendors all the time. These vendors are chosen for their revenue potential, rather than the ease of reporting or quality of their data. And the expectation from management is that these new vendors should integrate seamlessly with the existing setup.
The success of publishers depends on their willingness to face these challenges head-on with strategic vendor partners. As difficult as it can be to create a data quality assurance program, the alternative is worse. Without a data quality program that inspires confidence across the organization, users will waste time trying to fix the data themselves, and the organization won’t benefit from data-driven decisions. Poor data quality can cost organizations between 10 and 20% of their revenue.
The good news: just as there are real costs related to poor data quality, there are big gains for companies that strengthen processes for clean data.
While it’s possible to build a data quality program independently, this approach is expensive, time-consuming and fraught with false starts. Any data quality solution that fails to gain trust will result in people reverting to old methods and bad habits. And failed data quality programs make it harder to earn the organization’s trust in the future.
At Burt, we leverage ten years of experience partnering with publishers of all sizes to transform data into a strategic advantage. With intelligent technology and a nuanced understanding of the shifting industry landscape, Burt gives publishers an efficient and effective platform with clean and consolidated data. We integrate across your entire ad tech stack to enable smarter decisions while saving countless hours of manual, repetitive work.
Our solutions allow teams across the business to get self-service, immediate answers to important questions. Plus, the solution evolves at the speed of business, not at the speed of the IT department.