Distil Networks has been working with the 614 Group, a digital advertising infrastructure consultancy that provides education and resources to the digital publishing industry, to pin down the true scale of the non-human traffic (NHT) problem, otherwise known as bots. At the group’s Brand Safety Summit in New York on November 17 this year, Distil’s Rami Essaid and I presented the current status of the fight against NHT.
A quick summary of the presentation follows. If you want to watch a full recording of the event, please see below.
A $7 billion problem
Business economics dictate that the burden of proof for understanding and dealing with the NHT problem is on publishers, but 82% admitted during the initial phase of the research that they have no insight into what’s going on – or how to stop it. That’s a pretty big problem when you’re talking about $7 billion in lost revenues across the industry – especially when individual publishers have no way to calculate how much of that loss accrues to their business and their campaigns, because they have no way to measure it.
Impact on publishers
Distil started out on the advertiser side of the equation, where fraudulent bot behavior was more easily defined. When each bot is responsible for hundreds, even thousands of ten-cent clicks or fraudulent e-commerce transactions, the money adds up fast. When advertisers are walking away from publishers because they can’t trust the analytics, the industry needs to get serious about the problem from the other side.
Bots are creating uncontrolled synthetic audiences that are being sold elsewhere for a lower rate, leaving the publisher with the bill but not the revenue. It’s reached the point that agencies are capping the tolerable level of NHT as low as 3-5%. Clearly, explaining away that redirected traffic after the fact is no longer cutting it with advertisers or agencies – it’s time to stop relying on audits and turn instead to blocking ads from being served into fraudulent environments.
To effectively block bots from siphoning off legitimate audiences and revenues, you have to go beyond cookies and IP addresses to track the device and the browser. To see how accurate we could get in tracking NHT, we ran a study using these different methodologies, which produced the following results:
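To give a feel for what “going beyond cookies and IP addresses” means in practice, here is a minimal sketch of layering those signals into a suspicion score. All names, thresholds, and fingerprint fields below are illustrative assumptions, not Distil’s actual detection logic:

```python
# Hypothetical sketch of layered NHT detection signals.
# IP list, fingerprint fields, and threshold are illustrative only.

KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # e.g. datacenter ranges

def nht_score(request):
    """Return a 0-3 suspicion score by layering detection methods."""
    score = 0
    # 1. Cookie check: many bots discard or never set cookies.
    if not request.get("cookie"):
        score += 1
    # 2. IP reputation: traffic from known datacenter/proxy addresses.
    if request.get("ip") in KNOWN_BAD_IPS:
        score += 1
    # 3. Device/browser fingerprint: headless browsers leak telltale
    #    properties (e.g. a webdriver flag set, zero browser plugins).
    fp = request.get("fingerprint", {})
    if fp.get("webdriver") or fp.get("plugins", 0) == 0:
        score += 1
    return score

def is_bot(request, threshold=2):
    """Flag a request as NHT when enough independent signals agree."""
    return nht_score(request) >= threshold
```

The point of layering is that each signal alone is easy for a bot to fake, but faking all of them at once is much harder – which is why fingerprint-based methods catch traffic that cookie or IP checks alone miss.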
It is that insight into, and blocking of, that final 16% that we’re focused on in this study.
Auditing tools are built for the entire advertising ecosystem. Publishers need something that will give them back control over the traffic, enable them to block or allow it, and provide the transparency advertisers demand – for every single request, every single cookie, and every single user. And that’s what Distil provides: a reverse proxy to block traffic that doesn’t pass the NHT sniff test. Because the page is never served to that traffic, the bot can never synthesize the audience, skew the analytics, or steal clicks.
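The key architectural idea is that the proxy sits in front of the origin server and short-circuits suspect requests before any page (or ad) is rendered. Here is a toy model of that flow – a sketch under stated assumptions, not Distil’s implementation; the `looks_like_bot` check is a deliberately crude stand-in for real fingerprinting:

```python
# Toy reverse-proxy sketch: block suspect traffic before the page is
# ever served, so bots cannot skew analytics or steal clicks.
# Illustrative model only; the bot check is a crude placeholder.

def looks_like_bot(headers):
    """Stand-in for real fingerprinting: flag requests that lack a
    cookie and present no browser-like User-Agent string."""
    no_cookie = not headers.get("Cookie")
    non_browser_ua = "Mozilla" not in headers.get("User-Agent", "")
    return no_cookie and non_browser_ua

def reverse_proxy(request, origin_fetch):
    """Forward vetted traffic to the origin; short-circuit bots with
    a 403 so the page and its ads are never served to them."""
    if looks_like_bot(request["headers"]):
        return {"status": 403, "body": "Forbidden"}
    return origin_fetch(request)

# Example usage with a stubbed-out origin server:
origin = lambda req: {"status": 200, "body": "<html>page + ads</html>"}
human = {"headers": {"Cookie": "sid=42", "User-Agent": "Mozilla/5.0"}}
bot = {"headers": {"User-Agent": "python-requests/2.31"}}
```

Because the decision happens at the proxy, the origin server and its analytics never see the blocked request at all – which is the difference between blocking and after-the-fact auditing.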
Join the research!
Want to learn more about how you could be charging a premium for bot-free inventory and protecting end users against retargeting campaigns? Join us in our research by dropping me an email at firstname.lastname@example.org – the more traffic we can analyze, the more precisely we can identify and block NHT from everyone’s sites and restore a level playing field in the digital advertising business.
About the Author: Charlie Minesinger