How Bots Degrade the Marketing Tool Ecosystem

June 1, 2016 Peter Zavlaris

Advanced persistent bots (APBs), capable of wreaking havoc on marketing tools, are the most common bad bot type, according to the 2016 Distil Networks Bad Bot Report. They possess a wide range of capabilities, including the ability to evade detection, impersonate humans, change identities, and vary attack patterns.

Marketing tools are built to track humans. Because APBs impersonate humans, they cannot be separated out; they simply blend in. They distort visitor tracking and reporting, rendering critical business marketing tools ineffective.

JavaScript and Tracking

During site visitor sessions, marketing tools use JavaScript to track activity, injecting analytics tags, tracking pixels, and cookies into users’ browsers. Any ad click or transaction is tracked and reported in this manner.

The following data typically gets routed to databases owned by the marketing vendor:

  • Visitor information – operating system, origin, new or returning visitor
  • User interactions – pageviews, time on page, clicks
  • Conversions – web forms filled out, content downloaded
  • Social media interactions – likes, retweets, reposts

These data are then parsed, analyzed, and usually fed into reporting tools.
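To make the mechanism concrete, here is a minimal sketch of what such a client-side tracking call can look like. This is an illustration only: the endpoint, cookie name, and query fields (collect.example-analytics.com, _visitor_id, cid, and so on) are placeholders, not any particular vendor’s real API.

```javascript
// Minimal sketch of a client-side tracking call (placeholder endpoint and
// field names, not a real vendor API). The snippet reads a visitor cookie,
// gathers basic visit data, and reports it via a 1x1 "tracking pixel".
(function () {
  var match = document.cookie.match(/_visitor_id=([^;]+)/);
  var payload = {
    cid: match ? match[1] : 'new-visitor',   // new or returning visitor
    page: location.pathname,                 // pageview being recorded
    referrer: document.referrer,             // where the visitor came from
    ua: navigator.userAgent                  // browser / operating system hints
  };
  // The query string carries the data back to the vendor's collection server.
  var pixel = new Image(1, 1);
  pixel.src = 'https://collect.example-analytics.com/pixel.gif?' +
    new URLSearchParams(payload).toString();
})();
```

Whether the snippet fires has nothing to do with whether the visitor is human; it fires for anything that executes the page’s JavaScript.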

APBs configured with tools like Selenium, PhantomJS, and SlimerJS load JavaScript by default. This triggers the tracking mechanisms listed above and skews metrics.
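For example, a PhantomJS-style bot script (PhantomJS scripts are themselves plain JavaScript) fires every analytics tag and tracking pixel on a page simply by loading it. The URL and user agent string below are placeholders; this is a sketch of the pattern, not any particular bot.

```javascript
// Sketch of a PhantomJS-style bot visit. Because the headless browser
// executes the page's JavaScript by default, the tracking snippets described
// above run exactly as they would for a human visitor.
var page = require('webpage').create();

// Impersonate a mainstream desktop browser (placeholder user agent string).
page.settings.userAgent =
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36';

page.open('https://www.example.com/products', function (status) {
  if (status === 'success') {
    // By this point every analytics tag on the page has already fired.
    var title = page.evaluate(function () {
      return document.title; // scrape whatever the operator is after
    });
    console.log('Scraped:', title);
  }
  phantom.exit();
});
```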

Skewed Performance and Analytics

All of these performance metrics are affected. Making matters worse, marketing teams have no easy way to manually check the quality of their site traffic, so gauging the impact bots have on performance metrics is guesswork.

Well-known marketing performance and analytics tools include:

  • Analytics and tracking – Google Analytics, MixPanel, Adobe Marketing Cloud, and Kissmetrics
  • Performance and attribution – Bizible, Adaptive Insights, and Beckon
  • Testing and optimization – Optimizely, ion interactive, and Webtrends

In considering the impact bots might be having on your marketing operations, ask yourself, “Which of these (or similar) marketing tools are in my stack, and what could it mean to my business if they’re inaccurate by 10% or more?”

What are the Consequences of Bots Distorting Your Data?

Here are several scenarios revealing how the presence of bots might be affecting your marketing tools:

What effect could skewed analytics have on your business?

Like most marketers, you probably review Google Analytics frequently. Perhaps you’ve noticed a recent spike in new users from Germany showing interest in your products. They’re spending more time on your site, viewing more pages, and checking out all of your products in detail. You can even tell from your heatmapping tool that they tend to view all page content, rather than just the content above the fold. Should you invest more in the German market, or create German-language assets? Perhaps, but not if the new users aren’t human.

How does skewed attribution data affect decisions?

Say you’ve been reviewing your lead attribution data to determine where to spend more of your marketing budget. It appears that people really like an asset promoted on your site. Unbeknownst to you, the “interest” is actually generated by a bot running rampant on your site. Oblivious to this, you decide to invest in creating similar assets. Would it matter if no humans ever clicked on them?

What could a skewed A/B test mean to your business?

While A/B testing different button colors on your website, you’ve determined that visitors prefer orange buttons to blue ones. APBs aren’t subject to the rules of human psychology and have no color preference. Including their choices in your dataset could lead to decisions that damage conversions for real humans, as the sketch below illustrates.
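A back-of-the-envelope sketch (the numbers are invented purely for illustration) shows how little bot traffic it takes to flip a test: a variant that humans actually convert on less often can still “win” once bot conversions are mixed in.

```javascript
// Illustrative A/B test skew (invented numbers). Humans convert on variant A
// at 5.0% and on variant B at 4.0%, but bots hammering variant B make B look
// like the winner in the raw data.
function conversionRate(humans, humanConversions, bots, botConversions) {
  return (humanConversions + botConversions) / (humans + bots);
}

var measuredA = conversionRate(10000, 500, 0, 0);      // 0.050 -> 5.0%
var measuredB = conversionRate(10000, 400, 2000, 300); // 0.058 -> 5.8%

console.log('A:', (measuredA * 100).toFixed(1) + '%',
            'B:', (measuredB * 100).toFixed(1) + '%');
// B appears to win, even though real humans preferred A.
```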

APBs and Advertisers

Advertisers need to provide accurate site performance information wherever customer ads are being served. False reporting costs them customers and partners, which in turn hurts their bottom line.

Advertisers rely on JavaScript to serve ads and redirect visitors to customer sites, so APBs impact their business model. As they crawl across websites, the bad bots click on ads, thereby driving up PPC costs.

On the flip side, advertisers are faced with artificially inflated session counts, high bounce rates, and few, if any, conversions. Bogus numbers can lead analysts to believe assets are underperforming, when in fact the publishing site has a bot problem.

How APBs Impact Publishers

Pay-per-click (PPC) breaks out into several categories:

  • Display & Native – AdRoll, AppNexus, Rubicon, DoubleClick
  • Video – YouTube, Wistia, Brightcove, Vimeo, Ooyala
  • Search – Google and Bing
  • Social – Facebook, Twitter, and LinkedIn

APBs impact publisher sites in numerous ways. Ad publishers typically have rich content, a favorite bot target, and APBs interact with the ads on these sites, too. They may attack at scale, consuming resources such as bandwidth or overloading web-serving infrastructure, causing pages to load slowly or the site to go offline.

Bot perpetrators running scams send traffic to publisher sites, creating the perception of boosted site performance. Since APBs can perform user-like actions, including browsing pages, varying time on page, and filling out forms, high click-through or conversion rates may result. However, these are mostly unusable leads or non-valuable traffic, so the publishing site’s reputation is damaged and ad revenue is lost. Learn more about how web bots affect your KPIs and your bottom line.

How much do these problems matter to your marketing operations?

The APB Problem is Getting Worse

According to data from the 2016 Distil Bad Bot Landscape Report, APBs now make up 88% of the bad bots visiting the average website. Additionally, 2015 saw a shift in the focus of bot operators, as the software they used to attack websites greatly improved.

The quantity of simple bots, which by definition are easier to contain, dropped by almost 11% and only accounted for 12% of all bad bot traffic in 2015.

The report also found that 53% of bad bots are able to load external resources like JavaScript, and so end up falsely attributed as humans by marketing tools.

[Chart: Percentage of bad bots able to load external assets like JavaScript]

StubHub Uses Distil Networks to Keep APBs Out of Its Marketing Tool Stack

After installing Distil Networks, StubHub’s page view count dropped by 50%, and the site’s conversion rate (buyers as a share of unique visitors) came back in line with expectations.
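As a rough illustration, with invented numbers rather than StubHub’s actual figures, filtering bots out of the visitor count is what moves the measured conversion rate back toward the true, human-only rate.

```javascript
// Invented numbers (not StubHub's actual figures) showing why bot traffic
// deflates a measured conversion rate. Here, conversion rate = buyers / unique visitors.
var buyers = 2000;
var humanVisitors = 50000;
var botVisitors = 50000; // bots inflate the visitor count but never buy

var measured = buyers / (humanVisitors + botVisitors); // 2.0% with bots counted
var actual = buyers / humanVisitors;                   // 4.0% once bots are filtered out

console.log('Measured:', (measured * 100).toFixed(1) + '%',
            'Actual:', (actual * 100).toFixed(1) + '%');
```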

"Like most ecommerce companies, we’re always looking at our conversion ratio. Our conversions were greatly deflated by bad bot traffic. Now that we’re filtering that out, we can see the real data and make more informed decisions,” Marty Boos, StubHub’s Senior Director of Technology Operations.

Learn how StubHub uses Distil Networks in this video testimonial with Marty Boos and Distil CEO Rami Essaid.
