Advanced Persistent "Bad Bots" are Rampant

March 17, 2017

In 2016, 40% of all web traffic originated from bots -- and half of that came from bad bots. A bot is simply a software application that runs automated tasks over the internet. Good bots are beneficial: they index web pages for search engines, monitor website health, and perform vulnerability scanning. Bad bots do bad things: they are used for content scraping, comment spamming, click fraud, DDoS attacks and more. And they are everywhere.
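As a minimal sketch of what "automated tasks over the internet" means in practice, the following hypothetical good bot identifies itself with an honest User-Agent header and checks a site's robots.txt before fetching. The bot name and any URLs are illustrative assumptions, not details from the report.

```python
# Sketch of a "good bot": an automated client that announces itself and
# honors robots.txt. The User-Agent string is a hypothetical example.
import urllib.robotparser
from urllib.parse import urlsplit
from urllib.request import Request, urlopen

USER_AGENT = "ExampleGoodBot/1.0"  # hypothetical bot name

def allowed_by_robots(url: str) -> bool:
    """Consult the target site's robots.txt -- good bots do this first."""
    parts = urlsplit(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(USER_AGENT, url)

def fetch(url: str) -> bytes:
    """Fetch a page while identifying the bot via its User-Agent header."""
    req = Request(url, headers={"User-Agent": USER_AGENT})
    with urlopen(req) as resp:
        return resp.read()
```

Bad bots invert both habits: they ignore robots.txt and spoof browser User-Agents to blend in with human traffic.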

Findings from Distil's 2017 Bad Bot Report (PDF), released Thursday, show that the problem is rising again after a brief improvement: bad bots represented 22.78% of all web traffic in 2014, dropped to 18.61% in 2015, and climbed back to 19.90% in 2016. These figures come from an analysis of hundreds of billions of bad bot requests, anonymized across thousands of domains.

Bad bots especially target websites with proprietary content and/or pricing information, a login section, web forms, and payment processing. In 2016, 97% of websites with proprietary content and/or pricing were hit by unwanted scraping; 90% of websites with a login page were hit by bad bots behind that page; and 31% of websites with forms were hit by spam bots.
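The crudest screen a targeted site can apply is User-Agent matching. Below is a minimal sketch assuming a hypothetical signature blocklist; since sophisticated bad bots spoof browser User-Agents, real detection products combine many more signals (behavioral analysis, fingerprinting, rate patterns).

```python
# Naive server-side bot screen based on User-Agent substrings.
# The signature list is a hypothetical example, not from Distil's report;
# this catches only the least sophisticated automated traffic.
KNOWN_BOT_SIGNATURES = ("curl", "wget", "python-requests", "scrapy")

def looks_automated(user_agent: str) -> bool:
    """Return True when the User-Agent matches a known automation tool."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

# Example: a scripted client versus a typical browser header.
print(looks_automated("python-requests/2.28.1"))                      # prints True
print(looks_automated("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))   # prints False
```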

 
