Aberdeen Automated Attacks on Your Website

Issue link: https://resources.distilnetworks.com/i/734487


www.aberdeen.com

…firewalls, and advanced bot detection and mitigation solutions); and the annualized total cost of detecting and mitigating bad bots under each respective approach.

Quantifying the Risk of Bad Bots, Across All Websites

In an aggregate analysis of all website traffic across all industries, based on an annual contribution of $100M in revenue and a total of 100K to 1M data records that could potentially be compromised, Aberdeen's Monte Carlo analysis yields the following insight about the risk of bad bots:

• As a percentage of annual website revenue, the median annualized risk of bad bots under the status quo (manual blocking) is estimated to be about 4.2%, with an 80% confidence interval between 1.6% and 7.9%.

Quantifying the Risk of Bad Bots, in Selected Website Categories

Because the percentage of web traffic attributable to bad bots varies by industry, Aberdeen also leveraged the findings from the 2016 BBLR to estimate the risk of bad bots for the following website categories:

• In Airlines and Travel — for which Southwest, American, Delta, United, British Airways, TripAdvisor, Expedia, Priceline, Travelocity, Kayak, and Orbitz are illustrative examples — the 2016 BBLR found that bad bots represented 7.0% of all website traffic.

• In Digital Publishing — as exemplified by Yahoo, MSN, ESPN, and CNN — the 2016 BBLR found that bad bots represented 31.3% of all website traffic.

• In Directories and Classifieds — for which Google, Bing, Yahoo, Ask, and Yelp are illustrative examples — the 2016 BBLR found that bad bots represented 16.4% of all website traffic.

Learn More About Bad Bots

• For technical details and trends on the origin, behavior, capabilities, and evasion techniques exhibited by bad bots, readers should refer to the empirical data available in publications such as the Distil Networks 2016 Bad Bots Landscape Report.
• In 2015, the Open Web Application Security Project (OWASP) community initiated the OWASP Automated Threats to Web Applications Project, which brings together research and analysis of automated attacks against web applications (i.e., bad bots) to help website operators defend against these threats.

• The OWASP Automated Threat Handbook for Web Applications proposes a common language for automated threats, designed to facilitate better communication among the many stakeholders looking to address these issues.
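The Monte Carlo analysis summarized above — a median annualized risk with an 80% confidence interval, expressed as a percentage of annual revenue — can be sketched in a few lines of Python. Note that Aberdeen's actual model inputs are not published in this excerpt; the loss-frequency and severity distributions below are hypothetical placeholders chosen only to illustrate the method (simulate many trial years, then read the median and the 10th/90th percentiles off the sorted results).

```python
import random
import statistics

def simulate_bot_risk(n_trials=10_000, seed=42):
    """Illustrative Monte Carlo sketch of annualized bad-bot risk.

    Returns the median and 80% confidence interval of simulated
    annual loss, as a percentage of annual website revenue. The
    distribution parameters are hypothetical, not Aberdeen's.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        # Hypothetical: number of damaging bot incidents in a year
        incidents = rng.lognormvariate(1.0, 0.8)
        # Hypothetical: loss per incident as a fraction of revenue
        severity = rng.lognormvariate(-4.5, 1.0)
        # Cap a trial year's loss at 100% of revenue
        losses.append(min(incidents * severity, 1.0))
    losses.sort()
    return {
        "median_pct": 100 * statistics.median(losses),
        "ci80_low_pct": 100 * losses[int(0.10 * n_trials)],
        "ci80_high_pct": 100 * losses[int(0.90 * n_trials)],
    }
```

With real frequency and severity inputs in place of the placeholders, the same three summary statistics would correspond to the 4.2% median and 1.6%–7.9% interval reported in the paper.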
