Bots -- which some estimate make up about 46% of Web traffic -- scrape the Web for whatever content they are programmed to find, often mimicking human behavior. But what does that cost the company that develops the content, analyzes the numbers, and creates the information for consumers and customers in the hope that they will visit its Web site?
Original content and data are worth far more than they may appear, especially when it comes to search engines like Google, Bing and Yahoo.
The real estate and travel industries see the majority of bot activity, much of it driven by price comparison. This past year, real estate had the highest percentage of bad bots at 32%; from 2014 to 2015, the industry saw a 300% increase in bad bot activity as scrapers copied the price-comparison model pioneered in travel. Meanwhile, travel industry leaders found that 48% of their traffic in 2015 came from bad bots, according to Distil Networks, which published a study this week titled "The 2016 Economics of Web Scraping."
Aside from real estate and finance, industries such as digital publishing, e-commerce, directories and classifieds, and airlines and travel also took a big hit. I'm betting that as the demand for content rises, so will Web scraping. If a Web site contains content that drives revenue for a business, that business is at risk.