The digital economy has changed business forever. Twenty years ago, if you opened a new storefront, your competition did not have your price book and catalog instantly. Promotions were printed, deals were made in person, and transactions were largely private. In the travel industry, capturing competitive data might mean stationing someone at airport gates to count boarding passengers, which made it costly and laborious to gather, and even harder to act on. That has all changed in the digital economy, because now we serve our competitive information right to the bots, web scrapers, and crawlers.
If you stop feeding your web pages to malicious bots, scrapers, and crawlers, you can improve SEO results, protect your business model, reduce legal risk, and save on operational costs.
See if you can relate to some stories from the frontlines.
One eCommerce store saw product content for more than 50,000 SKUs scraped from its website and used to build a site full of counterfeit goods. Worse still, the search engine results page (SERP) ranked the new counterfeit site higher. How do you think the responsible manager feels, having unwittingly enabled fraud, lost customers, and decreased revenue? Imagine how the manufacturer feels about a distribution partner enabling a counterfeiting operation.
One travel site owner sits across the table from an affiliate “partner” who is scraping the site rather than using the partner API. By simulating an end user with a malicious bot, the partner avoids the API fees, yet still charges a real end user the very transaction fee it circumvented. The so-called partner is brazenly (1) taking extra margin, (2) endangering the travel site’s ability to maintain its business model with compliant partners who do use the API, and (3) owning the customer interface, which takes away cross-selling opportunities for incremental revenue.
As recently reported in the New York Times, Ticketmaster (a Live Nation company) is suffering from bots that automatically order tickets in huge volumes for all the premium seats. The bots then list those tickets on the secondary market at a premium and pocket the margin. Needless to say, that has a very real and negative effect on the consumer experience, and it hurts Ticketmaster’s business model and margins.
Bots are not real paying customers. It is that simple! Why feed them your content or your pricing data? Why let them transact through a site designed for humans? Bots will eat your business, chew up your margins, and lure your customers away if you keep feeding them your content and data.
Protect intellectual property (IP): branded images, content, and other assets
- Enablement of counterfeiters (use genuine artwork to sell knockoffs)
- Reduction in SEO value (less unique once assets are widely distributed)
- Violation of marketing agreements (breach for failure to protect IP, to meet minimum guarantees, to control distribution, etc.)
Protect your business model
- API circumvention (are bots scraping your API?)
- Loss of portal traffic for advertising and cross-selling (who owns the customer experience?)
- Loss of customer (where does customer transact?)
Reduce operational costs
- Lower bandwidth and server loads, since malicious bots typically account for 7% to 32% of web traffic on most sites, excluding “good” search bots
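To make the problem concrete, here is a minimal, illustrative sketch of a first-pass bot check a site might run before serving a page: flag requests whose User-Agent matches a known scraper signature, or whose source IP exceeds a rate limit inside a sliding window. The signature list, threshold, and function names are hypothetical; real bot management products rely on far richer signals (device fingerprints, behavioral analysis, reputation feeds) than this sketch shows.

```python
import time
from collections import defaultdict, deque

# Hypothetical deny-list and limits, for illustration only.
SCRAPER_SIGNATURES = ("python-requests", "scrapy", "curl", "wget")
MAX_REQUESTS = 20      # requests allowed per window per IP
WINDOW_SECONDS = 60    # sliding window length

_request_log = defaultdict(deque)  # ip -> timestamps of recent requests


def looks_like_bot(ip, user_agent, now=None):
    """Rough first-pass check: known scraper User-Agent strings, or more
    than MAX_REQUESTS from one IP inside the sliding window."""
    now = time.time() if now is None else now
    ua = user_agent.lower()
    if any(sig in ua for sig in SCRAPER_SIGNATURES):
        return True
    log = _request_log[ip]
    log.append(now)
    # Drop timestamps that have aged out of the window.
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()
    return len(log) > MAX_REQUESTS
```

A request that trips either check could be throttled, challenged, or served decoy content instead of real pricing data.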
Feeding the malicious bots, web scrapers, and crawlers your web pages is like writing blank checks for your competitors, counterfeiters, and other fraud operations. Don’t let bots run your business – regain control.
By Charlie Minesinger