Information Is Power: Investing in Your Analytics

July 17, 2014 Ron Abisi

Better Business Decisions from Your Server Log Analytics Through Bot Protection


For enterprises, perhaps the most important input for deciding where to invest in infrastructure, websites, and advertising is server log data analysis. This data is core to the business and dictates budgets across departments. Businesses spend millions of dollars on log analysis tools, digital ad campaigns, servers, bandwidth, web development, and engineering resources.

Understanding how to allocate that budget is becoming more and more difficult with the alarming increase of bad bots on the internet, which carry out web scraping, botnet attacks, data aggregation, click fraud schemes, and hacking attempts, to name a few. Good bot traffic, deployed by SEO and social giants such as Google, Yahoo, Microsoft, Facebook, and Twitter, is essential to a successful website. Distinguishing human traffic from good bots and bad bots is therefore vital to intelligent decision-making at every company. To interpret log data, companies rely on a host of analytics tools, such as Splunk™, "the leading software platform for real-time Operational Intelligence." Tools like Splunk™ are great for interpreting, summarizing, and reporting machine data. Unfortunately, they lack true bot vs. human detection, and they offer little insight into where your traffic is really coming from and what visitors are looking at on your site.
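To make the problem concrete, here is a minimal, deliberately naive Python sketch of the kind of user-agent-based classification most log tools stop at. It assumes a combined-format access log at `access.log`; the token lists and labels are illustrative only, not Distil's detection logic, and the sketch's weakness is the article's point: a user agent is a claim, not an identity.

```python
import re
from collections import Counter

# Combined Log Format: ip, identd, user, [time], "request", status, bytes, "referer", "user-agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Naive, user-agent-based labels. Anyone can forge these strings,
# which is exactly why UA matching alone is not real bot detection.
GOOD_BOT_TOKENS = ("Googlebot", "bingbot", "Slurp", "facebookexternalhit", "Twitterbot")

def classify(agent: str) -> str:
    if any(token in agent for token in GOOD_BOT_TOKENS):
        return "claims-good-bot"   # a claimed identity only; verify separately
    if any(word in agent.lower() for word in ("bot", "crawler", "spider")):
        return "self-declared-bot"
    return "presumed-human"       # or a bad bot lying about being a browser

def summarize(log_path: str) -> Counter:
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            match = LOG_PATTERN.match(line)
            if match:
                counts[classify(match.group("agent"))] += 1
    return counts

if __name__ == "__main__":
    print(summarize("access.log"))
```

Note the "presumed-human" bucket: everything this approach cannot name gets counted as human, which is precisely how bad bots end up inflating the metrics that drive budget decisions.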

The best solutions provide your business with true, real-time bot detection and mitigation, plus the ability to integrate that data into your log analysis tools. While bots and humans should be looked at differently, it is important to apply the same logical analysis of who, what, when, and where to each before making critical business decisions about budget allocations. What exactly does this mean? It means that with the right bot detection and remediation tool you can identify:

  • Who is sending the bots to your site - Is it a competitor, an aggregator, or a search engine deploying them? With this information, you can decide whether to block these bad bots from infesting your resources and log data or simply monitor them and react as needed.
  • When a bad bot is masquerading as a good bot – Most bot solutions cannot catch bad bots masquerading as good bots because they rely on signals like IP address recognition, and anyone can modify their browser's user agent to impersonate a good bot like Googlebot. For this reason, if you value the content on your site and wish to protect your business, it is crucial to verify the true identity of a bot like Googlebot in real time (see the verification sketch after this list). Knowing who is attempting to compromise your information also helps your organization from a legal standpoint, should action be needed.
  • What type of content bots are targeting on your site - Recognizing this will help engineers understand how to optimize the site better. Are bots selecting certain inventory more than others? All of this data helps the organization determine which digital campaigns and agencies outperform others.
  • The times of day your site is most frequently impacted by bots – Once you can see which traffic is human, you understand where and when humans are genuinely interacting with your site. This is very important to your marketing team's efforts in targeting campaigns by time and geolocation (see the hourly breakdown sketch after this list). Perhaps your engineering team relies on a hosted solution, and between 5:00pm and 8:00pm ET bots hit your infrastructure the hardest. Instead of spinning up additional resources, strong bot detection lets your team mitigate bots before they overrun your system and slow application servers to the point where the site becomes unusable for legitimate customers. You may find that 10% of your traffic comes from the UK, but what if 80% of those requests are bad bots? Would you allocate more of your marketing budget to that region if you knew it would be wasted on bots?
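On the masquerading point: Google itself documents a reverse-then-forward DNS check for verifying Googlebot, and the same pattern applies to other major crawlers. A minimal Python sketch of that check follows; the sample IPs are illustrative, and a production system would cache results and handle DNS timeouts rather than block per request.

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP using Google's documented
    reverse-then-forward DNS check."""
    try:
        # Step 1: reverse DNS - the PTR record must sit under Google's domains.
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Step 2: forward DNS - the hostname must resolve back to the same IP,
        # otherwise the PTR record itself could be spoofed.
        _, _, forward_ips = socket.gethostbyname_ex(hostname)
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        # No PTR record or failed lookup - not a verifiable Googlebot.
        return False

# Illustrative checks (results depend on live DNS):
print(is_verified_googlebot("66.249.66.1"))   # within Google's crawler range
print(is_verified_googlebot("203.0.113.9"))   # TEST-NET address, should fail
```

A scraper can copy Googlebot's user agent string in seconds, but it cannot make Google's DNS infrastructure vouch for its IP address, which is what makes this check meaningful.

For the time-of-day analysis, a short extension of the earlier parsing sketch shows the idea: bucket requests by hour and split each bucket by label, so the 5:00pm-8:00pm bot surge described above becomes visible at a glance. This assumes `LOG_PATTERN` and `classify` from the first sketch are in scope.

```python
from collections import Counter

def hourly_breakdown(log_path: str) -> dict:
    """Count requests per hour of day, split by the naive labels above,
    to see when bots (claimed or self-declared) hit hardest."""
    hours: dict[int, Counter] = {}
    with open(log_path) as f:
        for line in f:
            match = LOG_PATTERN.match(line)
            if not match:
                continue
            # Combined-format timestamps look like "17/Jul/2014:17:03:21 -0400";
            # the hour is the field after the first colon.
            hour = int(match.group("time").split(":")[1])
            hours.setdefault(hour, Counter())[classify(match.group("agent"))] += 1
    return hours

for hour, counts in sorted(hourly_breakdown("access.log").items()):
    print(f"{hour:02d}:00  {dict(counts)}")
```

The same grouping works for geolocation instead of hour, which is how you would surface the "80% of UK requests are bad bots" scenario before spending marketing budget there.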

All of these data points allow your teams to draw more informed conclusions about how humans, good bots, and bad bots interact with your website. Looking at a single vector provides only limited intelligence, and without a real-time, inline, human vs. bot detection solution, any data will be misrepresented and misinterpreted. Distil Networks has worked hard to build bot detection that provides all of this information, and in business this information powers and guides the most important decisions your company faces day to day.

For more information on how comprehensive bot protection can empower your company, please contact us for a brief discovery call to learn if Distil Networks is the right partner for your organization. 

About the Author

Ron Abisi

Ron Abisi is the VP of Sales focused on direct sales for enterprises and high-growth Internet properties. He has over 15 years of sales leadership and technology experience. Prior to Distil, he worked for Dyn managing some of the largest and most well-known Fortune 500 and Alexa top 500 accounts.
