Why do we accept double standards? How can they persist in a world of transparency and pervasive technology? It is startling to compare what is acceptable online with what is acceptable offline. Sometimes it is a useful exercise to take our online world and consider what the analogous brick-and-mortar equivalents would be.
If I leave the front door of my house open by accident, have I waived my property rights to all my belongings? If someone walking by sees a dining set or some furniture they like, can they take it? What about my TV, iPod, etc.? No, it is still mine. Please don't try it.
What if I pay for 10,000 mailers to be delivered to a particular area, and the mailers are simply not delivered, are only partially delivered (and too late), or are delivered somewhere other than the area I paid for? Clearly we would all agree I should get my money back (or someone put one over on me!).
So what about my website? Can any visitor simply violate my terms of service and scrape my content? Copyright law still protects my IP, and I have terms of service, yet the legal picture is somehow murkier: because my digital property is easily copied and redistributed, it ends up less protected in practice. It seems unfair that the burden now falls on content owners to defend their own property rights.
The big issue online is that bots (automated user agents) are rampant. Automated agents are clicking on ads, stealing price information, and stealing content. We do not have life-size robots scanning and copying newspapers, subscribing to magazines to view the ads, or browsing shoe stores. So why would I pay for ads in magazines read by bots? Why would I pay for bot traffic? Why pay for bots clicking on ads? Especially when the technology to detect and manage bots is here today, and it costs pennies in CPM to ensure your traffic is bot-free.
Anti-bot services for websites will become as common as firewalls and antivirus software. It is inevitable! Everyone will have to know which of their visitors are humans and which are bots.
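To make "knowing which visitors are bots" concrete, here is a minimal sketch (in Python; the marker list and function name are invented for illustration) of the crudest form of bot identification: inspecting the HTTP User-Agent header. Real anti-bot services of the kind described here rely on far richer signals, such as behavioral analysis, IP reputation, and browser challenges, since malicious bots routinely forge their User-Agent.

```python
# Naive bot identification by User-Agent string.
# Illustrative only: well-behaved crawlers identify themselves this way,
# but malicious bots can send any User-Agent they like.

KNOWN_BOT_MARKERS = ["bot", "crawler", "spider", "scraper", "curl", "wget"]

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known bot marker."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

print(looks_like_bot("Googlebot/2.1 (+http://www.google.com/bot.html)"))
print(looks_like_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
```

The point of the sketch is the gap it exposes: this check catches only honest, self-declared crawlers, which is precisely why dedicated anti-bot services exist.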
So digital publishers face what looks like a choice, but is really a false dilemma: whether or not to guarantee that their traffic is 99.9% bot-free.
The good news is that the evidence is in: 99.9% bot-free traffic is better for publishers. Distil will share data in an upcoming case study providing detailed evidence that pure traffic is more valuable. Moreover, publishers who are 99.9% bot-free will be rewarded with better results on every metric of online marketing performance. Bold claims? No, just more big data.
About the Author
Charlie Minesinger is the Director of Sales at Distil Networks, focused on strategic accounts and channels. Charlie has over a decade of experience selling into enterprises and brand accounts, including Sprint, Disney, Nortel, Amazon, Cisco, AIG, and Mattel. He brings with him experience in start-ups and in selling into new markets.