The NSA Should Have Known About and Prevented Snowden

February 10, 2014 Rami Essaid


As George Bush so eloquently said, “Fool me once, shame on — shame on you. Fool me — you can’t get fooled again”.

Well, that’s exactly what happened with Edward Snowden; the government got fooled again, and shame on them! Snowden used basic web crawlers, or bots, to harvest millions of files from the NSA, reminiscent of Chelsea Manning’s breach just a few years earlier.

In 2010, Manning leaked classified information to WikiLeaks on the order of hundreds of thousands of reports, cables, and classified videos. How did one person get hold of all of this sensitive data? She used a simple computer program to download the files. You see, Manning already had access to the files in question, but no one person could efficiently collect all those reports on their own. Instead, a bot using Manning’s credentials could harvest that information in no time at all. The bot in question, Wget, is one of the simplest bots around; it is nearly 20 years old, and its use could easily have been prevented.
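For a sense of how little machinery this takes, here is a minimal sketch in Python (not the tool either of them used) of what a basic recursive crawler does: fetch a page, collect its links, and keep going until nothing on the same host is left. The start URL is a placeholder; a tool like Wget automates exactly this kind of loop, plus authentication and saving files to disk.

```python
# A minimal sketch of what a basic recursive crawler does -- roughly the
# loop a tool like Wget automates. The start URL is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=100):
    """Breadth-first fetch of same-host pages reachable from start_url."""
    seen, queue, pages = set(), [start_url], {}
    host = urlparse(start_url).netloc
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        req = Request(url, headers={"User-Agent": "simple-crawler"})
        try:
            body = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        pages[url] = body                          # "harvest" the page
        parser = LinkExtractor()
        parser.feed(body)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == host:  # stay on the same host
                queue.append(absolute)
    return pages


if __name__ == "__main__":
    downloaded = crawl("http://example.com/")
    print(f"fetched {len(downloaded)} pages")
```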

Lesson learned, right? Nope. Fast-forward three years, and Edward Snowden has now been revealed to have used similar techniques. The software used by both Snowden and Manning was “hardly sophisticated and should have been easily detected”. The United States’ intelligence and defense agencies, the organizations charged with protecting our country and its secrets, were fooled TWICE by such a simple program? If I saw this as an episode of the TV series “Homeland” I would laugh and call it an implausible plot hole. To see this in real life is beyond laughable; it’s scary.

Officials with direct knowledge of the incident could not explain why a web crawler was allowed to index and search through so many classified NSA documents, considering that “web crawlers are almost never used on the N.S.A.’s internal systems”. Why were there no technological checks in place to identify bots on an internal network where they have no business being? While stopping bots isn’t trivial, this is a problem that has been worked on and addressed by many companies, including Distil Networks.
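One such check could be as simple as scanning access logs for clients that announce themselves as download tools (Wget, for instance, sends a “Wget/version” User-Agent by default). The log path and format below are assumptions made for the sketch, not anything reported about the NSA’s systems.

```python
# An illustrative check of the kind the paragraph above asks about: scan
# web-server access logs for clients whose User-Agent matches a well-known
# download tool. The log path and combined-log-format assumption are
# placeholders for the sketch.
import re
from collections import Counter

BOT_SIGNATURES = re.compile(r"wget|curl|python-requests|scrapy", re.IGNORECASE)


def suspicious_clients(log_lines):
    """Count requests per client IP whose User-Agent looks like a bot.

    Assumes combined log format: the client IP is the first whitespace
    field and the User-Agent is the last double-quoted field.
    """
    hits = Counter()
    for line in log_lines:
        fields = line.split('"')
        if len(fields) < 6:          # line lacks a quoted User-Agent field
            continue
        user_agent = fields[-2]
        if BOT_SIGNATURES.search(user_agent):
            client_ip = line.split()[0]
            hits[client_ip] += 1
    return hits


if __name__ == "__main__":
    with open("/var/log/httpd/access_log") as f:   # placeholder path
        for ip, count in suspicious_clients(f).most_common(10):
            print(f"{ip}: {count} bot-like requests")
```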

Not every bot is created equal. Some sophisticated bots bury themselves as malware deep inside the host computer and emulate real user behavior. These types of bots require sophisticated tests to look for embedded code. Often, new zero-day bots can only be detected with advanced behavioral models that find anomalies relative to normal usage.
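As a toy illustration of that behavioral idea (not any vendor’s actual model), the sketch below flags clients whose request rate is a statistical outlier against the rest of the population; the z-score threshold and the choice of request rate as the only feature are assumptions.

```python
# A toy illustration of behavioral anomaly detection: flag clients whose
# request rate deviates sharply from the population baseline. The threshold
# and single feature are illustrative assumptions.
from statistics import mean, stdev


def flag_anomalous_clients(requests_per_minute, z_threshold=3.0):
    """requests_per_minute: dict mapping client id -> observed request rate."""
    rates = list(requests_per_minute.values())
    if len(rates) < 2:
        return []
    mu, sigma = mean(rates), stdev(rates)
    if sigma == 0:
        return []
    return [
        client for client, rate in requests_per_minute.items()
        if (rate - mu) / sigma > z_threshold
    ]


if __name__ == "__main__":
    observed = {"alice": 4, "bob": 6, "carol": 5, "wget-session": 900}
    print(flag_anomalous_clients(observed))  # -> ['wget-session']
```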

Snowden and Manning, however, used the most basic bots: bots that we catch 100% of the time with ease. Catching these bots doesn’t require data scientists writing complicated machine-learning algorithms. Instead, a simple cookie or JavaScript validation would have easily detected them and alerted someone that something was amiss. That’s not to say the NSA should be using Distil, but they should be doing something! The two worst data breaches in our nation’s history could potentially have been prevented; bots are a serious security threat, and something has to be done.
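Here is a minimal sketch of what such a cookie/JavaScript validation might look like, written as a hypothetical Flask app; the cookie name and token value are illustrative, and a real deployment would use signed, per-session values. A plain HTTP client like Wget never executes the script, never picks up the cookie, and so trips the check on every request.

```python
# A minimal sketch of a cookie/JavaScript validation, as a hypothetical
# Flask app. The cookie name and token are illustrative assumptions; a
# real deployment would sign and rotate them per session.
from flask import Flask, request, make_response

app = Flask(__name__)
CHALLENGE_COOKIE = "js_check"
EXPECTED_TOKEN = "humanlike"   # placeholder; use a signed, per-session value

# Tiny page whose script sets the cookie and reloads. Plain HTTP clients
# such as Wget never execute the script, so they never obtain the cookie.
CHALLENGE_PAGE = """<html><body><script>
document.cookie = "js_check=humanlike; path=/";
location.reload();
</script></body></html>"""


@app.route("/docs/<path:name>")
def protected_document(name):
    if request.cookies.get(CHALLENGE_COOKIE) != EXPECTED_TOKEN:
        # No valid cookie: serve the JavaScript challenge instead of content
        # and log the miss so repeated failures can raise an alert.
        app.logger.warning("possible bot: %s requested %s without cookie",
                           request.remote_addr, name)
        return make_response(CHALLENGE_PAGE, 403)
    return f"contents of {name}"   # a real handler would serve the document


if __name__ == "__main__":
    app.run()
```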

In an ecosystem of secrecy and confidentiality, introducing checks and accountability may feel counter to operational realities. Is it too much for the agencies to address the technical risks inherent in the status quo? I hope not.

In either case, every journalist I’ve read has echoed one question: “Why did this happen?” Until the people in charge pull their heads out of the sand, the question we should instead be asking is “What agency will this happen to next?”

About the Author

Rami Essaid

Rami Essaid is the CEO and Co-founder of Distil Networks, the first easy and accurate way to identify and police malicious website traffic, blocking 99.9% of bad bots without impacting legitimate users. With over 12 years in telecommunications, network security, and cloud infrastructure management, Rami continues to advise enterprise companies around the world, helping them embrace the cloud to improve their scalability and reliability while maintaining a high level of security.
