Congress Tackles the Internet

September 24, 2014 Jonathan Bailey


Note: Today’s column is a guest post from Plagiarism Today, a site targeted at Webmasters and copyright holders regarding the issue of plagiarism online. 

Over the past year, Congress has been paying increasing attention to the Internet and, more importantly, whether the laws that govern it are in need of updating.

In the Senate, the Subcommittee on Crime and Terrorism recently held a hearing about the threat posed by botnets and how best to address the issue. Distil was at the hearing and has a full recounting here.

In the House, copyright is the main focus as the House Judiciary Committee has held a series of hearings as part of its “comprehensive review” of copyright law. Those hearings, so far, have focused on a variety of topics ranging from music licensing to the notice and takedown provisions of the Digital Millennium Copyright Act (DMCA), which govern when and how hosts are required to remove allegedly infringing materials.

The copyright hearings will continue into 2015.

The reasons for the review are obvious. First, the laws that govern much of the Internet are already showing their age. The Computer Fraud and Abuse Act (CFAA), the law that governs many “hacking” and botnet-related crimes, was originally written in 1986. Though it has been amended several times since, the last amendment came in 2008 and dealt only with identity theft issues.

Likewise, the DMCA was written in 1998 (based on treaties signed in 1996) and it has not been significantly amended since. The DMCA itself was an amendment to the Copyright Act of 1976, which took effect in 1978, and remains largely intact to this day.

The other issue is that the threats legitimate users face online are evolving at a rapid pace. The Internet of 2014 doesn’t look like the Internet of 2008, much less the Internet of 1998. Botnets, for example, are larger, more powerful and more sophisticated than ever and they are being used for more advanced criminal enterprises.

An illustration of this is the Gameover Zeus botnet, which was brought down in June. It reached nearly one million computers and caused an estimated $100 million in financial harm. Its main purpose was to execute a “man-in-the-middle” attack that captured victims’ credentials when they logged in to seemingly secure sites. One heist alone netted nearly $7 million when a regional bank was hit.
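For context on the mechanics: in a man-in-the-middle attack, the victim believes they are communicating directly with their bank, while the attacker silently relays the traffic and reads or alters it in transit. Below is a minimal Python sketch of one generic countermeasure, certificate pinning, in which a client compares the TLS certificate a server actually presents against a known-good fingerprint. The pinned value here is a placeholder, and the sketch illustrates the general interception concept rather than Gameover Zeus’s specific technique, which ran on the victim’s own infected machine.

```python
import hashlib
import socket
import ssl

# Placeholder value for illustration only -- in practice this would be the
# known-good SHA-256 fingerprint of the bank's TLS certificate.
PINNED_FINGERPRINT = "0" * 64

def live_fingerprint(host: str, port: int = 443) -> str:
    """Connect to the server and return the SHA-256 hex digest of the
    certificate it actually presents."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

def possibly_intercepted(host: str) -> bool:
    """A certificate that does not match the pinned fingerprint is one signal
    that something between the client and the server is rewriting traffic."""
    return live_fingerprint(host) != PINNED_FINGERPRINT
```

A client that calls possibly_intercepted("bank.example.com") and gets back True should refuse to transmit credentials; the hostname here is, of course, hypothetical.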

On the copyright side, Google now receives one million takedown notices every single day, most of them over piracy. That is a 15x increase from just two years ago, and it represents a tremendous burden both on copyright holders, who have to locate, prepare, and file the notices, and on Google, which has to check them for validity and act on the links.

However, as Congress looks to make these changes, it faces a daunting challenge. It will likely take at least another year in both cases to finish the review of the existing laws, after which it will likely take months, if not another year, to propose any legislation based upon that review. After that, it could take another year or more to get that legislation passed.

As a result, any legislation that is passed will likely stem from a review of an Internet that looked very different from the one in which the new law takes effect. For example, if Congress had begun a review of a law in 2002, it likely wouldn’t have finished passing new legislation until 2005 or 2006, completely missing the birth of both Facebook and YouTube, which launched in 2004 and 2005 respectively.

The Internet can pivot completely in a day, while the legislative process is painfully slow, virtually ensuring that the laws that are passed will be out of date by the time they arrive. While Congress can try to be forward-thinking, even the best and brightest of the tech community have repeatedly failed to predict where the Web is headed next. To ask Congress to do better is unreasonable.

Instead, the best hope is that Congress will come to understand both the Internet and its own limitations in legislating it. Only then can it hope to craft legislation that is effective, fair and long-lasting. Otherwise, it will be stuck in a constant cycle of reevaluation, always one or two iterations behind where the Web actually is.

About the Author

Jonathan Bailey is the creator and author of PlagiarismToday.com. Plagiarism Today (PT) is a site targeted at webmasters and copyright holders regarding the issue of plagiarism online. Jonathan has been a writer, webmaster, graphic designer, IT guru, and GOAT (Geek of All Trades). He holds a BA in Journalism and Mass Communications (2002) from the University of South Carolina, where he graduated with honors.
