Net Neutrality and the Appellate Court: Reasons for Concern and Praise in FCC Defeat

10 04 2010

A recent ruling by the United States Court of Appeals for the District of Columbia regarding net neutrality standards carries far-reaching consequences for Internet users. The court stated in no uncertain terms that the Federal Communications Commission does not have the authority to regulate how Internet service providers (ISPs) restrict access to content carried over their networks. In essence, the court handed a victory to the big media companies in this country, insofar as they, as private industry players, can now decide what you and I view online.

If this potential power grab by media companies scares you, rest assured you are not alone. At first I was one of those who felt very uneasy about the effects this decision could have on my Internet surfing. However, the more I looked into the topic, the more I realized this issue has many, many sides to it.

It is likely a safe assumption that most people who have grown nervous over the Appellate Court's decision feel this way because of possible restrictions on online content. This is a valid concern. I also do not want one company, such as Comcast, Verizon, Time Warner or any other Internet provider, to decide what I can and cannot check out online in my home on my own time. Whenever the public's freedom to access information is restricted, there is a problem. Judging what I want to see, read or hear online should be my decision alone. If this recent court ruling erodes that ability, I will be one unhappy camper.

When reading about the Appellate Court's decision, I realized that there is a hypothetical elephant in the room that must be acknowledged. The elephant, or rather the issue, concerns what the FCC can regulate. If the FCC does not possess regulatory authority over ISPs, then who or what does? Where are the safety measures, the checks and balances, that are meant to secure the interests of the public at large? Sure, some claim that since ISPs are a private industry, self-regulation will ensure that the public is treated fairly when it comes to Internet access. However, we have seen numerous examples of corporate debacles stemming from deregulation in this country, and the serious ripple effects the public experiences when they occur. With the FCC now sidelined from regulatory activity on the Internet, that leads to my next point.

The justice system that exists today in the United States is overburdened in many ways. Political leaders, judges and citizens have cried foul over the length of time it can take for court cases to be decided in this country. If there is already a problem of case overload in our judicial system, and the FCC has now been ordered to stand down from regulating Internet providers, who then will have to step up to the plate and take on these regulatory duties? It would seem that by default, the United States' courts are set to become the go-to entity in regulating Internet accessibility, as if the courts did not have enough on their plates already! This is a particular concern because the added responsibility will only clog an already overburdened justice system. Additionally, factor in the increasingly partisan politics in our nation's capital: ideological differences among politicians have slowed the process of filling vacant federal judgeships to a crawl, yet another strain on the administration of justice across the country.

In conclusion, I must be fair in highlighting a major point of support I have for the Appellate Court's ruling against the FCC. From a strict business perspective, the ISPs are correct to claim that it is neither fair nor reasonable for bandwidth-hogging applications to be charged exactly the same fees as normal bandwidth users. If someone is running a program online that takes up a disproportionately large amount of bandwidth, to the point that it degrades the Internet experience for other users, that one user should be charged more. In simple terms, the concept is similar to the way income taxes are determined in the United States: the more income you make, the more you pay in taxes. Similarly, the more Internet bandwidth you use, the more you pay for that usage. Conceivably, ISPs could reinvest the extra revenue in higher-capacity cables so that the Internet experience stays consistent and reliable for all users.
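To make the tax analogy concrete, here is a minimal sketch of how a progressive, usage-based billing scheme might work, written in Python. The tier boundaries and per-gigabyte rates are entirely hypothetical, chosen only for illustration; no real ISP's pricing is implied.

```python
# Hypothetical tiered bandwidth billing: heavier users pay progressively
# more per gigabyte, analogous to marginal income-tax brackets.
# Tier sizes and rates below are illustrative assumptions, not real pricing.

TIERS = [
    (100, 0.10),           # first 100 GB billed at $0.10/GB
    (400, 0.15),           # next 400 GB billed at $0.15/GB
    (float("inf"), 0.25),  # everything beyond 500 GB at $0.25/GB
]

def monthly_charge(usage_gb: float) -> float:
    """Compute a progressive monthly charge for the given usage."""
    charge, remaining = 0.0, usage_gb
    for size, rate in TIERS:
        billed = min(remaining, size)  # only the portion inside this tier
        charge += billed * rate
        remaining -= billed
        if remaining <= 0:
            break
    return round(charge, 2)

print(monthly_charge(50))   # light user: 50 GB at the lowest rate -> 5.0
print(monthly_charge(800))  # heavy user pays a higher effective rate -> 145.0
```

As with marginal tax brackets, only the usage that falls inside a higher tier is billed at the higher rate, so a light user's bill is unaffected by the existence of the top tier.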

Stay tuned for future developments on net neutrality as the Appellate Court ruling is a beginning, not an end, to a significantly important issue facing the future of Internet use in the United States.

The premise behind Internet frameworks: open access or suffocating structure?

19 03 2010

When you search for material on the Internet, do you know how you come across the results that you view? Are there influences actively dictating what you view and what you do not have a chance to comprehend? In essence, these are some of the primary points focused on in Jonathan Zittrain’s book, The Future of the Internet.

Within this weighty yet interesting book, Zittrain attempts to dissect the nature of the Internet's DNA. Oftentimes the Internet is thought of as the ultimate example of pure openness, where anyone, anywhere can contribute anything for others to view, enjoy or revolt against. Specifically, Zittrain separates the means by which this occurs into two distinct camps: generative nature and proprietary means.

The wider-ranging of these two concepts is that of generative nature. Just as the name describes, “generativity considers how a system might grow or change over time as the uses of a technology by one group are shared with other individuals, thereby extending the generative platform” (78). The basic premise is that there is an active fostering of the possibility of unanticipated change through unfiltered contributions from broad and varied audiences.

Contrasting this approach is the proprietary model, built around the tethered appliance. In its most basic form, the appliance is either an Internet service provider or a physical product that locks users in, making said product the exclusive means of accessing what is desired. Freedom to roam, explore, contribute or change the basic dynamics is largely lacking. Innovation is the curse instead of the buzz within this framework.

Zittrain pairs both of these approaches with online existence as we know it currently and explores the issues associated with each one. A common, yet incredibly fundamental, component of both approaches is security. It does not take much effort to realize the potential security problems arising from vast, open electronic networks where anyone can contribute content regardless of intent. This is exactly how computer viruses gained prominence during the fledgling days of the Internet, as there was an unprotected network of interconnected computers completely willing to accept malicious code if it were introduced. This lack of concern for purpose was exactly the motivation of early Internet programmers, who could not have cared less about how the Internet was used. Instead, they only cared that the Internet was functional (28). Zittrain reflects on the risks of this approach: “The most salient feature of a PC is its openness to new functionality with minimal gatekeeping. This is also its greatest danger” (57).

The response to the vulnerabilities inherent in complete openness has been proprietary measures that provide greater control and allow only measured functionality. The upside to this approach is that greater security can result. However, the downside is limited access to content and perspective. In the online environment of the twenty-first century, this is an extremely serious constraint to contend with. Even if malicious actions can be reduced as tethered appliances spread through consumer markets, helping to ensure more stable performance, an entirely new risk emerges. Not only is content access controlled by a centralized authority, but “[t]hose who control the tethered appliances can control the behavior undertaken with the device in a number of ways: preemption, specific injunction, and surveillance” (107).

One thing is pretty clear: the Internet is here to stay. It is not going to disappear anytime soon. In turn, the Internet's continued usability requires a balancing act between open platforms and securely controlled access. Overextending in either direction will only lead to revolt and possible disruption. A balanced approach is needed, where security plays a distinct motivating role but open accessibility, determined by individual users, is weighted just as heavily. Zittrain suggests that a keen balance is necessary, where regulation from within, among users, can actively assuage security fears if given the chance to work (102-103). Furthering this approach, he points to the online encyclopedia Wikipedia as one example where users act as collective bastions protecting the platform while still maintaining open access for almost anyone to view or contribute. Although I personally believe it is too early to know with certainty whether a mainstream ethos of established boundaries is sweeping the Internet, I do think lessons can be learned and applied, to some degree, from Wikipedia. The question remains: when will we know, and will you be able to tell?