
Neutrality or Bust
Access to fast, affordable and open broadband, for users and developers alike, is, I believe, the single most important driver of innovation in our business. The FCC will likely vote next week on a framework for net neutrality; we got aspects of this wrong ten years ago, and we can't afford to be wrong again. For the reasons I outline below, we are at an important juncture in the evolution of how we connect to the Internet and how services are delivered on top of the platform. The lack of basic "rules of the road" for what network providers and others can and can't do is starting to hamper innovation and growth. The proposals aren't perfect, but now is the time for the FCC to act.
1. The Internet, and how we build things on the network, is undergoing meaningful change as we transition to broadband and wireless access.
Network providers are making significant capital commitments that will shape access to networks in the coming years. Despite this, the US lags in both broadband and wireless connectivity. Only 65% of American households have broadband access, compared to 90% of households in South Korea. And not all access is created equal: a study from earlier this year puts the US in 18th place, with an average of 3.8Mbps downstream compared to an average of 14.6Mbps in South Korea. The US ranks just 22nd in downstream broadband speed, behind Latvia and the Czech Republic. The story is the same on a price-per-megabit basis: in the US we pay $40 per month for an average of 3.9Mbps, while $45 per month in France buys a 20-30Mbps connection, with VoIP (Voice over Internet Protocol) service, HDTV and a DVR thrown in.
As I said at the start, access to fast, affordable broadband for users and developers is, I believe, the single most important driver of innovation in our market. We got this wrong ten years ago-we don't have a competitive market for broadband today, access is inconsistent, prices are high and speeds are often anemic-and we can't afford to be wrong again. The structural separation approach that the Europeans took a decade ago yielded cheap, fast access in their market, and I believe that access has been the most significant factor in the advancement of Internet innovation in Europe. Yet the European approach is now reaching its limits. The transition to wireless Internet access presents an opportunity, and as the network becomes more diverse, the need for common technical standards becomes essential. An uneven experience across platforms will fragment innovation and strengthen gatekeepers' ability to tax applications. Couple this with the embedded conflicts of interest in the delivery of video over DOCSIS or wireless vs. over-the-top IPTV, and you get a sense of the network complexities at hand. As Chairman Genachowski pointed out, we need "rules of the road", and now is the time to act.
Non-discriminatory pricing of bits and the clear definition of the layers that make up the Internet stack are two of the key architectural foundations of the network. The fact that bits containing applications, images, text or videos are handled in the same manner is central to how the Internet works. Network providers can shape or manage traffic on an aggregate, best-effort basis, but singling out a specific application, or any content within an application or page, will change the way the network is used. Specifically, it will hamper innovation by end users, whether they are individuals, developers, or new or existing companies. Similarly, the layers are building blocks that are vital to how we develop and build Internet companies. This goes back to seminal pieces of Internet literature like "The Rise of the Stupid Network." I agree that tightly coupled systems can at times provide a more efficient means to drive end-to-end innovation when you know precisely what you want to build. Yet I fundamentally believe that the essence of innovation is that you don't usually know exactly what you want to build.
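To make the distinction concrete, here is a minimal, illustrative sketch in Python of the difference between aggregate best-effort forwarding and application-specific discrimination. The packet representation and queue classes are my own invention for illustration; nothing here comes from the FCC proposals or any real router implementation.

```python
from collections import deque

class BestEffortQueue:
    """Application-agnostic: packets are forwarded in arrival order,
    regardless of whether the payload is video, text or an image."""
    def __init__(self):
        self.buffer = deque()

    def enqueue(self, packet):
        self.buffer.append(packet)  # no inspection of packet contents

    def dequeue(self):
        return self.buffer.popleft() if self.buffer else None


class DiscriminatoryQueue(BestEffortQueue):
    """Application-specific: inspects each packet and deprioritizes a
    targeted application -- the behavior the rules would bar."""
    def __init__(self, throttled_app):
        super().__init__()
        self.throttled_app = throttled_app
        self.slow_lane = deque()

    def enqueue(self, packet):
        # deep packet inspection decides which lane a packet gets
        if packet.get("app") == self.throttled_app:
            self.slow_lane.append(packet)
        else:
            self.buffer.append(packet)

    def dequeue(self):
        # the slow lane is served only when the fast lane is empty
        if self.buffer:
            return self.buffer.popleft()
        return self.slow_lane.popleft() if self.slow_lane else None
```

The first queue treats a video packet and a text packet identically; the second inspects payloads and pushes a disfavored application into a slow lane. That inspection step is exactly what changes the way the network is used.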
Innovators aim to solve problems-they start in one place and then they iterate. All too often real innovation is simply stumbled upon. Ideas and companies evolve as their founders come to better understand the problem they are seeking to solve. The Internet has demonstrated time and time again that loosely coupled systems and edge-based innovation are what drive the kind of massive change we have seen over the past two decades. This freedom to create "on the edge," and to evolve ideas, is what gets me up in the morning and keeps me up late at night.
Like all good architecture, these structural principles are remarkably resilient to change and scale. There have been continual challenges to them over the past few decades, but that has all been part of the persistent tension that exists in a network between centralization and decentralization. Today, given our transition to wireless and broadband access, the challenges are more fundamental, as network providers attempt to change these building blocks as preconditions to future investment. The conflation of access with control over the open Internet stack is wrong.
The edge-based innovation I speak of is predicated on access to a handful of things, and the persistent tension between centralization and decentralization is the hallmark of a healthy network, evident in debates all the way back to Napster, CompuServe and AOL and, more recently, Facebook and Wikileaks. We have many more native Internet companies than we did ten years ago. But although these native Internet companies come from the edge, no single company represents the edge. Moreover, as companies scale, they become increasingly misaligned with the edge. Google, Amazon, Facebook, eBay and Yahoo, for instance, all came from edge-based innovation but no longer represent the edge. Despite intentions to the contrary, there is a natural evolutionary path through which a large company becomes less likely to let edge-based innovations flourish and more likely to preserve the status quo. The center is over-represented in Washington DC, and the edge needs a louder voice. That's up to us and, most likely, up to you as well.
Burnham advocates Barbara van Schewick's approach of banning "all application-specific discrimination." I believe this approach can work because it describes how the network works today. It is hard to know where to draw lines here, but we know what we think when a network provider discriminates against a specific application or specific content: we know it when we see it. Van Schewick proposes a generalized rule to ensure that this kind of discrimination does not happen. If you doubt this approach, read the Zedevia letter as evidence that companies hesitate to invest without clarity-companies need clearly drawn lines. How much edge-based telephony innovation have you seen on the iPhone? Not a lot. Today the list of issues and examples of discrimination is starting to grow, as the adoption of over-the-top services places pressure on the cable companies' video-based earnings and the wireless companies' voice and data earnings. Network management should be application-agnostic, and the definition of an application should include apps, sites and web services. To the extent that there are specialized services that network providers want to bring to market, they should do so-but those services need to be clearly distinguished from the open Internet.
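Van Schewick's generalized rule can be paraphrased as a simple predicate: a management practice is acceptable only if its criteria make no reference to the specific application, site or service involved. Here is a hedged sketch of that paraphrase; the Practice record and its fields are my own illustration, not her formal definition.

```python
from dataclasses import dataclass

@dataclass
class Practice:
    """A hypothetical description of a network-management practice."""
    trigger: str    # e.g. "cell congestion above 90%"
    criteria: set   # the packet/user attributes the practice keys on

# Keys that make a practice application-specific (illustrative).
APPLICATION_SPECIFIC_KEYS = {"app", "site", "service", "content"}

def is_application_agnostic(practice: Practice) -> bool:
    """True if the practice keys only on aggregate, app-blind measures
    such as total bandwidth used -- never on what is being used."""
    return practice.criteria.isdisjoint(APPLICATION_SPECIFIC_KEYS)

# Throttling all heavy users during congestion: allowed.
print(is_application_agnostic(
    Practice("congestion > 90%", {"bytes_per_user"})))  # True

# Throttling only a disfavored app (say, VoIP): barred.
print(is_application_agnostic(
    Practice("always", {"app"})))                       # False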
The argument that wireless should be treated separately from wireline is, to my mind, specious at best. While wireless network providers manage their networks differently than wireline providers do, wireless providers, like wireline providers, should not have the ability to discriminate against specific content, sites or applications.
Furthermore, application developers need uniform standards at the lower levels of the stack in order to build products and services seamlessly at the higher levels. For instance, we are currently building a social reading service that will ship as an iPad application. It includes an interface that distills content streams that should be of interest to you, the reader. The content is then displayed inline, regardless of whether it is text, images or video. Imagine you use this iPad application at home, on your home network: all images, text and videos are displayed and usable. Now imagine that you take your iPad to the park and fire up the same application over a 3G or 4G wireless connection, and all of a sudden the videos won't work. Not that they are slow-they just won't work, given the plan you are on.
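From the developer's side, that scenario looks something like the probe below: the same three fetches that succeed on one network silently fail on another. This is a minimal sketch; the example.com endpoints are placeholders I made up, and the blocking behavior is the hypothetical carrier policy described above.

```python
import urllib.request

# Hypothetical endpoints for the reading service's three content types.
ENDPOINTS = {
    "text":  "https://example.com/stream/article.json",
    "image": "https://example.com/stream/photo.jpg",
    "video": "https://example.com/stream/clip.mp4",
}

def probe(url: str, timeout: float = 5.0) -> bool:
    """Return True if the resource is reachable on the current network."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

# On an open home connection all three probes should succeed; on a
# carrier that blocks video for this plan, only the video probe fails,
# leaving the developer to write per-network special cases.
for kind, url in ENDPOINTS.items():
    print(kind, "ok" if probe(url) else "blocked")
```

The cost of that fragmentation lands on the application layer: every developer ends up re-implementing network-by-network workarounds instead of building once against a uniform stack.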
The Microsoft Antitrust trial
Since my work years ago on the Microsoft Antitrust trial, I have been an adamant believer in minimizing the role of government as it relates to innovation policy. That said, if government has a role in innovation policy, it is right here. Our business at betaworks is predicated on a thriving market for early-stage tech innovation at the content and application layers. Most of the businesses we have built or funded would not exist without the assumed freedoms that formed the platform we call the Internet.
