FCC Broadband Deployment Research workshop

The long-awaited video of the FCC’s December 10 workshop, Review and Discussion of Broadband Deployment Research, is finally online. This workshop featured discussions of Yochai Benkler’s controversial Berkman Center report on unbundled DSL and Bob Atkinson’s report on current broadband investment dynamics in the US. As the FCC put it:

As part of the Commission’s development of the National Broadband Plan, the Commission has requested two independent studies. The Commission asked Harvard University’s Berkman Center for Internet and Society to conduct an expert review of existing literature and studies about broadband deployment and usage throughout the world. The Columbia Institute for Tele-Information (“CITI”), based at the Columbia Business School in New York, conducted an independent outside expert review of projected deployment of new and upgraded broadband networks.

Benkler’s report was very politely dismantled by Tom Hazlett, an actual economist who knows a thing or two about how Benkler cooked the books (intentionally or by bungling) and about the relevant comparisons for the US. One of the many problems with the report is that Benkler didn’t do what the FCC asked him to do, which was simply to review the literature on international broadband policies. Instead, he and his Berkman colleagues tried to aggregate all the data into a giant meta-study, violating the FCC’s “no original research” rule, one that should have been familiar to him given his fascination with Wikipedia.


Steal These Policies

ITIF released a report today on digital piracy, Steal These Policies: Strategies for Reducing Digital Piracy co-authored by Dan Castro, Scott Andes, and yours truly. Here’s the blurb:

It is time for the U.S. government to take global theft of U.S. intellectual property, especially digital content, much more seriously. A new ITIF report finds that the U.S. government can and should do more to support industry efforts to reduce digital piracy, a growing problem that threatens not only the robust production of digital content, but U.S. jobs. While there are no “silver bullets” to reducing digital piracy, there are a number of “lead bullets” that can and should be implemented. Specifically, ITIF calls on the federal government to not preclude those impacted by digital piracy, including copyright holders and ISPs, from taking steps, including implementing technical controls like digital fingerprinting, to reduce piracy. In addition, industry and government should consider bold steps to limit the revenue streams of those profiting from piracy by encouraging ISPs, search engines, ad networks and credit card companies to block piracy websites and refuse to do business with them. These options should be part of a broad dialogue that engages all stakeholders, including government, content owners, website operators, technology developers, and ISPs and other intermediaries, on how to improve the global response to piracy. Toward that end, this report recommends that policymakers:

And here’s the video of the launch event:

One point that comes across better from the live event than from the paper is that piracy isn’t simply something that takes place between close personal friends; it’s a business that profits from the unauthorized sale of other people’s material. Whatever your views on Internet privacy and intellectual property rights may be, I think we can all agree that the business of piracy is wrong.


The Hippie who Hooked-up South Africa

Have you ever wondered how South Africa got connected to the Internet? It happened during the bleak days of apartheid, thanks to the valiant efforts of self-proclaimed hippie Randy Bush:

I suppose you are wondering what a computer scientist, engineer, and unrepentant hippie is doing at this lectern today. Well, I am also wondering the same. So I guess the best I can do with this honor and opportunity is to tell you about why I chose to do certain things and the small but occasionally pungent lessons I have taken away from these experiences.

Not everyone was willing to break the boycott in those days, but Bush had his reasons:

Well, I had been raised to boycott all dealings with South Africa, as well as Franco’s Spain, Salazar’s Portugal, and other international pariah states. And I was being asked to directly support South Africa’s entry into the internet. Serious soul-searching led me to the conclusion that social change was not likely to be accomplished by cutting off communication. So I agreed on the condition that connectivity would be for universities and NGOs only, and only those which were not apartheid-supporting or enforcing. The administrative work and funding from the South African side was done by Vic Shaw of the FRD. In November 1991, a bit over ten years ago, the first direct full internet connectivity to South Africa (as opposed to store and forward email) was commissioned via a low speed leased line to my home office in the States. South Africa was the second country in Africa to become connected to the internet, preceded by Tunisia a few months earlier.

That’s quite an interesting legacy. Bush currently works for the Japanese government and volunteers with various non-profits.

UPDATE: Reader Andrew Alston says the credit doesn’t properly fall on Bush:

Randy Bush might have been involved, but he is DEFINITELY not the father of the South African internet, if that title goes to anyone its Mike Lauwrie from back in the Rhodes University days.

There you are, two points of view from which to choose.

Guest Blog at GigaOm

My guest blog at GigaOm, How Video Is Changing the Internet, deals with paid peering and the proposed net neutrality regulations:

But paid peering may be forbidden by Question 106 of the FCC’s proposed Open Internet rules because it’s essentially two-tiered network access, Norton points out.

Paid peering illustrates how hard it is to write an anti-discrimination rule for the Internet that doesn’t have harmful side effects for all but the largest content networks. Paid peering gives a content network better access to an ISP’s customers for a fee, but the fee is less than the price of generic access to the ISP via a transit network. The practice also reduces the load on the Internet core, so what’s not to like? Paid peering agreements should be offered for sale on a non-discriminatory basis, but they certainly shouldn’t be banned.
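The economics can be sketched with a toy calculation. The per-Mbps prices and traffic figure below are illustrative assumptions, not numbers from the post; the point is only that paid peering can undercut transit while still generating revenue for the ISP:

```python
# Toy comparison of transit vs. paid peering costs for a content network
# delivering traffic to an ISP's customers. All prices are hypothetical.

def monthly_cost(traffic_mbps: float, price_per_mbps: float) -> float:
    """Simple linear cost model: traffic volume times unit price."""
    return traffic_mbps * price_per_mbps

TRAFFIC_MBPS = 10_000        # assumed 95th-percentile traffic toward the ISP
TRANSIT_PRICE = 3.00         # assumed $/Mbps via a transit network
PAID_PEERING_PRICE = 1.50    # assumed $/Mbps for direct paid peering

transit = monthly_cost(TRAFFIC_MBPS, TRANSIT_PRICE)
peering = monthly_cost(TRAFFIC_MBPS, PAID_PEERING_PRICE)

print(f"Transit:      ${transit:,.2f}/month")
print(f"Paid peering: ${peering:,.2f}/month")
print(f"Savings:      ${transit - peering:,.2f}/month")
```

Under these assumed prices the content network pays half as much while getting a shorter path to the ISP’s customers, which is why the arrangement appeals to both sides.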

There’s another good treatment of the subject at Digital Society, inspired by the same conversation with peering maven Bill Norton.

UPDATE: There’s an incredible whine-a-thon in the comments to this article from Google’s Director of Network Operations Vijay Gill and some of his friends from a network operators’ IRC channel. Gill says I’ve got the facts wrong because paid peering existed in a limited way ten years ago under a different name. I don’t dispute that; I simply note that the practice could run afoul of net neutrality regulations in some of their proposed forms. The issue is whether the Internet of the Future will be a slave to the Internet of the Past’s supposed insistence on a single service level for all peering agreements, not whether such a rule has ever actually existed.

UPDATE 2: One thing I was definitely unclear about is whether Arbor’s estimate of traffic growth, 47%, is in line with the MINTS estimates. I conclude that overall growth is much higher than the MINTS figure because Arbor measures only inter-domain traffic at Internet exchanges. There has obviously been a great deal of growth in the Akamai and Limelight CDNs, neither of which is measured by MINTS or Arbor, and in private peering (paid and unpaid) as well. MINTS measures more than public IX traffic, yet its figures are in line with Arbor’s public-only data; the difference in method combined with the similarity of results suggests that MINTS may be understating total inter-domain traffic, depending on how the load divides between public and private interconnects. Private connections are increasing, according to IX operators and heavy users.
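The reasoning about understated growth can be illustrated with a traffic-weighted blend: if privately carried traffic (CDNs, private peering) grows faster than the publicly measured traffic, overall growth exceeds the public-only figure. The traffic shares and the 80% private growth rate below are assumptions for illustration only; 47% is the Arbor-style public figure mentioned above:

```python
# Toy model: overall inter-domain growth as a traffic-weighted blend of
# publicly measured (IX) growth and privately carried (CDN/private
# peering) growth. All shares and the private growth rate are assumed.

def blended_growth(segments):
    """Weighted average growth across (traffic_share, growth_rate) pairs."""
    total_share = sum(share for share, _ in segments)
    assert abs(total_share - 1.0) < 1e-9, "traffic shares must sum to 1"
    return sum(share * rate for share, rate in segments)

segments = [
    (0.5, 0.47),  # public IX traffic growing ~47%/yr (Arbor-style figure)
    (0.5, 0.80),  # private/CDN traffic growing faster (assumed 80%/yr)
]

print(f"Blended annual growth: {blended_growth(segments):.1%}")
```

Any measurement confined to the public segment will report the lower rate, which is the sense in which a MINTS-style figure could understate the total.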


What Will the Internet of the Future Look Like?

If you’re in Washington, stop by the Cannon Bldg. next Monday for a discussion of the Internet of the Future:

Internet regulations pending in the United States can either facilitate or impede Internet evolution depending on detailed definitions of packet discrimination, traffic shaping, network management, and carrier business models. A panel of distinguished engineers will share their views on the changes that must be made to Internet service to support such applications as pervasive networking, video-conferencing, immersive gaming, telemedicine, and the Internet of Things. Join us for a discussion of the tension between regulation and innovation in the Internet context.

Date: Monday, November 2, 2009
Time: 10:00 AM – 11:30 AM
Location: Cannon Office Building, Room 121

* Richard Bennett
Research Fellow, Information Technology and Innovation Foundation; co-inventor of the Ethernet hub and elements of the Wi-Fi protocols

* Dr. David Farber
Distinguished Professor of Computer Science and Public Policy at the School of Computer Science, Heinz College at Carnegie Mellon University

* Dr. Charles Jackson
Adjunct Professor of Electrical Engineering, George Washington University; formerly with the FCC and House Communications Subcommittee

* Dr. Jon Peha
Chief Technologist, Federal Communications Commission and Professor of Computer Science at Carnegie Mellon University

The issues we’re going to discuss will include changes to Internet architecture and operation to support new apps and the impact, positive and negative, of pending regulations.


Harold Feld’s Response to “Designed for Change”

Public Knowledge’s legal director, Harold Feld, has posted an interesting response to my paper, Designed for Change: End to End Arguments, Internet Innovation, and the Net Neutrality Debate on his personal blog, Harold Feld’s Tales of the Sausage Factory. This isn’t PK’s official response, in other words.

Harold grasps my argument tolerably well, which is kind of gratifying as the paper is more technical than the typical policy tome:

Bennett’s essential argument, if I grasp it correctly, is that certain difficulties most agree are substantial problems would be far easier to solve if we gave the network operators greater freedom to manipulate traffic. While possibly true in the abstract, I am much less convinced it will play out that way in reality. For one thing, when Comcast was forced to disclose its network management practices, it turned out that Comcast was not actually experiencing significant network congestion. Instead, it was proactively solving the fear of future network congestion by going after the top 1000 users every month and targeting what it considered the most significant applications that could cause congestion in the future. That had the virtue of cheap efficiency for Comcast, but it had significant costs to others.

Here’s my response:

Thanks for the write-up Harold, you seem to grasp the points I tried to make in the paper extremely well. I’m trying to add some technical depth to the net neutrality discussion, not necessarily answer all the questions. And I do say in the paper that the NN debate encompasses a number of issues about equities and ownership that are far outside the skill set of engineers. I’m urging caution about being too eager to offer regulatory prescriptions that aggravate the emerging technical issues. While the Internet is 35 years old, we’re facing some issues today that have never been faced before, so in some respects it might as well be only a few months old. The Internet’s uses are more diverse than ever in terms of applications, and its user population is more diverse than ever before. So some regulatory frameworks that seemed sensible in the past may not have great utility in the future, and could have the effect of limiting the Internet’s utility as an engine of free speech and political organizing.

We already have a situation on our hands where effective video distribution requires the use of a CDN like Akamai or YouTube, and even YouTube doesn’t deliver consistently good streaming quality. There are underlying technical issues that cause this to be the case, and we can’t resolve them merely by clamping down on the ISPs. Developing sensible two-way communication between national telecom regulators such as the FCC and its counterparts and the IETF may help move the ball down the field. Adding services to the network core in a principled and sound way should actually increase the value of the Internet for users as well as operators.


Designed for Change

I released a report on the Internet’s technical history Friday at the ITIF World Headquarters in Washington, DC. Thanks to a great turnout and a fantastic panel of very smart people commenting on the paper, we kicked off a lively new thread in the Net Neutrality discussion, so the launch was a success.

FCC Chairman Genachowski helped create interest in the report by unveiling his net neutrality program earlier in the week. I take issue with the Chairman’s notion that the Internet is fair to all applications; explaining why leads to a discussion of the weakness of the net neutrality arguments and the need to keep innovation alive in the network itself. You can download the report and get a video of the discussion by clicking this link to the ITIF’s web site. Enjoy.
