Network Management and the Open Internet

Here’s the video from the Arts + Labs event at George Washington U on Oct. 29th. There’s a lot of back-and-forth since this was a diverse panel. The second panel begins about halfway in. Enjoy.

Video: New Media, New Networks, presented by Arts + Labs and GSPM's Institute for Politics, Democracy and the Internet (via GSPM on Vimeo).

What Will the Internet of the Future Look Like?

If you’re in Washington, stop by the Cannon Bldg. next Monday for a discussion of the Internet of the Future:

Internet regulations pending in the United States can either facilitate or impede Internet evolution depending on detailed definitions of packet discrimination, traffic shaping, network management, and carrier business models. A panel of distinguished engineers share their views on the changes that must be made to Internet service to support such applications as pervasive networking, video-conferencing, immersive gaming, telemedicine, and the Internet of Things. Join us for a discussion of the tension between regulation and innovation in the Internet context.

Date: Monday, November 2, 2009
Time: 10:00 AM – 11:30 AM
Location: Cannon Office Building, Room 121

* Richard Bennett
Research Fellow, Information Technology and Innovation Foundation; co-inventor of the Ethernet hub and elements of the Wi-Fi protocols

* Dr. David Farber
Distinguished Professor of Computer Science and Public Policy at the School of Computer Science, Heinz College at Carnegie Mellon University

* Dr. Charles Jackson
Adjunct Professor of Electrical Engineering, George Washington University; formerly with the FCC and House Communications Subcommittee

* Dr. Jon Peha
Chief Technologist, Federal Communications Commission and Professor of Computer Science at Carnegie Mellon University

The issues we’re going to discuss will include changes to Internet architecture and operation to support new apps and the impact, positive and negative, of pending regulations.


Harold Feld’s Response to “Designed for Change”

Public Knowledge’s legal director, Harold Feld, has posted an interesting response to my paper, Designed for Change: End to End Arguments, Internet Innovation, and the Net Neutrality Debate on his personal blog, Harold Feld’s Tales of the Sausage Factory. This isn’t PK’s official response, in other words.

Harold grasps my argument tolerably well, which is kind of gratifying as the paper is more technical than the typical policy tome:

Bennett’s essential argument, if I grasp it correctly, is that certain difficulties most agree are substantial problems would be far easier to solve if we gave the network operators greater freedom to manipulate traffic. While possibly true in the abstract, I am much less convinced it will play out that way in reality. For one thing, when Comcast was forced to disclose its network management practices, it turned out that Comcast was not actually experiencing significant network congestion. Instead, it was proactively solving the fear of future network congestion by going after the top 1000 users every month and targeting what it considered the most significant applications that could cause congestion in the future. That had the virtue of cheap efficiency for Comcast, but it had significant costs to others.

Here’s my response:

Thanks for the write-up, Harold; you seem to grasp the points I tried to make in the paper extremely well. I'm trying to add some technical depth to the net neutrality discussion, not necessarily answer all the questions. And I do say in the paper that the NN debate encompasses a number of issues about equities and ownership that are far outside the skill set of engineers. I'm urging caution about being too eager to offer regulatory prescriptions that aggravate the emerging technical issues. While the Internet is 35 years old, we're facing some issues today that have never been faced before, so in some respects it might as well be only a few months old. The Internet's uses are more diverse than ever in terms of applications, and its user population is more diverse than ever before. So some regulatory frameworks that seemed sensible in the past may not have great utility in the future, and could have the effect of limiting the Internet's utility as an engine of free speech and political organizing.

We already have a situation on our hands where effective video distribution requires the use of a CDN like Akamai or YouTube, and even YouTube doesn't deliver consistently good streaming quality. There are underlying technical issues that cause this to be the case, and we can't resolve them merely by clamping down on the ISPs. Developing sensible two-way communication between national telecom regulators such as the FCC and its counterparts and the IETF may help move the ball down the road. Adding services to the network core in a principled and sound way should actually increase the value of the Internet for users as well as operators.
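To put a little arithmetic behind the streaming-quality point, here's a toy playback-buffer model; the bitrate, buffer size, and throughput trace are purely illustrative assumptions, not measurements of any real service, but they show how a brief dip in delivered throughput below a stream's encoding rate turns into rebuffering, which is exactly the problem CDNs try to mask by moving content closer to the viewer.

```python
# Toy playback-buffer model; all numbers are illustrative assumptions.
# The player drains its buffer at the video's encoding rate and refills it
# at whatever throughput the network actually delivers each second.

ENCODING_RATE_MBPS = 2.5      # assumed stream bitrate
BUFFER_CAP_S = 2.0            # assume the player keeps at most 2 s of video buffered
throughput_trace_mbps = [4.0, 4.0, 4.0, 4.0, 0.5, 0.5, 0.5, 4.0, 4.0, 4.0]

buffer_s = BUFFER_CAP_S       # start playback with a full buffer
stall_s = 0.0

for throughput in throughput_trace_mbps:
    fetched = throughput / ENCODING_RATE_MBPS               # seconds of video fetched this second
    buffer_s = min(buffer_s + fetched - 1.0, BUFFER_CAP_S)   # playback drains 1 s of video per second
    if buffer_s < 0:
        stall_s += -buffer_s                                 # simplified: the deficit shows up as a freeze
        buffer_s = 0.0

avg = sum(throughput_trace_mbps) / len(throughput_trace_mbps)
print(f"Average throughput {avg:.2f} Mbps vs. encoding rate {ENCODING_RATE_MBPS} Mbps")
print(f"Rebuffering during {len(throughput_trace_mbps)} s of playback: {stall_s:.1f} s")
```

Even though this made-up link averages above the encoding rate over the window, three seconds of congestion still freeze the player, which is why average bandwidth figures say so little about streaming quality.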


Designed for Change

I released a report on the Internet’s technical history Friday at the ITIF World Headquarters in Washington, DC. Thanks to a great turnout and a fantastic panel of very smart people commenting on the paper, we kicked off a lively new thread in the Net Neutrality discussion, so the launch was a success.

FCC Chairman Genachowski helped create interest in the report by unveiling his net neutrality program earlier in the week. I take issue with the Chairman’s notion that the Internet is fair to all applications; explaining why leads to a discussion of the weakness of the net neutrality arguments and the need to keep innovation alive in the network itself. You can download the report and get a video of the discussion by clicking this link to the ITIF’s web site. Enjoy.


Net Neutrality Regulations Coming

In FCC Chairman Genachowski’s long-anticipated statement on net neutrality rulemaking today, the Chairman made the claim that the Internet architecture is both unbiased and future-proof. However, as ITIF notes in a forthcoming report, “Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate”, the Internet’s architecture doesn’t make it future-proof; the process of experimentation and continual improvement does, and rulemaking can seriously jeopardize Internet flexibility unless it’s undertaken with great care. In addition, it’s important to note that the Internet has always preferred some applications over others; it favors content over communication, for example. Network management is necessary as a means to overcome the Internet’s structural bias, so strict rules limiting network management to the mitigation of spam, malware, and attacks are not good enough. Carriers must be empowered to enable communications applications to compete equitably with content applications; only the carriers can provide fair access to diverse applications and users.
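As a rough illustration of that structural bias, consider a single congested access link shared by a voice call and a bulk download. The sketch below is my own simplified arithmetic with assumed link speed, packet sizes, and queue depth, not anything from the report, but it shows why small, latency-sensitive packets fare badly in a plain first-come-first-served queue and how a priority scheduler, one of the basic management tools at issue, changes the picture.

```python
# A deliberately simple, static illustration: one voice packet arrives at a
# link whose queue already holds a burst of bulk-transfer packets.
# All numbers are illustrative assumptions, not measurements.

LINK_MBPS = 10.0            # assumed access-link speed
BULK_PACKET_BYTES = 1500    # typical full-size data packets
VOICE_PACKET_BYTES = 200    # small VoIP packet
QUEUED_BULK_PACKETS = 40    # bulk packets already waiting in the queue

def transmit_ms(nbytes: int) -> float:
    """Serialization time for nbytes on the link, in milliseconds."""
    return nbytes * 8 / (LINK_MBPS * 1_000_000) * 1000

# FIFO: the voice packet waits behind every queued bulk packet.
fifo_delay = QUEUED_BULK_PACKETS * transmit_ms(BULK_PACKET_BYTES) + transmit_ms(VOICE_PACKET_BYTES)

# Strict priority: the voice packet waits at most for the bulk packet
# currently on the wire, then goes next.
priority_delay = transmit_ms(BULK_PACKET_BYTES) + transmit_ms(VOICE_PACKET_BYTES)

print(f"FIFO queueing delay for the voice packet:     {fifo_delay:.1f} ms")
print(f"Priority queueing delay for the voice packet: {priority_delay:.2f} ms")
```

Under these assumptions the voice packet eats roughly fifty milliseconds of queueing delay on a single hop in the FIFO case, a large bite out of a voice call's latency budget, while the bulk transfer barely notices being scheduled behind it; that asymmetry is what I mean by the network favoring content over communication.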

The approach to Internet regulation that focuses exclusively on the rights of consumers and the responsibilities of carriers obscures the fact that the Internet invests substantial network control at the intelligent edge; the Internet gives each of us the power to be a producer as well as a consumer, and with that power comes responsibility. We can innovate without permission, but we all have to behave responsibly. It goes without saying that open access networks are desirable, so the real test of the FCC’s rulemaking will come from its assessment of both user behavior and operator management practices. We have every confidence that the Commission will undertake a serious, rigorous and fact-based rulemaking. The Internet enables innovation to the extent that carriers provide robust and reliable transport services to applications; if this capability is preserved and enhanced by a sensible network management framework, innovation will win.

New Broadband Czar

Trusted sources tell me Blair Levin is headed back to the FCC to be the Commissar of the People’s Glorious Five Year Plan for the Production of Bandwidth. He’d be a wonderful choice, of course, because he’s a bright and humorous fellow with no particular delusions about what he knows and what he doesn’t know.

I haven’t been enthusiastic about this National Broadband Plan business myself, but if we’re going to have one, we’re going to have one, and it should be the best one on the planet. And no, that doesn’t mean that the object of the exercise is for America’s broadband users to have big foam number 1 fingers, it means we do something sensible with the people’s tax dollars.

The plan should figure out a meaningful way to measure progress, and it should fund some of the efforts to create the next-generation network that will one day supersede the TCP/IP Internet. We all love TCP/IP, mind you, but it’s a 35-year-old solution to a problem we understand a lot better today than we did in 1974. We’ll get a chance to see just how much vision the New FCC has by their reaction to this proposal.

UPDATE: Press reports are dribbling out about the appointment.

Finally, nominees for the FCC

Amy Schatz of the WSJ reports that a deal has been struck to move the new nominees into the FCC:

Work has slowed to a crawl at the Federal Communications Commission, since President Barack Obama’s pick to be chairman, Julius Genachowski, is still awaiting Senate confirmation.

But the logjam could be broken soon: Republicans appear to have settled on two people to fill the GOP seats on the five-member board, paving the way for a confirmation hearing in June. Senate Republicans have agreed on former Commerce Department official Meredith Attwell Baker and current FCC Commissioner Robert McDowell, officials close to the process say.

This is good news. McDowell has been the best of the FCC commissioners since his appointment, and allowing him a second term is a very bright move. Uncertainty over McDowell’s future was the cause of the slowdown in confirmation hearings, since these things go forward with the whole slate of nominees. So the new FCC is going to look like this:

Chairman Genachowski, new blood
Dem Copps, old hand
Dem Mignon Clyburn, new blood
Rep McDowell
Rep Meredith Baker, new blood

It’s interesting that Baker and Clyburn are both nepotism candidates, as Clyburn is the daughter of powerful Congressman James Clyburn and Baker is the daughter-in-law of the Bush family’s consigliere, James Baker. That’s not necessarily a bad thing, as the best Chairman of recent times was Colin Powell’s son, and neither nominee is particularly unqualified. But if you want to get a laugh out of Blair Levin, the former “sixth commissioner” who wasn’t nominated, tell him you understand that he’s not qualified to serve on the FCC because his daddy’s not in politics. You won’t get a laugh exactly, more like a moan.

The first item of business for the nominees, once they’re confirmed, will be the list of 120 questions Copps put to the world. Good luck to the Commission with that.

FCC Comments due in National Broadband Plan

See IEEE Spectrum for a few observations on the FCC’s request for comments on the National Broadband Plan:

Comments are due Monday, June 8, at the FCC on the National Broadband Plan (NBP). The Notice of Inquiry lists some 120 questions that the Commission would like filers to address, running the gamut from goals and benchmarks to open access to privacy to entrepreneurial activity to job creation. Anyone who compiles a list of so many questions clearly hasn’t given much thought to the problem under discussion, so it’s clear upon reading the NOI that we’re many years away from a good NBP, although we may have some vague and probably counter-productive guidelines much sooner: the FCC is supposed to report a plan to Congress by next February. Bear in mind that it took the US 20 years to convert from analog to digital TV, and we’re not even there yet.

There’s more.

At long last, Genachowski

The long-awaited nomination of Julius Genachowski to the FCC chair finally came to pass yesterday, raising questions about the delay. If everybody with an interest in telecom and Internet regulation knew he was the choice months ago, why did the official announcement take so long? I have no inside information, so I’ll leave it to those who do to enlighten us on that question. Perhaps the Administration was just being extra-cautious after the debacles around a Commerce Secretary and others.

Neutralists are excited about the choice, naturally, as they view Genachowski as one of their own. And indeed, if network neutrality were actually a coherent policy and not just a rag-tag collection of Christmas wishes, they would have cause to be exhilarated. But given the range of restrictions that the movement seeks, it’s less than clear that any particular raft of regulations would satisfy them and leave broadband networks the ability to function, so we’ll see how this pans out. We’re already hearing rumblings from Boucher that there may not be any Congressional action on network neutrality this year in any case.

Genachowski brings an interesting (and potentially very dangerous) set of qualifications to the job. A college buddy of the President, he’s an inner circle member with the power to wield enormous influence. As a former FCC staffer, he’s imbued with the Agency’s culture, and as a former venture capitalist funding fluffy applications software, he’s something of a tech buff. But he resembles Kevin Martin in most of the important respects: he’s a Harvard lawyer who’s worked inside the regulatory system for most of his life, and he has strong alliances with an industry that seeks to exercise control over the nation’s network infrastructure for its own purposes. Whether those purposes resemble the public interest remains to be seen.

The largest problem with the FCC and similar agencies is the knowledge gap between regulators and the modern broadband networks that are the subject of their regulatory power. Martin didn’t have the training to appreciate the effect that his orders would have on the infrastructure, and neither does Genachowski. So the new Chairman is just as likely as the old chairman to make things worse while trying to make them better.

In a perfect world, the commissioners would be able to rely on the expert judgment of the Chief Technologist to stay out of trouble, but the current occupant of that job, Jon Peha, has a penchant for playing politics that renders him ineffective. The bizarre, quixotic inquiry the FCC made recently into the quality of service variations between Comcast’s voice service and over-the-top VoIP is an example. This isn’t a serious line of inquiry for a serious Commission, and Peha never should have let it happen. But it did, and that fact should remind us that the FCC is more a creature of politics than of technology.

DTV Transition Starts, World Doesn’t End

Contrary to the expectations of Congress and the FCC, the first phase of the DTV transition took place without major incident. Some 23% of American TV stations stopped sending out analog signals Tuesday at midnight, and only 28,000 calls came into the centers the FCC and the cable and satellite providers have established for transition help. The biggest category of call, close to half of all calls, was from people unable to pick up the digital broadcasts at all, or picking them up with very poor quality. A significant number didn’t know how to set up their converter boxes, or didn’t realize that the converter boxes have to scan for channels.

These numbers support a suspicion I’ve had for a while now, that the emphasis on converter boxes is misplaced. The problem that most people are going to have is a complete inability to receive digital broadcasts at all, because they don’t have the right kind of antenna, the antenna isn’t oriented properly, or because they live in the wrong place. Many stations are moving transmitter locations to alter service areas, and won’t be serving some traditional customers any more. Others are reducing power, sometimes quite substantially. Digital broadcasts are more robust, so some reduction in power is quite sensible. But I suspect that over-the-air delivery of TV is such a small percentage of the overall market – well below 20%, and in some areas less than 10% – that it doesn’t make financial sense for stations to invest heavily in high power transmitters.
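For a sense of how much a power cut matters at the fringe of a service area, here's a back-of-the-envelope sketch using a simple power-law path-loss model; the power ratio and exponents are illustrative assumptions rather than any station's actual figures, but they suggest why viewers near the edge of the old analog contour can lose the signal entirely when a station drops its transmitter power or moves its antenna.

```python
# Back-of-the-envelope only: how much does a transmitter power cut shrink
# the coverage radius, assuming a simple power-law path-loss model
# (received power falls off as distance**n)?  The power ratio and the
# exponents below are illustrative assumptions, not real station data.

def radius_scaling(power_ratio: float, path_loss_exponent: float) -> float:
    """Fraction of the original coverage radius retained after a power cut."""
    return power_ratio ** (1.0 / path_loss_exponent)

POWER_RATIO = 0.25   # hypothetical: new ERP is one quarter of the old ERP

for n in (2.0, 3.5):  # free space vs. a rougher terrestrial assumption
    frac = radius_scaling(POWER_RATIO, n)
    print(f"path-loss exponent {n}: coverage radius shrinks to {frac:.0%} of original")
```

Under these assumptions a station running at a quarter of its old power keeps only half to two-thirds of its old radius, which is consistent with the complaints from fringe viewers who can't pick up the digital broadcasts at all.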

The timing of the transition was very bad for this reason. A substantial number of OTA TV viewers are going to need upgrades to roof-mounted antennas, and in many cases they’re going to need multiple antennas pointing in different directions. Getting up on a roof in February is not a pleasant experience in much of America, so a May or June transition date would have been much more sensible. In any event, it’s a good time to buy stock in antenna companies.

I’ve been doing some experiments with roof-mounted antennas that I’ll be reporting on shortly. So far, I can only get five stations where I live, four of which broadcast in Spanish. Perhaps the FCC needs a budget for bilingual education as well as for converter boxes and antennas.