Here’s the video from the Arts + Labs event at George Washington U on Oct. 29th. There’s a lot of back-and-forth since this was a diverse panel. The second panel begins about halfway in. Enjoy.
Here’s the video of our Capitol Hill discussion of the Internet of the Future.
Public Knowledge’s legal director, Harold Feld, has posted an interesting response to my paper, Designed for Change: End to End Arguments, Internet Innovation, and the Net Neutrality Debate on his personal blog, Harold Feld’s Tales of the Sausage Factory. This isn’t PK’s official response, in other words.
Harold grasps my argument tolerably well, which is kind of gratifying as the paper is more technical than the typical policy tome:
Bennett’s essential argument, if I grasp it correctly, is that certain difficulties most agree are substantial problems would be far easier to solve if we gave the network operators greater freedom to manipulate traffic. While possibly true in the abstract, I am much less convinced it will play out that way in reality. For one thing, when Comcast was forced to disclose its network management practices, it turned out that Comcast was not actually experiencing significant network congestion. Instead, it was proactively solving the fear of future network congestion by going after the top 1000 users every month and targeting what it considered the most significant applications that could cause congestion in the future. That had the virtue of cheap efficiency for Comcast, but it had significant costs to others.
Here’s my response:
Thanks for the write-up, Harold; you seem to grasp the points I tried to make in the paper extremely well. I’m trying to add some technical depth to the net neutrality discussion, not necessarily answer all the questions. And I do say in the paper that the NN debate encompasses a number of issues about equities and ownership that are far outside the skill set of engineers. I’m urging caution about being too eager to offer regulatory prescriptions that aggravate the emerging technical issues. While the Internet is 35 years old, we’re facing some issues today that have never been faced before, so in some respects it might as well be only a few months old. The uses of the Internet are more diverse today, in terms of both applications and user population, than ever before. So some regulatory frameworks that seemed sensible in the past may not have great utility in the future, and could have the effect of limiting the Internet’s utility as an engine of free speech and political organizing.
We already have a situation on our hands where effective video distribution requires the use of a CDN like Akamai or YouTube, and even YouTube doesn’t deliver consistently good streaming quality. There are underlying technical issues that cause this to be the case, and we can’t resolve them merely by clamping down on the ISPs. Developing sensible two-way communication between national telecom regulators such as the FCC and its counterparts and the IETF may help move the ball down the field. Adding services to the network core in a principled and sound way should actually increase the value of the Internet for users as well as operators.
In FCC Chairman Genachowski’s long-anticipated statement on net neutrality rulemaking today, the Chairman made the claim that the Internet architecture is both unbiased and future-proof. However, as ITIF notes in a forthcoming report, “Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate”, the Internet’s architecture doesn’t make it future-proof; the process of experimentation and continual improvement does. Rulemaking can seriously jeopardize Internet flexibility unless it’s undertaken with great care. In addition, it’s important to note that the Internet has always preferred some applications over others; it favors content over communication, for example. Network management is necessary as a means to overcome the Internet’s structural bias, so strict rules limiting network management to the mitigation of spam, malware, and attacks are not good enough. Carriers must be empowered to enable communications applications to compete equitably with content applications; only the carriers can provide fair access to diverse applications and users.
The approach to Internet regulation that focuses exclusively on the rights of consumers and the responsibilities of carriers ignores the fact that the Internet invests substantial network control at the intelligent edge; the Internet gives each of us the power to be a producer as well as a consumer, and with that power comes responsibility. We can innovate without permission, but we all have to behave responsibly. It goes without saying that open access networks are desirable, so the real test of the FCC’s rulemaking will come from its assessment of both user behavior and operator management practices. We have every confidence that the Commission will undertake a serious, rigorous, and fact-based rulemaking. The Internet enables innovation to the extent that carriers provide robust and reliable transport services to applications; if this capability is preserved and enhanced by a sensible network management framework, innovation will win.
San Jose Mercury News columnist Troy Wolverton engaged in a bit of nostalgia in Friday’s paper. He pines for the Golden Age of dial-up Internet access, when Internet users had a plethora of choices:
A decade ago, when dial-up Internet access was the norm, you could choose from dozens of providers. With so many rivals, you could find Internet access at a reasonable price all by itself, without having to buy a bundle of other services with it.
There was competition because regulators forced the local phone giants to allow such services on their networks. But regulators backed away from open-access rules as the broadband era got under way. While local phone and cable companies could permit other companies to use their networks to offer competing services, regulators didn’t require them to do so and cable providers typically didn’t.
Wolverton’s chief complaint is that the DSL service he buys from Earthlink is slow and unreliable. He acknowledges that he could get cheaper service from AT&T and faster service from Comcast, but doesn’t choose to switch because he doesn’t want to “pay through the nose.”
The trouble with nostalgia is that the past never really was as rosy as we tend to remember it, and the present is rarely as bad as it appears through the lens of imagination. Let’s consider the facts.
Back in the dial-up days, there were no more than three first-class ISPs in the Bay Area: Best Internet, Netcom, and Rahul. They charged $25-30/month, on top of the $15-20 we also paid for a phone line dedicated to Internet access; we didn’t want our friends to get a busy signal when we were on-line. So we paid roughly $45/month to access the Internet at 40 Kb/s downstream and 14 Kb/s or so upstream.
Now that the nirvana of dial-up competition (read: several companies selling Twinkies and nobody selling steak) has ended, what can we get for $45/month? One choice in the Bay Area is Comcast, which will gladly provide you with a 15 Mb/s service for a bit less than $45 ($42.95 after the promotion ends) or a 20 Mb/s service for a bit more ($52.95). If this is “paying through the nose,” then what were we doing when we paid the same prices for 400 times less performance back in the Golden Age? And if you don’t want or need this much speed, you can get reasonable DSL-class service from a number of ISPs that’s 40 times faster than dial-up at roughly half the price.
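The arithmetic behind the “400 times” claim is easy to check. Here’s a quick sketch using the prices and speeds quoted above (the per-megabit figures are my own back-of-the-envelope calculation):

```python
# Sanity check of the dial-up vs. broadband comparison in the text.
# Prices are monthly; speeds are downstream, in Kb/s.
dialup_price, dialup_kbps = 45.00, 40       # ~$45/mo for ~40 Kb/s dial-up
cable_price, cable_kbps = 42.95, 15_000     # Comcast 15 Mb/s tier

# Raw performance ratio: 15,000 / 40 = 375, i.e. "roughly 400 times"
speedup = cable_kbps / dialup_kbps
print(f"{speedup:.0f}x faster")             # prints "375x faster"

# Price per megabit, then and now
print(f"dial-up: ${dialup_price / (dialup_kbps / 1000):,.2f} per Mb/s")
print(f"cable:   ${cable_price / (cable_kbps / 1000):,.2f} per Mb/s")
```

On a price-per-megabit basis, that works out to over $1,100 per Mb/s in the dial-up era versus under $3 per Mb/s today.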
Wolverton’s column is making the rounds of the Internet mailing lists and blogs where broadband service is discussed, to mixed reviews. Selective memory fails to provide a sound basis for broadband policy, and that’s really all that Wolverton provides.
From House Energy and Commerce:
Energy and Commerce Subcommittee Hearing on “Behavioral Advertising: Industry Practices and Consumers’ Expectations”
June 16, 2009
The Subcommittee on Communications, Technology and the Internet and the Subcommittee on Commerce, Trade, and Consumer Protection will hold a joint hearing titled, “Behavioral Advertising: Industry Practices and Consumers’ Expectations” on Thursday, June 18, 2009, in 2123 Rayburn House Office Building. The hearing will examine the potential privacy implications of behavioral advertising.
* Jeffrey Chester, Executive Director, Center for Digital Democracy
* Scott Cleland, President, Precursor LLC
* Charles D. Curran, Executive Director, Network Advertising Initiative
* Christopher M. Kelly, Chief Privacy Officer, Facebook
* Edward W. Felten, Professor of Computer Science and Public Affairs, Princeton University
* Anne Toth, Vice President of Policy, Head of Privacy, Yahoo! Inc.
* Nicole Wong, Deputy General Counsel, Google Inc.
WHEN: 10:00 a.m. on Thursday, June 18
WHERE: 2123 Rayburn House Office Building
This is the second in a series of hearings on the subject of behavioral advertising. I’ll predict that the Democrats will praise Google, the Republicans will criticize them, and nobody will pay much attention to Yahoo.
I only know four of the seven witnesses personally; I need to get out more.
Trusted sources tell me Blair Levin is headed back to the FCC to be the Commissar of the People’s Glorious Five Year Plan for the Production of Bandwidth. He’d be a wonderful choice, of course, because he’s a bright and humorous fellow with no particular delusions about what he knows and what he doesn’t know.
I haven’t been enthusiastic about this National Broadband Plan business myself, but if we’re going to have one, we’re going to have one, and it should be the best one on the planet. And no, that doesn’t mean that the object of the exercise is for America’s broadband users to have big foam number 1 fingers, it means we do something sensible with the people’s tax dollars.
The plan should figure out a meaningful way to measure progress, and it should fund some of the efforts to create the next-generation network that will one day supersede the TCP/IP Internet. We all love TCP/IP, mind you, but it’s a 35-year-old solution to a problem we understand a lot better today than we did in 1974. We’ll get a chance to see just how much vision the New FCC has by their reaction to this proposal.
The New York Times reports that regulators have an interest in the structure of the Apple and Google boards of directors:
The Federal Trade Commission has begun an inquiry into whether the close ties between the boards of two of technology’s most prominent companies, Apple and Google, amount to a violation of antitrust laws, according to several people briefed on the inquiry.
I doubt this will go very far, as the interlocking directors (Eric Schmidt and former Genentech CEO Arthur Levinson) will simply resign before any enforcement action is imminent, but it does raise some interesting questions about the market for mobile phone operating systems, currently split between Apple, Google, Microsoft, Palm, and a few others. These systems are rife with limitations, each of which could be considered a network neutrality violation when viewed in just the right way.
I imagine Apple itself might wish to give Dr. Schmidt his walking papers before he becomes an anti-trust problem, which he actually isn’t at this point. The FTC’s interest in this obscure situation is probably a signal that the Administration wants to be viewed as an anti-trust hawk without doing anything substantial.
But this is what the law calls an “occasion of sin.” Dear me.
Here’s the video of the panel I was on at the Congressional Internet Caucus Advisory Committee’s “State of the Mobile Net” conference in DC last Thursday. This was the closing panel of the conference, where all the loose ends were tied together. For those who don’t live and breathe Washington politics, I should do what moderator Blair Levin didn’t do and introduce the panel. Levin was the head of the TIGR task force for the Obama transition, the master group for the review of the regulatory agencies and the administration’s use of technology. Kevin Werbach is a professor at the Wharton School, and took part in the FCC review for the transition along with Susan Crawford. He runs the Supernova conference. Larry Irving was part of the review of NTIA for the transition, and is a former Assistant Secretary of Commerce. Ben Scott is the policy guy at Free Press, and Alex Hoehn-Saric is legal counsel to the Senate Committee on Commerce, Science and Transportation.
Regulatory policy needs to be technically grounded, so I emphasized the tech side of things.
While California was sleeping, I enjoyed a bit of broadband politics in the heart of the beast, testifying before the House Subcommittee on Communications, Technology, and the Internet at its hearing on “Communications Networks and Consumer Privacy: Recent Developments.”
The Subcommittee on Communications, Technology, and the Internet held a hearing titled, “Communications Networks and Consumer Privacy: Recent Developments” on Thursday, April 23, 2009, in 2322 Rayburn House Office Building. The hearing focused on technologies that network operators utilize to monitor consumer usage and how those technologies intersect with consumer privacy. The hearing explored three ways to monitor consumer usage on broadband and wireless networks: deep packet inspection (DPI); new uses for digital set-top boxes; and wireless Global Positioning System (GPS) tracking.
* Ben Scott, Policy Director, Free Press
* Leslie Harris, President and CEO, Center for Democracy and Technology
* Kyle McSlarrow, President and CEO, National Cable and Telecommunications Association
* Dorothy Attwood, Chief Privacy Officer and Senior Vice President, Public Policy, AT&T Services, Inc.
* Brian R. Knapp, Chief Operating Officer, Loopt, Inc.
* Marc Rotenberg, Executive Director, The Electronic Privacy Information Center
* Richard Bennett, Publisher, BroadbandPolitics.com
It went pretty well, all in all; it’s really good to be last on a panel, and the Reps aren’t as snarky as California legislators. I’ll have more on this later.