Has the FCC Created a Stone Too Heavy for It to Lift?

After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.

Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.

Free marketers were almost as angry: Cato denounced the order as “speech control,” Washington Policy Center said it “fundamentally changes many aspects of the infrastructure of the Internet,” and the Reason Foundation said it will lead to “quagmire after quagmire of technicalities, which as they add up will have a toll on investment, service and development.”

Republican Congressional leaders made no secret of their displeasure with the FCC’s disregard for their will: Rep. Fred Upton (R-Michigan), the incoming Commerce Committee Chairman, called it a “hostile action against innovation that can’t be allowed to stand”; Rep. Greg Walden (R-Oregon), incoming Chairman of the Subcommittee on Communications and Technology, called it a “power grab” and vowed to hold hearings to overturn it; and Sen. Kay Bailey Hutchison (R-Texas), Ranking Member of the Senate Commerce, Science, and Transportation Committee, said the order “threatens the future economic growth of the Internet.” Setting Internet policy is indeed a Congressional prerogative rather than an agency matter, so the longer-term solution must come from the Hill, and sooner would be better than later.

Contrary to this criticism and to snarky blogger claims, not everyone was upset with the FCC’s action, coming as it did after a year-long proceeding on Internet regulation meant to fulfill an Obama campaign pledge to advance net neutrality. The President himself declared the FCC action an important part of his strategy to “advance American innovation, economic growth, and job creation,” and Senator John Kerry (D-Massachusetts) applauded the FCC for reaching consensus.

Technology industry reaction ranged from positive to resigned: Information Technology Industry Council President and CEO Dean Garfield declared the measure “ensures continued innovation and investment in the Internet,” TechNet supported it, and National Cable and Telecommunications Association head Kyle McSlarrow said it could have been much worse. At the Information Technology and Innovation Foundation, we were pleased by the promises of a relatively humble set of rules, less so with the final details; we remain encouraged by the robust process the FCC intends to create for judging complaints, one that puts technical people on the front lines. In the end, the order got the support of the only majority that counts, three FCC commissioners.

Most of us who reacted favorably acknowledged the FCC’s order wasn’t exactly as we would have written it, but accepted it as a pragmatic political compromise that produces more positives than negatives. The hoped-for closing of the raucous debate will have immense benefits on its own, as simply bringing this distracting chapter in the Internet’s story to an end will allow more time for sober discussion about the directions we’d like the Internet to take in its future development. There is no shortage of policy issues that have been cramped by the tendency to view net neutrality as the one great magic wand with the power to solve all the Internet’s problems: The FCC has work to do on freeing up spectrum for mobile networking, the Universal Service Fund needs to be reformed, and the National Broadband Plan needs to be implemented.

If the FCC’s approach proves sound, it might well be exported to other countries, forming the basis of a consistent international approach to the oversight of an international network developed on consistent standards of its own. Such an outcome would have positive consequences for the Internet standards community, which has its own backlog of unfinished business to resolve, such as scalable routing, congestion management, security, and the domestication of peer-to-peer file sharing and content delivery networks. This outcome is far from inevitable; last-minute rule changes make it less likely than it might have been.

The most important thing the FCC can do in implementing its system of Internet oversight is to elevate process over proscriptive rules. The traditional approach to telecom regulation is to develop a thick sheath of regulations that govern everything from the insignias on the telephone repair person’s uniform to the colors of the insulators on RJ11 cables and apply them in top-down, command-and-control fashion. Many of those on the pro-net neutrality side are steeped in telecom tradition, and they expected such an approach from the FCC for the Internet; theirs are the angry reactions.

But the Internet isn’t a telecom network, and a foot-high stack of regulations certainly would produce the negative consequences for innovation and progress the FCC’s critics have forecast. The appropriate way to address Internet regulation is to follow the model the Internet has developed for itself, based on a small number of abstract but meaningful principles (each of which is subject to change for good reason) applied by a broad-based community of experts in a collaborative, consultative setting. Internet standards are not devised in an adversarial setting populated by angels and devils locked into mortal combat; they come from a process that values “rough consensus and running code.”

The specifics of the FCC’s order nevertheless give pause to those well-schooled in networking. A few hours before the Commission’s vote, Commissioner Copps persuaded Chairman Genachowski to reverse the Waxman Bill’s presumption regarding the premium transport services that enable Internet TV and video conferencing to enjoy the same level of quality as cable TV. Where the early drafts permitted these services as long as they were offered for sale on a non-discriminatory basis, the final rule arbitrarily presumes them harmful.

The order makes hash of the relationship of the content accelerators provided by Akamai and others to the presumptively impermissible communication accelerators that ISPs might provide one day in order to enable HD group video conferencing and similar emerging applications. The Commission majority fears that allowing network operators to offer premium transport to leading edge apps will put the squeeze on generic transport, but fails to consider that such potential downsides of well-accepted technical practices for Quality of Service can be prevented by applying a simple quota limit on the percentage of a pipe that can be sold as “premium.” This fact, which is obvious to skilled protocol engineers, goes unmentioned in the order.
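To make the engineering point concrete, here is a minimal sketch, purely illustrative and not drawn from the order, of the kind of quota check an operator could apply: premium, QoS-marked capacity is sold only while it stays under a fixed fraction of the link, so generic best-effort traffic always keeps the remainder. The 20 percent cap, the 1 Gbps link, and the class and method names are assumptions for illustration.

```python
# Illustrative sketch only: admission control that caps the share of a link
# that can be sold as "premium" transport. The 20% cap and the link capacity
# are hypothetical values, not anything specified in the FCC order.

LINK_CAPACITY_BPS = 1_000_000_000   # assume a 1 Gbps access link
PREMIUM_SHARE_CAP = 0.20            # at most 20% of the link may be premium

class PremiumQuota:
    def __init__(self, capacity_bps=LINK_CAPACITY_BPS, cap=PREMIUM_SHARE_CAP):
        self.capacity_bps = capacity_bps
        self.cap = cap
        self.reserved_bps = 0       # premium bandwidth already sold

    def can_admit(self, requested_bps):
        """True if admitting this premium flow keeps premium under the cap."""
        return self.reserved_bps + requested_bps <= self.capacity_bps * self.cap

    def admit(self, requested_bps):
        if not self.can_admit(requested_bps):
            return False            # reject: best-effort share would shrink too far
        self.reserved_bps += requested_bps
        return True

    def release(self, released_bps):
        self.reserved_bps = max(0, self.reserved_bps - released_bps)

quota = PremiumQuota()
print(quota.admit(150_000_000))     # a 150 Mbps premium video bundle -> True
print(quota.admit(100_000_000))     # would push premium past 20% -> False
```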

The poor reasoning for this rule casts doubt on the FCC’s ability to enforce it effectively without outside expertise. By rejecting Internet standards such as RFC 2475 and IEEE standards such as 802.1Q that don’t conform to the telecom activists’ nostalgic, “all packets are equal” vision of the Internet, the FCC chose to blind itself to one of the central points in Tim Wu’s “Network Neutrality, Broadband Discrimination” paper that started the fight: A neutral Internet favors content applications, as a class, over communication applications and is therefore not truly an open network. The only way to make a network neutral among all applications is to differentiate loss and delay among applications; preferably, this is done by user-controlled means. That’s not always possible, so other means are sometimes necessary as well.
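For readers unfamiliar with what “user-controlled means” can look like in practice, here is a rough sketch in the spirit of the DiffServ architecture described in RFC 2475: the application marks its own packets with a DSCP code point so the network can give interactive traffic different loss and delay treatment than bulk traffic. This is an illustration under assumptions (a Unix-like socket API, standard DSCP values), not a description of anything the order mandates or forbids, and whether networks honor the marks end to end varies.

```python
# Sketch: an application marking its own traffic with DiffServ code points
# (per the RFC 2475 architecture). Assumes a Unix-like socket API; the DSCP
# values are standard, but end-to-end treatment of the marks is not guaranteed.

import socket

DSCP_EF = 46    # Expedited Forwarding: low loss, low delay (e.g., voice/video calls)
DSCP_AF11 = 10  # Assured Forwarding class 1: ordinary elastic traffic

def marked_udp_socket(dscp):
    """Create a UDP socket whose outgoing packets carry the given DSCP mark."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # The DSCP field occupies the upper six bits of the old IPv4 TOS byte.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return sock

voice_sock = marked_udp_socket(DSCP_EF)    # interactive, delay-sensitive traffic
bulk_sock = marked_udp_socket(DSCP_AF11)   # background transfer
```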

All in all, the Commission has built a stone too heavy for it to lift all by itself. The rules have just enough flexibility that the outside technical advisory groups that will examine complaints may be able to correct the order’s errors, but to be effective, the advisors need much deeper technical knowledge than the FCC staffers who wrote the order can provide.

It’s difficult to ask the FCC – an institution with its own 75-year tradition in which it has served as the battleground for bitter disputes between monopolists and public interest warriors – to turn on a dime and embrace a new spirit of collaboration, but without such a far-reaching institutional transformation its Internet regulation project will not be successful. Those of us who work with the FCC must take a leap of faith that the Commission is committed to transforming itself from a hidebound analog regulator into a digital-age shepherd of innovation. Now that the Open Internet Report & Order has passed, we have no choice but to put our shoulders to the rock to help push it along. There’s no turning back now.

[cross-posted from the Innovation Policy Blog]

Premium Services

See my post at High Tech Forum, A Question of Priorities, on a discussion Jerry Brito of Mercatus started yesterday:

A very interesting part of Jerry’s argument is that because the Internet is a best-effort network, it must be impossible to prioritize across it. That leads to his speculation about the “dozens or hundreds of networks a packet traverses in its travels from sender to recipient.” In fact, the typical Internet packet crosses about 18 router hops between source and destination, but only three or four networks on average.
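The distinction between router hops and networks is easy to demonstrate: a traceroute shows many hops, but mapping each hop to its autonomous system (AS) collapses them into a handful of networks, each of which has a commercial relationship with its neighbors. The sketch below is illustrative only; the hop list is invented, and in practice you would build it from traceroute output plus an IP-to-ASN lookup.

```python
# Sketch: distinguishing router hops from the networks (autonomous systems)
# a packet crosses. The hop list is invented for illustration; in practice
# it would come from traceroute plus an IP-to-ASN lookup.

hop_asns = [
    7922, 7922, 7922, 7922,          # access ISP's internal routers
    3356, 3356, 3356, 3356, 3356,    # a transit/backbone network
    15169, 15169, 15169,             # the destination's network
]

def count_networks(asns):
    """Count distinct networks traversed, collapsing repeated hops within one AS."""
    networks = []
    for asn in asns:
        if not networks or networks[-1] != asn:
            networks.append(asn)
    return len(networks)

print(len(hop_asns))             # 12 router hops
print(count_networks(hop_asns))  # but only 3 networks
```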

A lot of people have the idea that the Internet is some sort of warm and fuzzy cloud of altruism in which people carry packets as a public service; Jonathan Zittrain promotes the idea that it’s like passing hot dogs down the line at a baseball game. According to this notion, when a Verizon customer in Boston sends a packet to an AT&T customer in California, a completely unrelated group of organizations carry the packet without any economic interest in it. So the prioritization scheme would need to be endorsed by all of them or it wouldn’t work.

This is wrong, actually.

That’s clear enough, isn’t it?

Wrong Way

The FCC’s “Third Way” rhetoric is especially interesting to ITIF because the notion that a third way was needed is something ITIF president Rob Atkinson and current Obama advisor Phil Weiser introduced in a 2006 paper. The rhetoric of the third way doesn’t align with the use of a Title II classification, however, because Section 202 has the simplistic “anti-discrimination” construction that’s telephone-specific. Packet-switched networks employ discrimination to do constructive things, so the policy issues are around the sale and transparency of discrimination as a service, not the mere fact of its existence.

The FCC is also usurping the Congressional role and defining its own mandate. See the ITIF statement:

The Federal Communications Commission, the government agency charged by Congress with regulating communications by air and wire, announced today a sweeping new program that goes far beyond its mandate. The FCC’s move is likely to lead to a lengthy and unnecessary legal battle, create needless uncertainty in the market, and detract from the FCC’s important work in implementing the recently unveiled national Broadband Plan. While the FCC is attempting to create a regulatory framework suitable for the ever changing Internet ecosystem, its proposal is tantamount to going duck hunting with a cannon.

This is a story that has become all too familiar. In the recent past, the courts have struck down punitive FCC orders against the Super Bowl “wardrobe malfunction” and, on April 6, an overwrought ruling against cable operator Comcast, which sought to preserve good Internet performance for those of its customers who use Voice over Internet Protocol (VoIP) services such as Skype and Vonage. This most recent example of FCC over-reach is a proposal that would take broadband Internet services out of their present status as lightly-regulated “information services” (Title I) and plunk them into a regulatory system devised for the monopoly telephone networks of the 1930s (Title II).

Read the whole thing.

Going Mobile: Technology and Policy Issues in the Mobile Internet

I’m presenting a report on the Mobile Internet at the ITIF Global Command Center in Washington bright and early Tuesday morning:

The Internet is changing. In a few short years, Internet use will come predominantly from mobile devices such as smartphones and tablets rather than traditional PCs using fixed broadband. A fully mobile broadband Internet offers exciting opportunities for innovation in networks, devices, and applications with enormous benefits for the economy and society.

The shift from a wire-centric Internet to a mobile one has profound implications for technology, policy, and applications. A new report by ITIF Research Fellow Richard Bennett explains how mobile networks are changing as they become part of the Internet, the implications mobile networking has for public policy, and how policymakers can facilitate the transition to mobile broadband.

Join us for the presentation of the report and a panel discussion among leading representatives of diverse viewpoints on Internet policy.

Date: Tuesday, March 2, 2010
Time: 9:00am- 10:30am
Location: 1101 K Street, Suite 610A, Washington, DC 20005

Presenter

Richard Bennett
Research Fellow, The Information Technology and Innovation Foundation
Respondents

Harold Feld
Legal Director, Public Knowledge

Morgan Reed
Executive Director, Association for Competitive Technology

Barbara Esbin
Senior Fellow and Director, Center for Communications and Competition Policy, PFF

Click here to RSVP.

Open Internet Rules

Incidentally, ITIF filed comments with the FCC in the Open Internet rule-making:

The FCC should proceed with caution in conducting its inquiry into Open Internet rules, according to comments filed by the Information Technology and Innovation Foundation today. All the evidence suggests that the Internet is thriving: network operators are investing and new applications, devices, services, and content are emerging at a dizzying rate. While there is a need to clarify the confused state of Internet regulation in the United States, there’s no compelling public interest for the FCC to adopt a stringent new regulatory framework. The Commission would do well to follow the example of fellow regulators in Canada and Europe who have recently concluded that the most sensible course for national regulators is to emphasize disclosure of terms of service and oversight of business and technical practices.

ITIF rejects the argument that the FCC lacks jurisdiction to regulate the Internet, but urges the Commission to carefully consider the evidence before enacting new regulations on Internet access services. The Internet is a complex “virtual network” designed to serve a variety of needs, and as such it does not readily lend itself to traditional telecom regulatory models. The Internet requires regulators to take a fresh approach. The first step for the Commission is to conduct a fair and probing analysis about how the Internet works today.

ITIF applauds the Commission for committing to an open process and feels that careful examination will lead to the conclusion that the Internet is fundamentally healthy.

The big issues here are that we’re not done with network engineering, nor are we done with developing the business models that make the most of network investments. So the companies that develop the insides of the Internet need to continue cooperating with the people who develop the outsides. The Verizon/Google, Comcast/BitTorrent, and AT&T/Apple partnerships are instructive.


Chairman Genachowski Goes to San Francisco

GigaOm sponsored a conversation with FCC Chairman Julius Genachowski at their Intergalactic Headquarters in San Francisco today.


I asked the net neutrality question toward the end, and applauded the Chairman for the way he’s transformed the FCC. Genachowski brought some of his best staffers with him, and it was nice to meet and greet and share ideas. You have to admire anyone who can make such deep changes to a rather hidebound federal agency as quickly as Genachowski and staff have done.


Blair Levin Hints at National Broadband Plan

Amy Schatz of the WSJ joined in the questioning of Blair Levin on this week’s installment of The Communicators. Here’s an interesting part of her story:

Mr. Levin also dismissed criticisms last week from public interest groups unhappy the plan may not propose some ideas for encouraging competition, such as rules that would require Internet providers to share their lines with competitors.

“I find their criticism not very productive,” Mr. Levin said Monday.

FCC officials have been considering the ideas, some of which were laid out in a FCC-commissioned report by Harvard University’s Berkman Center for Internet & Society.

The report suggests that other countries have faster, cheaper broadband because they adopted open access, line-sharing rules years ago. But FCC officials appear to have backed away from the open access idea in recent weeks.

“The Berkman (study) did a fantastic job of pointing out what’s going on around the world,” Mr. Levin said. “There are certain things where what’s going on in other countries really isn’t germane for where we go from here.”

The video is already up at the C-Span site.

Levin gets the private investment angle, and stresses the Columbia study over the Berkman study.


Speech, Democracy, and Open Internet Regulations

The video of the FCC workshop on Speech, Democratic Engagement, and the Open Internet is up on the FCC’s web site already. I can’t say there was much enlightening dialog in this event; it was pretty much the same tired old rhetoric we’ve heard for the last four years on the subject, with some exceptions.

One speaker, Bob Corn-Revere, was very good, quite clear about the potential dangers of the proposed anti-discrimination rule, and another, Glenn Reynolds, briefly mentioned reservations about it but didn’t elaborate. Another speaker denounced volume-based pricing as a racist practice, and several others displayed astonishing ignorance about the nature of information bottlenecks on the Internet by proposing different rules for sites like YouTube and search services than those that would apply to ISPs. The reality is that people don’t stream video from their home computers today because of capacity limits, so any attempt to free video streams from content-based restrictions has to start with the services people use to locate and host these streams.
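A bit of back-of-the-envelope arithmetic shows why the bottleneck sits with hosting services rather than with home connections: at the residential uplink speeds typical of the time, a home computer cannot serve even a modest audience at ordinary video bitrates. The figures below are assumed, ballpark values for illustration, not measurements.

```python
# Rough, illustrative arithmetic: why video is hosted on services rather than
# streamed from home computers. All figures are assumed ballpark values.

uplink_mbps = 1.0        # assumed typical residential upstream capacity
sd_stream_mbps = 1.5     # assumed bitrate of one standard-definition stream
viewers = 100            # a modest audience

required_mbps = sd_stream_mbps * viewers
print(required_mbps)                  # 150 Mbps of upstream capacity needed
print(required_mbps / uplink_mbps)    # roughly 150x what the home uplink provides
```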

So the workshop was pretty much a waste of time unless you just awoke from a five-year coma. Not that the FCC meant for it to be, of course; there just wasn’t much there. And to make matters worse, the written testimony is not available from the FCC, but thanks to PFF you can see Bob Corn-Revere’s statement here.


Free Speech for Me, But When it Comes to Thee I Need to Think About It

The FCC will hold an upcoming workshop on free speech and net neutrality regulations that features a really interesting array of speakers:

Michele Combs from the Christian Coalition; Glenn Reynolds, Instapundit; Jonathan Moore, Rowdy Orbit; Ruth Livier, YLSE; Garlin Gilchrist, Center for Community Change; Bob Corn-Revere, Davis Wright Tremaine; Jack Balkin, Yale Law School; and Andrew Schwartzman, Media Access Project.

“Interesting” in that most of* this group shares a common viewpoint to the effect that net neutrality regulations are necessary to protect free speech on the Internet. This is not the only viewpoint that exists on the subject, of course: many of us believe that the proposed framework of regulations is at best neutral to free expression and, under many plausible outcomes, positively harmful.

The reason is that the proposed anti-discrimination rule makes it illegal for ISPs to sell enhanced transport to publishers who require it to deliver high-bandwidth, live interactive services to people on the Internet. A broad non-discrimination rule pretty well confines the future Internet to the range of applications it supports today, low-bandwidth interaction and static content, and even those are in doubt on wireless access networks with limited bandwidth.

The Genachowski FCC has been very good so far at putting together panels with diverse viewpoints, so the Commission’s stark failure to respect viewpoint diversity in this particular case is rather surprising. It is particularly ironic that, on a panel devoted in essence to viewpoint diversity, the Commission has chosen speakers who represent unanimity rather than diversity.

UPDATE: One thing I have to say about the FCC is that it’s a very responsive agency. I sent an e-mail to the panel coordinator late Friday complaining about the panel’s lack of diversity, and despite the fact that it was sent after business hours on Friday, I got a response today in the form of a phone call from an FCC staffer. The explanation they offer is that this panel is simply meant to cover Internet openness, and there will be additional panels on the issues I’ve raised from January to March. So the issue of whether new rules are needed to protect free speech will be covered in these future panels, and doesn’t need any discussion right now, per the FCC’s viewpoint.

The scheduling is hard to fathom. Earlier this week, there was a technical panel in which academics, operators, and equipment vendors with different viewpoints on net neutrality regulations educated Commission staff on Internet organization and traffic. That panel had people who ranged all the way from strong supporters of the regulations to strong opponents, but they didn’t explore the policy space directly. The upcoming panel simply happens to be more uniform in its views, but its charter is to explain how its members benefit from Internet openness.

In the overall scheme of things, the Internet is not actually more open than many other networks with which we’re familiar, of course; the telephone network permits anyone to communicate with anyone, as did the telegraph network and as does the US mail. And you can’t do anything you want on the Internet; you have to abide by the law.

To the extent that the Internet is not open, it’s chiefly government that closes off particular avenues of expression: The obvious examples are the DMCA’s anti-piracy provisions, the US ban on kiddie porn, Germany’s ban on Nazi organizing and Scientology, and China’s ban on access to native Google searches. Each government has decided on policy grounds to close the Internet in ways that suit its interests, so if the regulations simply focus on commercial restrictions and enablements of forms of Internet-based speech and don’t restrict the power of the FCC to issue ex post and ex ante regulations, we won’t have accomplished much in this process.

The area of controversy is in between the technical issues discussed in the first workshop and the openness issues that will be discussed Tuesday. And as we will see, the advocates of net neutrality don’t understand enough about the Internet’s operation and potential to have much insight into whether and how it’s going to be regulated going forward.

*UPDATE 2: At least one of the speakers will in fact caution the Commission about diving in with the new regulations without clear evidence of harm.

What’s Cooking in Europe

I’ve been spending some time in Europe recently. A couple of weeks ago I took part in a roundtable on open spectrum at the Karlsruhe Institute of Technology in Germany that brought together one of the most interesting gatherings of people of different viewpoints and ranges of expertise ever assembled in one setting. The group included a former chief national regulator, the technologist who wrote the first IEEE 802 standard for beam-forming, a very serious grad student working with software-defined radios, and a number of legal academics and economists. Together we explored the obstacles to and the value of the wireless third pipe, including the research problems that will need to be solved to make it a reality. This is the kind of gathering that’s rarely assembled in the USA.

And more recently, I took part in a series of presentations and a general discussion about openness on the wireless Internet. One of the other presenters was one of the Pirate Party’s Members of the European Parliament, and others were the top strategic thinkers and managers from TeliaSonera and Hutchison Whampoa Europe. This event followed the passage of the EU Telecoms Package, which wisely added a disclosure rule to the European regulatory framework and just as wisely refrained from adding an anti-discrimination rule. Did you know that Hutchison offers a 3G-only phone with Skype pre-installed? They do, and it took them a lot of work to get Skype to run successfully on it.

A year ago, I would have said that Europe was trailing the US on the regulatory front, but today it certainly appears the Europeans are on a more sensible course than we are in many respects. It’s important for a regulator to be humble and not approach the task with too much enthusiasm and creativity; these are fine traits in an entrepreneur, but in the hands of government they can lead to grief. It’s best that we each remember our respective roles, in other words. It’s in the nature of technology to change, and regulations that are too prescriptive alter the natural order of things.