The New York Times Takes Our Name in Vain

At least they spelled our name right. "The Price of Broadband Politics" is the title of a New York Times editorial on the lobbying taking place around broadband Internet regulation, and it sounds the usual clichéd themes about money in politics:

Comcast has spent more than $2 million on campaign donations; Verizon has given $1.2 million. The National Cable and Telecommunications Association — the industry’s collective lobbying group — has spent about $1 million more. And just in case that isn’t persuasive enough of the ills of government regulation, telephone and cable companies spent $20.6 million lobbying the government in the first quarter of the year.

Never mind that money spent on campaign contributions is entirely different from money spent on lobbying; it's the dollar signs that the Times sees, and only those on one side of the debate. So what happens if regulated industries are forbidden from lobbying? The industries that see a benefit in spinning the regulations a certain way will still lobby, and voices like that of the New York Times editorial page will be all the louder. The Times perceives its self-interest, rightly or wrongly, to depend on these regulations, and it's spending its own money to advocate for its interests on its editorial page. God forbid its opponents who don't own printing presses should do the same.

Wrong Way

The FCC’s “Third Way” rhetoric is especially interesting to ITIF because the notion that a third way was needed is something ITIF president Rob Atkinson and current Obama advisor Phil Weiser introduced in a 2006 paper. The rhetoric of the third way doesn’t align with the use of a Title II classification, however, because Section 202 has the simplistic “anti-discrimination” construction that’s telephone-specific. Packet-switched networks employ discrimination to do constructive things, so the policy issues are around the sale and transparency of discrimination as a service, not the mere fact of its existence.
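To make that point concrete, here is a minimal sketch, not anything drawn from the FCC proceeding or the Atkinson-Weiser paper, of the kind of constructive discrimination packet networks perform routinely: an application marks latency-sensitive traffic with a DiffServ code point so routers can forward it ahead of bulk transfers. It assumes a Linux host, where Python's socket.IP_TOS option is available, and the addresses and port numbers are made up for illustration.

```python
# Sketch of "constructive discrimination": marking packets with DiffServ
# code points so the network can treat voice and bulk traffic differently.
# Assumes Linux (socket.IP_TOS); addresses and ports are hypothetical.
import socket

DSCP_EF = 46   # Expedited Forwarding: low-latency traffic such as VoIP
DSCP_CS1 = 8   # Low-priority "scavenger" class for bulk background transfers

def marked_udp_socket(dscp: int) -> socket.socket:
    """Return a UDP socket whose outgoing packets carry the given DSCP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # The DSCP occupies the upper six bits of the old IP TOS byte.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return sock

voip_sock = marked_udp_socket(DSCP_EF)    # routers may queue this ahead of...
bulk_sock = marked_udp_socket(DSCP_CS1)   # ...this, without blocking either one

voip_sock.sendto(b"20ms voice frame", ("203.0.113.10", 5004))  # example address
bulk_sock.sendto(b"backup chunk", ("203.0.113.10", 9000))
```

The marking itself is routine engineering; as the paragraph above argues, the policy questions arise over how treatment like this is sold as a service and how transparently it is disclosed, not over whether it happens at all.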

The FCC is also usurping the Congressional role and defining its own mandate. See the ITIF statement:

The Federal Communications Commission, the government agency charged by Congress with regulating communications by air and wire, announced today a sweeping new program that goes far beyond its mandate. The FCC’s move is likely to lead to a lengthy and unnecessary legal battle, create needless uncertainty in the market, and detract from the FCC’s important work in implementing the recently unveiled national Broadband Plan. While the FCC is attempting to create a regulatory framework suitable for the ever changing Internet ecosystem, its proposal is tantamount to going duck hunting with a cannon.

This is a story that has become all too familiar. In the recent past, the courts have struck down punitive FCC orders against the Super Bowl “wardrobe malfunction” and, on April 6, an overwrought ruling against cable operator Comcast, which sought to preserve good Internet performance for those of its customers who use Voice over Internet Protocol (VoIP) services such as Skype and Vonage. This most recent example of FCC over-reach is a proposal that would take broadband Internet services out of their present status as lightly-regulated “information services” (Title I) and plunk them into a regulatory system devised for the monopoly telephone networks of the 1930s (Title II).

Read the whole thing.

FCC Regulates Internet, Film Here

News leaked out earlier today that the FCC has decided to pursue a Title II regulatory program for the Internet, treating it in effect as if it were a telephone network. Others have called this approach “the nuclear option,” but I think it’s less severe, more like the 9/11 attacks on New York and Washington. Telecom lawyers will prosper from it, as a move of this kind is likely to take many years of court battles to squelch. Here’s a little discussion I had with a small circle of friends at the TechCrunch pad this afternoon.

Enjoy.

The Next Big Thing

I started working on the system architecture and protocols for Wi-Fi in late 1990, when I consulted with Photonics, a little start-up in Los Gatos that had already built the first commercial wireless LAN. The initial Photonics product was a short-distance, infrared-based wire replacement for AppleTalk, and the second-generation system was Wi-Fi over infrared. Most people don’t remember that IEEE 802.11 specified a single Medium Access Control protocol and two physical layers, one infrared and the other RF. The RF PHY was obviously more successful than the IR version.

Photonics had two large customers, IBM and Toshiba, both of whom wanted to do the same thing with the wireless LAN: integrate it into touchscreen portable computers. IBM’s portable computer was called the ThinkPad, and as the name suggests, it was a tablet computer the user would interact with through a stylus; Toshiba had a similar idea. The user interface was based on gestures and handwriting recognition, a very rough science in those days, and the underlying system was Windows 3.1 or so, a keyboard-and-mouse system. These systems were challenged by limited battery life – like 2 hours between charges – and the slow processors of the day.

So we figured out how to build a wireless LAN, but didn’t quite end up with a system that could take advantage of it. It wasn’t until nearly 10 years later that CPUs, PC technology in general, and batteries developed to the point that a fully portable computer was really practical, and by then Windows had established dominance in corporate America, so the clamshell design that laptops use today won out over the more personal concept of the tablet. Clamshells weren’t all that revolutionary a concept by the ’90s, given that clamshell prototypes were built at Texas Instruments, IBM, and other places in the 1970s. Just ’cause you can build a prototype doesn’t mean you can build a product, however: for portables, the technology has to be there to get decent performance, weight, and operating life from a charge. This is very clear in the case of handhelds, and all geeks are familiar with promising devices that never went anywhere because they failed on one of these three dimensions. The Google G1 phone drained its battery too fast, for example, so it was a total flop despite being an otherwise very nice device.

One interesting development that’s been taking place over the past couple of years is the convertible laptop, a clamshell computer with a touch-sensitive screen that swivels to turn the whole thing into a tablet. The Dell Latitude XT2 is the best example of this sort of system today.

This is a nice machine, with a dual-core Intel CPU, several hours of battery life, and both pen and multitouch input. It weighs about 4 pounds with the serious battery, which is great for a laptop, and has all the nice connectivity options such as 802.11n and 3G from various sources. Nicely equipped, it will set you back about $3,000, and it’s probably worth every penny if you need the features and functions it offers, especially handwriting recognition and a full-blown Windows platform. I’ve been tempted to buy one, to the extent of researching the refurbs you can get on eBay for half the retail price, but haven’t pulled the trigger. The price, the weight, and the general concept as a Windows extension ultimately turned me off.

So I was pretty excited to learn that Apple was building a tablet computer. Apple is the one company with all the capabilities in hardware, software, user interfaces, and vision to develop a personal computing device that breaks new ground and isn’t chained to the past. Of all the personal technology companies in the world, Apple is the one with the least reverence for the traditional ways of doing things and the greatest autonomy in system design. Dell, HP, and the others are ultimately constrained by their dependence on either Microsoft or Linux for software, and it’s hard to push either of these highly successful enterprises into a totally new space very quickly.

As the specs came out, it became clear that Apple gets the limitations of the tablets of the past: the weight is only 1.5 pounds, the battery lasts all day (doing serious work, like video rendering), and the connectivity options are all in place, with dual-band 802.11n plus 3G and GPS on the model shipping later this month. The storage is all solid-state, and the OS is a slightly scaled-up version of the scaled-down Mach system Apple calls OS X. The price is pretty appealing too, at $729.00 for the 3G model with a 32GB SSD.

The announcement was well in advance of the ship date, so I did what any sensible person would do: I bought some Apple stock. The stock has now gone up enough to pay for an iPad 3G with a nice set of accessories and to cover taxes on the gain, so I put in an order at Apple’s on-line store for delivery sometime later this month. We’ll see how it goes. If I don’t like it, I’ll send it back, probably without a vanity video about how disappointed I am in Steve Jobs. And yes, I expect that the second-generation model will be faster, cheaper, 4G-enabled, multi-tasking, and camera-equipped. By then, I will have had six months’ or a year’s worth of use from the one I’ve ordered, so that’s not too bad. I bought one of the first Macs within the first 100 days as well, since I liked all that bitmappy, mousey, windowey GUI stuff, and used it for 3 years or so before trading up to a Mac SE and then to a machine that would run Windows 95 at a decent speed.

I expect that we’re about three to five years away from general-purpose tablets that don’t need hands-on, curated app stores to ensure consistency and quality in their user interfaces, so at some point the prophets of gloom will be able to buy a lightweight, fully functional, open system that does all the things they need.

Contrary to popular opinion, the cycle of innovation doesn’t always move from open devices to closed ones, of course. What we really see in the long arc of platform innovation is that closed devices like the Xerox Star and the iPhone lead the way, only to be cannibalized over the long term by open systems built on generic technology like Wintel machines.

When a new paradigm is emerging, however, the trade-offs among power, usability, and cost are too fragile to accommodate the looseness in interface design and the over-engineering needed to support unknown apps. We’ll get to open tablets eventually, but only as the hardware and software of the underlying platforms develop to the point that we can afford the overhead that openness requires. And by then, Apple will be pioneering a whole new concept in personal communication, computing, and entertainment. This is as it should be, of course.

Assertions without Fact

Eric Schmidt made an interesting point about Washington, DC think tanks recently:

“I spend so much time in Washington now because of the work that I’ve been doing, I deal with all these people who make assertions without fact,” he said. Policy people “will hand me some report that they wrote or they’ll make some assertion, and I’ll say, ‘Well, is that true?’ — and they can’t prove it.”

Perhaps that could change some day, he suggested. Technology could help.

With Google’s vast power for capturing and remembering data, Schmidt painted a picture in which technology could help quantify and verify the assertions made in policy documents. “Government is highly measurable, most of it,” he said. “We can actually see how many people got this shot or read this report or so forth. A government — a transparent government — should be able to [measure] that.”

He’s absolutely right, of course. Policy has a number of sacred cows because it’s a political process, and the last thing Congress ever does is follow up on the measures it enacts to see whether they produce the desired results. So I challenge my colleagues in the think tank business to support assertions with evidence, and to cite longitudinal studies when they exist. This is the road to good policy.

Going Mobile: Technology and Policy Issues in the Mobile Internet

I’m presenting a report on the Mobile Internet at the ITIF Global Command Center in Washington bright and early Tuesday morning:

The Internet is changing. In a few short years, Internet use will come predominately from mobile devices such as smartphones and tablets rather than traditional PCs using fixed broadband. A fully mobile broadband Internet offers exciting opportunities for innovation in networks, devices, and applications with enormous benefits for the economy and society.

The shift from a wire-centric Internet to a mobile one has profound implications for technology, policy, and applications. A new report by ITIF Research Fellow Richard Bennett explains how mobile networks are changing as they become part of the Internet, the implications mobile networking has for public policy, and how policymakers can facilitate the transition to mobile broadband.

Join us for the presentation of the report and a panel discussion among leading representatives of diverse viewpoints on Internet policy.

Date: Tuesday, March 2, 2010
Time: 9:00am – 10:30am
Location: 1101 K Street, Suite 610A, Washington, DC 20005

Presenter

Richard Bennett
Research Fellow, The Information Technology and Innovation Foundation
Respondents

Harold Feld
Legal Director, Public Knowledge

Morgan Reed
Executive Director, Association for Competitive Technology

Barbara Esbin
Senior Fellow and Director, Center for Communications and Competition Policy, PFF

Click here to RSVP.

Speaking of privacy

I went to the FTC’s second privacy workshop yesterday in Berkeley and found it a generally interesting and worthwhile event, although it did exhibit some of the familiar patterns. Privacy, like net neutrality, isn’t so much a coherent issue as a grab-bag of grievances about a number of loosely connected concerns. Privacy is even more diverse and more incoherent than NN, which is after all driven by the desire to preserve traditional features of the Internet. Privacy seeks to change Internet tradition, which has never included any meaningful privacy but has simply created a sufficiently strong illusion of anonymity to make some people think there’s privacy on the net.

So what you have in privacy are two major issues of totally different character: (1) the capture of fleeting personal information by various services; and (2) the building of databases of personal activity and the subsequent analysis, use, and sale of the information they contain. These issues have to be resolved against the background of the Internet’s defective security architecture and its tradition of people using handles instead of real names. When people feel anonymous, they misbehave, which is why there’s so much theft and generally churlish behavior on the net.

Congress is looking into these issues as well, and toward that end has held several hearings. I’m attaching testimony I delivered at one of them last spring for your enjoyment. It holds up pretty well.

Open Internet Rules

Incidentally, ITIF filed comments with the FCC in the Open Internet rule-making:

The FCC should proceed with caution in conducting its inquiry into Open Internet rules, according to comments filed by the Information Technology and Innovation Foundation today. All the evidence suggests that the Internet is thriving: network operators are investing and new applications, devices, services, and content are emerging at a dizzying rate. While there is a need to clarify the confused state of Internet regulation in the United States, there’s no compelling public interest for the FCC to adopt a stringent new regulatory framework. The Commission would do well to follow the example of fellow regulators in Canada and Europe who have recently concluded that the most sensible course for national regulators is to emphasize disclosure of terms of service and oversight of business and technical practices.

ITIF rejects the argument that the FCC lacks jurisdiction to regulate the Internet, but urges the Commission to carefully consider the evidence before enacting new regulations on Internet access services. The Internet is a complex “virtual network” designed to serve a variety of needs, and as such it does not readily lend itself to traditional telecom regulatory models. The Internet requires regulators to take a fresh approach. The first step for the Commission is to conduct a fair and probing analysis about how the Internet works today.

ITIF applauds the Commission for committing to an open process and feels that careful examination will lead to the conclusion that the Internet is fundamentally healthy.

The big issues here are that we’re not done with network engineering, nor are we done with developing the business models that make the most of network investments. So the companies that develop the insides of the Internet need to continue cooperating with the people who develop the outsides. The Verizon/Google, Comcast/BitTorrent, and AT&T/Apple partnerships are instructive.
