I gave a presentation at eComm last week on the challenges of building a mobile Internet, drawing on themes I explored in my recent ITIF report, Going Mobile. As I didn't have much time, I skipped over some of the policy content, so I'm uploading my slides for interested parties to peruse.
Open Internet Rules
Incidentally, ITIF filed comments with the FCC in the Open Internet rule-making:
The FCC should proceed with caution in conducting its inquiry into Open Internet rules, according to comments filed by the Information Technology and Innovation Foundation today. All the evidence suggests that the Internet is thriving: network operators are investing and new applications, devices, services, and content are emerging at a dizzying rate. While there is a need to clarify the confused state of Internet regulation in the United States, there’s no compelling public interest for the FCC to adopt a stringent new regulatory framework. The Commission would do well to follow the example of fellow regulators in Canada and Europe who have recently concluded that the most sensible course for national regulators is to emphasize disclosure of terms of service and oversight of business and technical practices.
ITIF rejects the argument that the FCC lacks jurisdiction to regulate the Internet, but urges the Commission to carefully consider the evidence before enacting new regulations on Internet access services. The Internet is a complex “virtual network” designed to serve a variety of needs, and as such it does not readily lend itself to traditional telecom regulatory models. The Internet requires regulators to take a fresh approach. The first step for the Commission is to conduct a fair and probing analysis of how the Internet works today.
ITIF applauds the Commission for committing to an open process and feels that careful examination will lead to the conclusion that the Internet is fundamentally healthy.
The big issues here are that we’re not done with network engineering, nor are we done with developing the business models that make the most of network investments. So the companies who develop the insides of the Internet need to continue cooperating with the people who develop the outsides. The Verizon/Google, Comcast/BitTorrent and AT&T/Apple partnerships are instructive.
What’s Cooking in Europe
I’ve been spending some time in Europe recently. A couple of weeks ago I took part in a roundtable at the Karlsruhe Institute of Technology in Germany on open spectrum that brought together one of the most interesting mixes of viewpoints and expertise ever assembled in one setting. The group included a former chief national regulator, the technologist who wrote the first IEEE 802 standard for beam-forming, a very serious grad student working with software-defined radios, as well as a number of legal academics and economists. Together we explored the value of the wireless third pipe and the obstacles in its way, including the research problems that will need to be solved to make it a reality. This is the kind of gathering that’s rarely assembled in the USA.
And more recently, I took part in a series of presentations and a general discussion about openness on the wireless Internet. One of the other presenters was one of the Pirate Party’s Members of the European Parliament, and others were top strategic thinkers and managers from TeliaSonera and Hutchison Whampoa Europe. This event followed the passage of the EU Telecoms Package, which wisely added a disclosure rule to European law and just as wisely refrained from adding an anti-discrimination rule. Did you know that Hutchison offers a 3G-only phone with Skype pre-installed? They do, and it took them a lot of work to get Skype to run successfully on it.
A year ago, I would have said that Europe was trailing the US on the regulatory front, but today it certainly appears they’re on a more sensible course than we are in many respects. It’s important for a regulator to be humble and not approach his task with too much enthusiasm and creativity. These are fine traits in an entrepreneur, but in the hands of government can lead to grief. It’s best that we each remember our respective roles, in other words. It’s in the nature of technology to change, and regulations that are too prescriptive alter the natural order of things.
Net Neutrality Regulations Coming
In his long-anticipated statement on net neutrality rulemaking today, FCC Chairman Genachowski claimed that the Internet architecture is both unbiased and future-proof. However, as ITIF notes in a forthcoming report, “Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate”, the Internet’s architecture doesn’t make it future-proof; the process of experimentation and continual improvement does, and rulemaking can seriously jeopardize Internet flexibility unless it’s undertaken with great care. In addition, it’s important to note that the Internet has always preferred some applications over others; it favors content over communication, for example. Network management is necessary as a means to overcome the Internet’s structural bias, so strict rules limiting network management to the mitigation of spam, malware, and attacks are not good enough. Carriers must be empowered to enable communications applications to compete equitably with content applications; only the carriers can provide fair access to diverse applications and users.
The approach to Internet regulation that focuses exclusively on the rights of consumers and the responsibilities of carriers overlooks the fact that the Internet invests substantial network control at the intelligent edge; the Internet gives each of us the power to be a producer as well as a consumer, and with that power comes responsibility. We can innovate without permission, but we all have to behave responsibly. It goes without saying that open access networks are desirable, so the real test of the FCC’s rulemaking will come from its assessment of both user behavior and operator management practices. We have every confidence that the Commission will undertake a serious, rigorous, and fact-based rulemaking. The Internet enables innovation to the extent that carriers provide robust and reliable transport services to applications; if this capability is preserved and enhanced by a sensible network management framework, innovation will win.
What’s happening in Iran?
BusinessWeek isn’t buying the story that Twitter is the essential organizing tool for the protests in Iran over suspicious election results:
“I think the idea of a Twitter revolution is very suspect,” says Gaurav Mishra, co-founder of 20:20 WebTech, a company that analyzes the effects of social media. “The amount of people who use these tools in Iran is very small and could not support protests that size.”
Their assessment is that people are organizing the old-fashioned way, by word-of-mouth and SMS. Ancient technology, that SMS. But it is a great story, either way.
What slows down your Wi-Fi?
The Register stumbled upon an eye-opening report commissioned by the UK telecom regulator, Ofcom, on sources of Wi-Fi interference in the UK:
What Mass discovered (pdf) is that while Wi-Fi users blame nearby networks for slowing down their connectivity, in reality the problem is people watching retransmitted TV in the bedroom while listening to their offspring sleeping, and there’s not a lot the regulator can do about it.
Outside central London that is: in the middle of The Smoke there really are too many networks, with resends, beacons and housekeeping filling 90 per cent of the data frames sent over Wi-Fi. This leaves only 10 per cent for users’ data. In fact, the study found that operating overheads for wireless Ethernet were much higher than anticipated, except in Bournemouth for some reason: down on the south coast 44 per cent of frames contain user data.
When 90% of the frames are overhead, the technology itself has a problem, and in this case it’s largely the high backward-compatibility burden in Wi-Fi. Older versions of the protocol weren’t designed for obsolescence, so newer systems have to take expensive steps to ensure that older systems can see them; otherwise collisions happen, and that’s not good for anybody. Licensed spectrum can deal with the obsolescence problem by replacing older equipment; open spectrum has to bear the costs of compatibility forever. So this is one more example of the fact that “open” is not always better.
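For a rough sense of what those frame shares mean in practice, here’s a back-of-the-envelope sketch in Python. The 10% and 44% user-data shares come from the study quoted above; the 54 Mbps nominal rate is an illustrative assumption of mine, and treating frame counts as a proxy for airtime is a simplification, since frames vary in size.

    # Crude estimate of usable Wi-Fi throughput from the frame shares in
    # the Ofcom/Mass study. The nominal rate is an assumption, and frame
    # counts stand in for airtime, which overstates precision a bit.

    def effective_mbps(nominal_mbps, user_frame_share):
        """Usable throughput if only user_frame_share of frames carry data."""
        return nominal_mbps * user_frame_share

    for place, share in [("central London", 0.10), ("Bournemouth", 0.44)]:
        print(f"{place}: {effective_mbps(54.0, share):.1f} Mbps of a nominal 54.0")

On those numbers, a 54 Mbps link delivers about 5.4 Mbps of user data in central London and about 23.8 Mbps in Bournemouth, before even counting retransmissions within the user-data frames themselves.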
What Policy Framework Will Further Enable Innovation on the Mobile Net?
Here’s the video of the panel I was on at the Congressional Internet Caucus Advisory Committee’s “State of the Mobile Net” conference in DC last Thursday. This was the closing panel of the conference, where all the loose ends were tied together. For those who don’t live and breathe Washington politics, I should do what moderator Blair Levin didn’t do and introduce the panel. Levin was the head of the TIGR task force for the Obama transition, the master group for the review of the regulatory agencies and the administration’s use of technology. Kevin Werbach is a professor at the Wharton School, and took part in the FCC review for the transition along with Susan Crawford; he runs the Supernova conference. Larry Irving was part of the review of NTIA for the transition, and is a former Assistant Secretary of Commerce. Ben Scott is the policy guy at Free Press, and Alex Hoehn-Saric is legal counsel to the Senate Committee on Commerce, Science, and Transportation.
Regulatory policy needs to be technically grounded, so I emphasized the tech side of things.
eComm Spectrum 2.0 Panel Video
Here’s the licensing panel from eComm live and in color. Seeing yourself on TV is weird; my immediate reaction is to fast for about a month.
On a related note, see Saul Hansell’s musings on spectrum.
The issue I wanted to raise at eComm and couldn’t, due to lack of time and the meandering speculations about collision-free networks, is spectrum sharing. Two-way communications systems all need a shared pipe at some level, and the means by which access to the pipe is mediated distinguishes one system from another. So far, the debate on white spaces in particular and open spectrum in general is about coding and power levels, the easy parts of the problem. The hard part is how the system decides which of a number of competing transmitters can access the pipe at any given time. The fact that speculative coding systems might permit multiple simultaneous connections on the same frequency in the same space/time moment doesn’t make this question go away, since they only help point-to-point communications. Internet access is inherently a point-to-multipoint problem, as these systems all aggregate wireless traffic in order to move it to the fiber backbone.
The advantage of licensing is that it provides the spectrum with an authorized bandwidth manager who can mediate among the desires of competing users and ensure fairness per dollar (or some similar policy). The idea that we can simply dispense with a bandwidth manager in a wide-area network access system remains to be proved.
So I would submit that one of the principles that regulators need to consider when deciding between licensed and unlicensed uses is the efficiency of access. The notion that efficiency can be discarded in favor of ever-fatter pipes is obviously problematic in relation to wireless systems; they’re not making more spectrum.
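To make the efficiency point concrete, here’s a toy simulation of unmanaged access, modeled as slotted ALOHA: stations transmit at random, and a slot carries data only when exactly one station transmits. The station count and transmit probability are illustrative assumptions of mine, not figures from the panel.

    # Toy model of a shared pipe with no bandwidth manager (slotted ALOHA).
    # A slot is useful only when exactly one station transmits; two or more
    # collide, and zero wastes the slot. A managed system can assign slots.

    import random

    def contention_efficiency(n_stations, p_transmit, n_slots=100_000):
        """Fraction of slots carrying user data under random access."""
        useful = 0
        for _ in range(n_slots):
            senders = sum(random.random() < p_transmit for _ in range(n_stations))
            if senders == 1:
                useful += 1
        return useful / n_slots

    n = 20
    print(f"unmanaged: {contention_efficiency(n, 1 / n):.0%} of slots useful")
    print("managed:   ~100% of slots useful (the manager assigns each slot)")

With twenty stations each transmitting in a slot with probability 1/20, only about 37% of slots carry data, near the theoretical ceiling of 1/e for this scheme; a bandwidth manager that schedules transmissions can use nearly every slot.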
eComm Spectrum 2.0 Panel Transcript
Here’s a rough transcript of the Spectrum 2.0 – Exploring the Roots of Wireless Spectrum Controversy panel discussion from eComm 2009. Enjoy.
SxSW Wireless Meltdown
There’s nothing like a horde of iPhone users to kill access to AT&T’s wireless network: my AT&T BlackBerry Bold was nearly unusable at eComm because of the large number of iPhones in the room, and the situation at SxSW is roughly the same. The silver lining in Austin this week is that the show’s Wi-Fi network is working well. Part of the trick is the deployment of Cisco 1252 access points with 5 GHz support. Unlike the Bold, iPhones can’t operate on 5 GHz channels, so all that spectrum is free for the taking by Bolds and laptops that can operate on it. In a concession to MacBook users, who aren’t allowed to select a Wi-Fi band, the show net has different ESSIDs for 2.4 and 5 GHz operation; a toy sketch of why this split helps appears after the AUP below. It also has a load of reasonable restrictions:
Acceptable Use Policy
The Wireless network at the Convention Center is designed for blogging, e-mail, surfing and other general low bandwidth applications. It is not intended for streaming of any sort.
a) Peer-to-peer traffic such as BitTorrent and the like uses a disproportionate amount of bandwidth and is unfair to other attendees. Please refrain from non-conference-related peer-to-peer activities to minimize this effect.
b) Please be considerate and share the bandwidth with your fellow attendees. Downloading all of the videos from a video sharing service, for example, is being a hog.
c) Please do not actively scan the network. Many of the tools for scanning an address range are too efficient at using as much bandwidth as possible; this will likely be noticed.
Despite this AUP, I can confidently predict that speakers will demand unrestricted use of wireless spectrum.
Slight disconnect, eh?
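Here’s the toy sketch promised above of why splitting clients across bands helped. All the numbers (client counts, usable capacity per band) are invented for illustration; the point is only that moving dual-band clients to 5 GHz takes them out of contention with the 2.4 GHz-only iPhones.

    # Invented numbers illustrating the SxSW dual-band setup: compare
    # everyone on 2.4 GHz against steering dual-band clients to 5 GHz.

    def per_client_mbps(clients, capacity_mbps):
        """Even split of one band's usable capacity across its clients."""
        return capacity_mbps / clients

    iphones, dual_band = 300, 100   # 2.4 GHz-only vs dual-band clients (assumed)
    cap_24, cap_5 = 20.0, 40.0      # usable capacity per band in Mbps (assumed)

    print(f"all on 2.4 GHz: {per_client_mbps(iphones + dual_band, cap_24):.3f} Mbps each")
    print(f"split by band : {per_client_mbps(iphones, cap_24):.3f} Mbps each on 2.4 GHz, "
          f"{per_client_mbps(dual_band, cap_5):.3f} Mbps each on 5 GHz")

Even the 2.4 GHz-only clients win from the split, since a quarter of the load leaves their band.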
UPDATE: Om of GigaOm reports that AT&T is addressing the problems in Austin by switching on the 850 MHz band in their downtown Austin towers:
AT&T’s network choked and suddenly everyone was up in arms. And then Ma Bell got in touch with Stacey, who reported that AT&T was boosting its network capacity.
How did they do this? By switching on the 850 MHz band on eight cell towers to blanket the downtown Austin area, in addition to the existing capacity on the 1900 MHz band. AT&T is going to make the same arrangements in San Francisco and New York by the end of 2009, AT&T Mobility CEO Ralph de la Vega told Engadget.
Not all AT&T devices support the 850 MHz band, but the Bold does. The larger takeaway, however, is that all wireless systems become victims of their own success. The more people use them, the worse they get. C’est la vie.