Spectrum 2.0 panel from eComm

Courtesy of James Duncan Davidson, here’s a snap from the Spectrum 2.0 panel at eComm09.

Maura Corbett, Rick Whitt, Peter Ecclesine, Darrin Mylet, and Richard Bennett at eComm

The general discussion covered the lessons learned from light licensing of wireless spectrum in the US, the success of Wi-Fi, the failure of UWB, and what we can realistically hope to gain from the White Spaces licensing regime. As a person with a foot in both camps – technical and regulatory – I found it an interesting study in the contrasting ways engineers and policy people deal with these issues. In general, hard-core RF engineer Peter Ecclesine and I were the most pessimistic about White Space futures, while the policy folks still see the FCC’s Report and Order as a victory.

In lobbying, you frequently run into circumstances where the bill you’re trying to pass becomes so heavily encumbered with amendments that it’s not worth passing. Rather than get your policy vehicle adopted in a crippled form, it’s better in such circumstances to take it off the table and work with the decision-makers to revive it in a future session without the shackles. While this is a judgment call – sometimes you go ahead and take the victory hoping to fix it later – it’s dangerous to pass crippled bills in a tit-for-tat system because you’re conceding a win in the next round to the other side.

I suggested that the FCC’s order was so badly flawed that the best thing for White Space Liberation would be for the court to void the order and the FCC to start over. This message wasn’t well received by Rick Whitt, but I had the feeling Peter was on board with it.

The problem with the White Spaces is that the FCC couldn’t make up its mind whether these bands are best used for home networking or for a third (or is it fourth or fifth?) pipe. The power limits (40 milliwatts to 1 watt) doom them to home networking use only, which simply leads to more fragmentation in the home networking market and no additional WAN pipes. That’s not the outcome the champions of open networks wanted, but it’s what they got.
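To see why the power limits matter, note that free-space range scales roughly with the square root of transmit power, so the permitted levels buy very little reach. Here’s a minimal sketch of relative ranges; the 50 kW figure is an assumed, typical full-power TV transmitter, included only for contrast:

```python
import math

# Free-space range scales with sqrt(power), so compare ranges relative
# to the 40 mW floor. Purely illustrative: no absolute range is claimed.

def relative_range(p_watts, ref_watts=0.04):
    return math.sqrt(p_watts / ref_watts)

for label, p in (("40 mW device", 0.04),
                 ("1 W device", 1.0),
                 ("50 kW TV transmitter", 50_000.0)):   # assumed figure
    print(f"{label:>22}: {relative_range(p):8.0f}x the 40 mW range")
```

A 25-fold power increase yields only a five-fold range increase, and even the 1 watt ceiling delivers a tiny fraction of a broadcast transmitter’s reach. That’s home networking territory, not a new pipe.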

eComm, incidentally, is a terrific conference. The focus is very much on the applications people are developing for mobile phones, and it’s essential for people like me who build networks to see what people want to do with them, especially the things they can’t do very well today. Lee Dryburgh did a fantastic job of organization and selecting speakers, and is to be congratulated for putting on such a stellar meeting of the minds.

Storm not winning any raves

Om Malik isn’t impressed by the BlackBerry Storm and neither am I:

The Storm reminds me of the St. Louis Cardinals phenom Rich Ankiel, who was an awesome pitcher till he flamed out, got hurt and came back as an outfielder and a hitter. He scored a lot of runs last season, but he isn’t a center fielder like Mickey Mantle. He is just another player. Storm will be that — just another touch-screen smartphone.

He points out that Blackberry excels at text, which is merely adequate on a touch screen. The omission of Wi-Fi makes the Storm unacceptable for me, so I reluctantly got a G1 to replace my lost Blackberry Curve, and I’m not exactly Google’s biggest fan (see next post).

Ankiel’s OPS, .843, ranks 78th in the National League, BTW, which is the definition of mediocre.

The Trouble with White Spaces

Like several other engineers, I’m disturbed by the white spaces debate. The White Space Coalition, and its para-technical boosters, argue something like this: “The NAB is a tiger, therefore the White Spaces must be unlicensed.” And they go on to offer the comparison with Wi-Fi and Bluetooth, arguing as Tom Evslin does on CircleID today that “If we got a lot of innovation from just a little unlicensed spectrum, it’s reasonable to assume that we’ll get a lot more innovation if there’s a lot more [unlicensed] spectrum available.”

According to this argument, Wi-Fi has been an unqualified success in every dimension. People who make this argument haven’t worked with Wi-Fi or Bluetooth systems in a serious way, or they would be aware that there are in fact problems, serious problems, with Wi-Fi deployments.

For one thing, Wi-Fi systems are affected by sources of interference they can’t detect directly, such as FM baby monitors, cordless phones, and wireless security cameras. Running Wi-Fi on the same channel as one of these devices causes extremely high error rates. If 2.4 and 5.x GHz devices were required to emit a universally detectable frame preamble, much of this nonsense could be avoided.
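The mechanics are easy to sketch: an 802.11 radio defers reliably when it can decode a Wi-Fi preamble, but for everything else it falls back to a far less sensitive energy-detect threshold. The threshold values and function below are illustrative, not taken from any particular chipset:

```python
# Representative clear-channel-assessment thresholds (assumed values).
PREAMBLE_DEFER_DBM = -82   # decodable Wi-Fi preamble: very sensitive
ENERGY_DEFER_DBM = -62     # raw energy from anything else: 20 dB worse

def will_defer(rx_dbm, has_wifi_preamble):
    """Would a Wi-Fi station hold off transmitting on this signal?"""
    threshold = PREAMBLE_DEFER_DBM if has_wifi_preamble else ENERGY_DEFER_DBM
    return rx_dbm >= threshold

# A baby monitor heard at -70 dBm is strong enough to corrupt frames
# but too weak to trip energy detect, so the station transmits anyway.
print(will_defer(-70, has_wifi_preamble=True))    # True: defers
print(will_defer(-70, has_wifi_preamble=False))   # False: collision
```

A universal preamble would move non-Wi-Fi emitters from the second case to the first.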

And for another, we have the problem of newer Wi-Fi devices producing frames that aren’t detectable by older gear (especially 802.11 and 802.11b radios) unless they’re preceded by a protection frame, which reduces throughput substantially. If we could declare anything older than 802.11a and .11g illegal, we could use the spectrum we have much more efficiently.
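Here’s a back-of-the-envelope estimate of what that protection costs; the timing constants are rounded approximations of 802.11g behavior, not exact figures from the standard:

```python
# Airtime for one 1500-byte 802.11g frame, with and without the
# CTS-to-self protection needed when legacy 802.11/11b radios are near.
PAYLOAD_US = 1500 * 8 / 54e6 * 1e6      # 1500 bytes at 54 Mb/s (~222 us)
OFDM_PHY_US = 20                        # OFDM preamble + header (approx.)
SIFS_US = 10
ACK_US = 44                             # ACK at a basic rate (approx.)
CTS_SELF_US = 192 + 14 * 8              # DSSS preamble + CTS at 1 Mb/s

def exchange_us(protected):
    t = OFDM_PHY_US + PAYLOAD_US + SIFS_US + ACK_US
    if protected:
        t += CTS_SELF_US + SIFS_US      # sent slowly so 11b gear decodes it
    return t

loss = 1 - exchange_us(False) / exchange_us(True)
print(f"airtime lost to protection: {loss:.0%}")   # roughly half
```

Roughly half the channel time goes to babysitting radios that shipped a decade ago.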

And for a third, we don’t have enough adjacent-channel spectrum to use the newest version of Wi-Fi, 40 MHz 802.11n, effectively in the 2.4 GHz band. Speed inevitably depends on channel width, and the white spaces offer little dribs and drabs of spectrum scattered all over the place, much of it in non-adjacent frequencies.
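The channel arithmetic fits in a couple of lines; the band width and the in-practice counts are approximations:

```python
# The 2.4 GHz ISM band is about 83.5 MHz wide (approximate figure).
BAND_MHZ = 83.5

def edge_to_edge(width_mhz):
    return int(BAND_MHZ // width_mhz)

print(edge_to_edge(20), "x 20 MHz channels")  # 4 edge-to-edge, 3 in practice
print(edge_to_edge(40), "x 40 MHz channels")  # 2 edge-to-edge, 1 in practice
```

One 40 MHz 802.11n channel covers two of the three usable 20 MHz channels, which is why running wide channels at 2.4 GHz annoys the neighbors.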

But most importantly, Wi-Fi is the victim of its own success. As more people use Wi-Fi, we have to share the limited number of channels across more access points, and they are not required to share channel space with each other in a particularly efficient way. We can certainly expect a lot of collisions, and therefore packet loss, from any uncoordinated channel access scheme, like Wi-Fi’s, operating on a large geographic scale. This is the old “tragedy of the commons” scenario.
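A toy model shows how fast uncoordinated sharing degrades. Assume each station within earshot transmits in a given slot with probability 0.1 (an arbitrary figure); this is slotted-ALOHA-style arithmetic, not a real CSMA/CA simulation:

```python
# Chance that a given transmission attempt sees a clear channel when
# n stations contend independently with per-slot probability p.
def clean_probability(n_stations, p=0.1):
    return (1 - p) ** (n_stations - 1)

for n in (2, 5, 10, 25, 50):
    print(f"{n:>3} stations: {clean_probability(n):5.1%} collision-free")
```

Two stations coexist nicely; fifty stations on one channel get almost nothing through, which is the commons tragedy in miniature.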

The problem of deploying wireless broadband is mainly a tradeoff among propagation, population, and bandwidth. The larger the population your signal covers, the greater the bandwidth needs to be in order to provide good performance. The nice thing about Wi-Fi is its limited propagation, because that permits extensive channel re-use without collisions. If the Wi-Fi signal in your neighbor’s house propagated twice as far, it would have four times as many chances to collide with other users. So high power and great propagation aren’t an unmitigated good.
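The four-times figure is just inverse-square geometry: interference radius grows with the square root of power, and the area you pollute (hence the number of neighbors you can collide with, at uniform density) grows with the square of the radius. A minimal sketch, assuming a 100 mW transmitter reaches 100 m:

```python
import math

REF_POWER_MW, REF_RADIUS_M = 100, 100   # assumed reference point

def interference_radius_m(power_mw):
    """Free-space radius at which the signal falls to a fixed floor."""
    return REF_RADIUS_M * math.sqrt(power_mw / REF_POWER_MW)

for p in (100, 400):
    r = interference_radius_m(p)
    print(f"{p} mW: radius {r:.0f} m, area {math.pi * r**2 / 1e4:.1f} ha")
# 4x the power -> 2x the radius -> 4x the area and 4x the collision domain
```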

The advantage of licensing is that the license holder can apply authoritarian rules that ensure the spectrum is used efficiently. The disadvantage is that the license holder can over-charge for the use of such tightly-managed spectrum, and needs to in order to pay off the cost of his license.

The FCC needs to move into the 21st century and develop some digital rules for the use of unlicensed or lightly-licensed spectrum. The experiment I want to see concerns the development of these modern rules. We don’t need another Wi-Fi; we already know how that worked out.

So let’s not squander the White Spaces opportunity with another knee-jerk response to the spectre of capitalism. I fully believe that people like Evslin, the White Space Coalition, and Susan Crawford are sincere in their belief that unlicensed White Spaces would be a boon to democracy; it’s just that their technical grasp of the subject matter is insufficient for their beliefs to amount to serious policy.

Google open-sources Android

I lost my Blackberry Curve somewhere in England last week, so I ordered an HTC G1 from T-Mobile as a replacement. The Curve doesn’t do 3G, so it’s an obsolete product at this point. And as I’m already a T-Mobile customer (I chose them for the Wi-Fi capability of their Curves), the path of least resistance to 3G goes through the G1. Just yesterday I was explaining to somebody that Android wasn’t really open source, but Google was apparently listening and decided to make a liar of me by open-sourcing Android:

With the availability of Android to the open-source community, consumers will soon start to see more applications like location-based travel tools, games and social networking offerings being made available to them directly; cheaper and faster phones at lower costs; and a better mobile web experience through 3G networks with richer screens. The easy access to the mobile platform will not only allow handset makers to download the code, but to build devices around it. Those not looking to build a device from scratch will be able to take the code and modify it to give their devices more of a unique flavor.

“Now OEMs and ODMs who are interested in building Android-based handsets can do so without our involvement,” Rich Miner, Google’s group manager for mobile platforms, told us earlier today. Some of these equipment makers are going to expand the role of Android beyond handsets.

This is good news, of course. I haven’t enjoyed the fact that T-Mobile sat between me and RIM for Blackberry software upgrades. The first add-on app I’d like to see for the G1 is something that allows tethering a laptop to 3G via Bluetooth. I could tether the Curve, but as it only supports EDGE, that wasn’t incredibly useful.

In a more perfect world, I’d prefer the Treo Pro over the G1, but it doesn’t work on T-Mobile’s crazy array of AWS and normal frequencies, and is also not subsidized, so the G1 is a better deal. The Blackberry Storm is probably a better overall device than the G1, but it’s exclusive to Verizon so I would have had to pay a $200 early termination fee to get it. These phones are mainly for fun, so paying a fee to leave a carrier I basically like makes it all too serious.

Ultra-cool Computers

My next personal computer is going to be an ultra-portable tablet. I’ve never bought a laptop of my own, since my employers tend to shower me with them, and they’ve had so many drawbacks I couldn’t see any point in shelling out for one of my own. But recent research shows that we’re officially in the Dynabook Era, with great gear like the Dell Latitude XT Tablet, the Lenovo X200 Tablet, the Asus R1E, the Fujitsu LifeBook T5010, and the recently-announced HP EliteBook 2730p.

What these babies have in common is light weight, sharp but small screens, long battery life, a wealth of connectivity features, and other goodies like webcams and mics, GPS locators, touch-sensitive displays, and handwriting recognition. They’re more like smartphones than traditional PCs, but without all the annoying limitations that make Blackberries better in the demo than in real life. Unlike pure slate computers that lack keyboards, they have swivel-mounted screens that can be twisted and folded to cover the laptop’s clamshell base, so you have a touch-sensitive display for when you need to jot notes or draw, and a regular keyboard for high-volume typing.

Each excels in some areas. The Dell seems to have the clearest screen and the best handwriting recognition, since it uses a capacitive touchscreen. It draws a bit more power, since capacitive touch keeps an electric field active across the screen, where the more common pen digitizer relies on a magnetic stylus to alert the sensor that something’s happening. The stylus-activated system rules out using your finger as a pointing device, which is also unfortunate, and has a thicker overlay on the screen than the Dell. The iPhone uses a capacitive touch system.

Dell also has a nice graphics chip with some dedicated memory, which significantly outperforms the shared-memory systems that are commonplace. But Dell’s CPU is at the low end of the scale, and the 1.2 GHz Intel U7600, an ultra-low-voltage 65nm dual-core CPU, is as good as it gets. This is apparently a soldered-in part that can’t be upgraded. Dell is also super-expensive.

The Lenovo is too new for much in the way of evaluation, but it has very nice specs and a great pedigree. While the XT Tablet is Dell’s first convertible, the X200 is Lenovo’s third or so, and the details show. If they would only stop white-listing their own wireless cards in the BIOS, they’d be at the top of my list. The X200 Tablet uses a more substantial, higher-power Intel CPU, around 1.8 GHz, which makes it considerably faster than* the Dell. They also use Intel’s Centrino graphics, and suffer a bit for it, but that’s a classic engineering tradeoff. Lenovo has an amazing array of connectivity choices, including the UWB system AKA Wireless USB. With an internal wireless WAN card with GPS, internal Wi-Fi (including 3×3 11n), Bluetooth, and Wireless USB, this system has five kinds of wireless without a visible antenna, which is awfully sharp.

The Fujitsu and Asus convertibles have larger screens – 13.3 in. vs. 12.1 for the Dell and the Lenovo – and add a pound or so of weight. Asus is concentrating on their netbooks these days, and doesn’t seem to be serious about keeping up to date, while the Fujitsu makes some strange choices with noisy fans and heat.

To be avoided are the older HPs using the AMD chipset. AMD can’t keep up with Intel on power efficiency, so convertible systems that use their parts are only portable between one wall socket and another.

None of these little Dynabooks has made me swipe a card yet, but the collections of technology they represent say a lot about the future of networking. With all that wireless, the obligatory Gigabit Ethernet looks like an afterthought.

Which brings me to my point, gentle readers. What’s your experience with wireless WANs in terms of service – among AT&T, Sprint, and Verizon, who’s got it going on? I get my cell phone service from friendly old T-Mobile, but they’re not a player in the 3G world. I like Verizon’s tiered pricing, as I doubt I’ll use 5 GB/mo of random wireless, as close as I tend to be to Wi-Fi hotspots, but it seems like a much nicer fall-back than using my Blackberry Curve as a modem.

For a nice demonstration of the XT’s capacitive touch screen in comparison to the more primitive Lenovo, see Gotta Be Mobile.

*Edited: the X200 non-tablet has a faster processor than the X200 Tablet. The tablet hardware draws power from the system, and Lenovo had to de-tune the CPU to provide it.

World’s Largest 802.11n Network

Trapeze Networks has finally announced its deal with the University of Minnesota to build the world’s largest 802.11n network:

PLEASANTON, Calif., March 10, 2008 – Trapeze Networks®, the award-winning provider of Smart Mobile™ wireless solutions, today announced that the University of Minnesota plans to deploy its Smart Mobile™ 802.11n wireless network product suite campus-wide, marking the largest ever 802.11n deployment to date. Beginning in May and continuing over the next five years, approximately 9,500 access points (APs) will be deployed to serve more than 80,000 people across the university’s two campuses. Students, faculty and staff will have fast and secure wireless access wherever and whenever they want it.

This network features a lot of the code I wrote for Trapeze for 802.11n, 802.11e, and bandwidth management, so I hope Trapeze hasn’t screwed it up too badly in the weeks since I left that company for my current gig.

Roaming the afterlife

A dead Malaysian ran up a $218 trillion cell phone bill and people are mystified:

A Malaysian man who paid off a $23 wireless bill and disconnected his late father’s cell phone back in January has been stiffed for subsequent charges on the closed account, MSNBC has reported. Telekom Malaysia sent Yahaya Wahab a bill for 806,400,000,000,000.01 ringgit, or about $218 trillion, for charges to the account, along with a demand from the company’s debt collection agency that he settle the alleged debt within 10 days, or get a lawyer.

It’s actually very simple. Dead people can communicate with the living through the simple mechanism of Electronic Voice Phenomena, documented in the movie White Noise, by leaving recorded messages. They’ve apparently figured out that cell phones are way cooler than voice recorders, and they’ve all been having a ball calling living friends and relatives and shooting the breeze. As these calls come from an area with exceptionally high roaming charges, the bill seems high by living human standards. Which is just another example of what a limited perspective we have on stuff.

White Space Faux Pas

The great white space coalition’s submissions to the FCC are a big bust:

A group of companies including Microsoft and Google had hoped to convince regulators that some new devices could carry high-speed Internet connections over television airwaves without interfering with broadcast signals.

But it didn’t work as planned, according to a report released this week by the Federal Communications Commission. After four months of testing, the agency concluded that the devices either interfered with TV signals or could not detect them in order to skirt them.

Why am I not surprised?

See more discussion by free marketeer Jerry Brito at TLF and by consumer warrior Harold Feld at Public Knowledge.

John Kneuer on Spectrum Policy and Network Neutrality

Doc Searls asked John Kneuer an interesting question at SuperNova:

What were the rate terms and conditions for WiFi, and what would have happened if those channels were auctioned?

and then David Isenberg chimed in:

Wi-Fi isn’t, wasn’t auctioned. It isn’t owned by any company, any carrier, yet I think that everybody in this room, most people in this room, and perhaps yourself, would agree that Wi-Fi is the most innovative section of the spectrum. So there’s no market. Why isn’t that the model instead of auctions?

This was on account of Kneuer talking up the auction of spectrum in the 700 MHz range. People in the audience cheered Doc for asking the question. What does that say about the audience?

On its face, it’s not a sensible question. The apparent belief among the SuperNova crowd is that WiFi is more or less equivalent to high-power 700 MHz, so it can be handled by the regulators the same way. What they’re missing, of course, is that unlicensed WiFi doesn’t need to be auctioned because its low power and large channel count (in the 11a range) permit multiple parties to use it without interfering with each other. And these characteristics limit propagation to 300-1000 feet for most applications.

Would anybody build a region-wide network with towers on every block? Clearly not, so WiFi is a non-starter when it comes to providing competition to wireline broadband providers. If 700 MHz were regulated like WiFi, with low power and no license to operate, it would also be a non-starter in the last mile broadband business.
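To put a rough number on “towers on every block”: cover an assumed 10 km by 10 km city with APs whose usable radius is an assumed 250 m, and count the cells. A crude packing estimate, nothing more:

```python
import math

AREA_KM2 = 10 * 10        # assumed city: 10 km x 10 km
RADIUS_KM = 0.25          # assumed usable Wi-Fi-class radius
aps = AREA_KM2 / (math.pi * RADIUS_KM ** 2)
print(f"~{aps:.0f} access points")   # ~509, before any overlap margin
```

Hundreds of sites for one small city, each needing backhaul and power, is why nobody runs a metro network at Wi-Fi power levels.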

Humorously, the folks who argue for unlicensed wireless also complain about the lack of competition in broadband.

If they had their way, they wouldn’t have their way.

In related news, the FTC says net neutrality is not necessary:

The Federal Trade Commission today dealt a serious blow to “Net Neutrality” proponents as it issued a report dismissive of claims that the government needs to get involved in preserving the fairness of networks in the United States.

The half-life of political Kool-Aid is apparently about twelve months.

Internet over TV, maybe

There seems to be a huge disconnect on the nature of the magic box proposed to the FCC by the Usual Suspects to reclaim whitespace abandoned by analog TV:

A coalition of big technology companies wants to bring high-speed Internet access to consumers in a new way: over television airwaves. Key to the project is whether a device scheduled to be delivered to federal labs today lives up to its promise.

The coalition, which includes Microsoft and Google, wants regulators to allow idle TV channels, known as white space, to be used to beam the Internet into homes and offices. But the Federal Communications Commission first must be convinced that such traffic would not bleed outside its designated channels and interfere with existing broadcasts.

The six partners — Microsoft, Google, Dell, Hewlett-Packard, Intel and Philips — say they can meet that challenge. Today, they plan to give FCC officials a prototype device, built by Microsoft, that will undergo months of testing.

Is it a low-power, in-home system comparable to WiFi and UWB, or is it a high-power, long-distance service comparable to WiMax? Nobody seems to know, yet that’s critical to evaluating its usefulness. Anybody who knows, please clue me in.