DTV Transition Starts, World Doesn’t End

Contrary to the expectations of Congress and the FCC, the first phase of the DTV transition took place without major incident. Some 23% of American TV stations stopped sending out analog signals Tuesday at midnight, and only 28,000 calls came into the centers the FCC and the cable and satellite providers have established for transition help. The biggest category of calls, close to half of the total, came from people unable to pick up the digital broadcasts at all, or picking them up with very poor quality. A significant number didn’t know how to set up their converter boxes, or didn’t realize that the boxes have to scan for channels.

These numbers support a suspicion I’ve had for a while now: the emphasis on converter boxes is misplaced. The problem most people are going to have is a complete inability to receive digital broadcasts at all, because they don’t have the right kind of antenna, because the antenna isn’t oriented properly, or because they live in the wrong place. Many stations are moving transmitter locations to alter service areas, and won’t be serving some traditional customers any more. Others are reducing power, sometimes quite substantially. Digital broadcasts are more robust than analog at a given power level, so some reduction in power is quite sensible. But I suspect that over-the-air delivery of TV is such a small percentage of the overall market – well below 20%, and in some areas less than 10% – that it doesn’t make financial sense for stations to invest heavily in high power transmitters.
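To see why even a modest power cut strands viewers at the fringe, here’s a quick back-of-the-envelope sketch in Python. The free-space assumption is mine for simplicity; real UHF propagation over terrain decays differently, but the direction of the effect is the same: less power, smaller footprint.

```python
# A rough illustration (my own, not from any station's filings) of how
# a transmitter power cut translates into lost coverage radius. In
# idealized free space, received power falls with the square of
# distance, so a cut of X dB shrinks the usable radius by 10^(X/20).
def radius_ratio(power_cut_db: float) -> float:
    return 10 ** (-power_cut_db / 20)

for cut_db in (3, 6, 10):
    pct = radius_ratio(cut_db) * 100
    print(f"a {cut_db} dB power cut leaves about {pct:.0f}% of the original radius")
    # 3 dB -> ~71%, 6 dB -> ~50%, 10 dB -> ~32%
```

Viewers in the outer third of a station’s old footprint can easily fall off the map entirely, and no converter box will fix that.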

The timing of the transition was very bad for this reason. A substantial number of OTA TV viewers are going to need upgrades to roof-mounted antennas, and in many cases they’re going to need multiple antennas pointing in different directions. Getting up on a roof in February is not a pleasant experience in much of America, so a May or June transition date would have been much more sensible. In any event, it’s a good time to buy stock in antenna companies.

I’ve been doing some experiments with roof-mounted antennas that I’ll be reporting on shortly. So far, I can only get five stations where I live, and four of them broadcast in Spanish. Perhaps the FCC needs a budget for bilingual education as well as for converter boxes and antennas.

Nice Outings

My talk at the Messaging Anti-Abuse Working Group went very well. It was a huge room, seating probably 500 or so, and over half-full. I talked about how some of the crazier ideas about net neutrality are potentially becoming mainstream thanks to the politics in the nation’s capital and some of the personnel choices made by the Obama Administration. The selection of Susan Crawford for the FCC Transition Team is a cause for alarm. Susan is as nice a person as you’ll ever want to meet, and quite bright and well-intentioned, but her position that ISPs and carriers have no business actively managing packets is poison. I got a healthy round of applause, and several people thanked me for my remarks afterwards. Very few people know how dependent e-mail is on the DNS Blacklists that members of this organization maintain, and that’s a real shame.
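Since so few people know how this works, it’s worth showing how simple the mechanism is: a DNSBL check is just a specially crafted DNS query. The client reverses the octets of the IP address it wants to test, prepends them to the blacklist’s zone, and does an ordinary lookup. A minimal sketch in Python (the Spamhaus ZEN zone is a real, widely used list; the test address is from the reserved documentation range, so it won’t be listed):

```python
# A minimal DNSBL lookup sketch. To test 192.0.2.99 against
# zen.spamhaus.org, query the A record of 99.2.0.192.zen.spamhaus.org:
# any answer means the address is listed, NXDOMAIN means it is not.
import socket

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)   # listed entries answer with 127.0.0.x
        return True
    except socket.gaierror:           # NXDOMAIN: not on the list
        return False

print(is_listed("192.0.2.99"))        # False for a documentation address
```

Mail servers around the world run checks like this millions of times a day before accepting a message, which is exactly why heavy-handed transparency mandates make the anti-abuse crowd nervous.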

Last night I took the short trip up to Mountain View to see Jeff Jarvis’s talk about his book What Would Google Do? The audience, about 25 people, was a lot less impressed with Google than Jeff is, and it occurred to me that Google really is vulnerable on the search front. I can imagine a much more effective search methodology than the one Google employs, but getting the venture capital to build a rival infrastructure isn’t going to happen.

I told Jeff (an old friend of the blog who’s driven a lot of traffic this way over the years) that what he likes about Google isn’t Google so much as the inherent qualities of the Internet. He more or less knows that, but the packaging of open networks, distributed computing, and free expression is easier when you concretize it, and that’s what his book does. I read it as a sequel to Cluetrain.

Speaking at MAAWG in Frisco tomorrow

I’m on a panel tomorrow at the General Meeting of the Messaging Anti-Abuse Working Group, the organization that keeps the Internet from being overrun by spam and malware:

The Messaging Anti-Abuse Working Group is a global organization focusing on preserving electronic messaging from online exploits and abuse with the goal of enhancing user trust and confidence, while ensuring the deliverability of legitimate messages. With a broad base of Internet Service Providers (ISPs) and network operators representing almost one billion mailboxes, key technology providers and senders, MAAWG works to address messaging abuse by focusing on technology, industry collaboration and public policy initiatives.

My panel is on Mail Filtering Transparency: The Impact of Network Neutrality on Combating Abuse:

Network Neutrality (NN) means different things to different people. In 2008, much of the debate was focused on protecting P2P applications from various network management practices. In 2009, the debate is likely to expand to explore the impact of NN concepts on other applications, particularly email. We have already seen the strong reaction by some parties at the IETF to attempts to standardize DNS xBLs, which some claimed were discriminatory and lacking in transparency. We have also heard of claims that when ISPs block certain domains and servers that this may be discriminatory and could run afoul of NN concepts. This panel will explore the question of what NN means to email anti‐abuse, the increasing scrutiny that anti‐abuse policies will be under, the motivations behind the drive for greater transparency regarding such policies, and how all of those things should be balanced against the need to enforce strong anti‐abuse techniques.

Dave Crocker is on the panel, and I’m looking forward to meeting him, and I have it on good authority that Paul Vixie will be in attendance as well. The best thing about being an opinionated jerk like I am is the people you get to meet.

This organization is at the crossroads of “run any application you want” and “reasonable network management.” Spam prevention has always been a lightning rod because the very existence of spam highlights so many of the problems the current Internet architecture has. Its central assumption is that people will behave nicely all (or at least most) of the time, and the existence of botnets clearly calls that into question. It probably comes as no surprise that the filtering that spam reduction systems have to do makes net neuts nervous. Stupid networks may be nice in theory, but we live in a world of practice.

Court protects the right to bluff

In a rare move, the DC Circuit has upheld an FCC decision:

The cable industry has won a big legal victory in the fiercely competitive phone services market. An appeals court has supported the Federal Communications Commission in its ruling that phone carriers—in this case Verizon—can’t try to lure back customers after they’ve initiated a service switch but before their number has been transferred.

The FCC rarely prevails in court, of course, so this may be a sign that we’re living in the End Times. But we can take some comfort from the fact that it wasn’t totally unpredictable, given that Kevin Martin was on the losing side.

The case involved Verizon’s efforts to win back customers when the new carrier notified it to release their phone numbers. Verizon took this as an occasion to offer sweeter deals, which the court ruled an unlawful violation of the customer’s privacy, despite the fact that Google’s entire business is based on this kind of snooping.

It’s a win for consumers because it preserves the right to bluff. In today’s economy, consumers can frequently get better deals on subscription services merely by threatening to cancel, whether we’re serious or not. As it happens, I got lower prices from Sports Illustrated and Illy Coffee by calling up to cancel my subscriptions, and in both cases the savings were substantial. DirecTV refused to offer me a sweetener last year when I was tired of their crappy DVR, so they lost my TV business to Comcast. It’s not entirely clear to the business whether any of these threats are serious, of course, so it’s in their interest to err on the side of caution and offer the customer a better deal while they have the chance. Efforts to win back a customer who’s already made a switch are necessarily harder to pull off.

But the Verizon arrangement stacked the cards a little too far in the company’s favor, because it allowed them to play hardball until it was absolutely clear that the customer wasn’t bluffing: Verizon only learned of a switchover when the customer had already made a deal and scheduled a hookup date.

No deal, says the court: we all have the right to bluff, and the company is going to have to guess just like any other poker player. That’s a good deal for the consumer.

Digital Britain and Hokey Tools

It’s helpful to see how other countries deal with the typically over-excited accusations of our colleagues regarding ISP management practices. Case in point is the Digital Britain Interim Report from the UK’s Department for Culture, Media and Sport and Department for Business, Enterprise and Regulatory Reform, which says (p. 27):

Internet Service Providers can take action to manage the flow of data – the traffic – on their networks to retain levels of service to users or for other reasons. The concept of so-called ‘net neutrality’ requires those managing a network to refrain from taking action to manage traffic on that network. It also prevents giving to the delivery of any one service preference over the delivery of others. Net neutrality is sometimes cited by various parties in defence of internet freedom, innovation and consumer choice. The debate over possible legislation in pursuit of this goal has been stronger in the US than in the UK. Ofcom has in the past acknowledged the claims in the debate but have also acknowledged that ISPs might in future wish to offer guaranteed service levels to content providers in exchange for increased fees. In turn this could lead to differentiation of offers and promote investment in higher-speed access networks. Net neutrality regulation might prevent this sort of innovation.

Ofcom has stated that, provided consumers are properly informed, such new business models could be an important part of the investment case for Next Generation Access.

On the same basis, the Government has yet to see a case for legislation in favour of net neutrality. In consequence, unless Ofcom find network operators or ISPs to have Significant Market Power and justify intervention on competition grounds, traffic management will not be prevented.

(Ofcom is the UK’s FCC.) Net neutrality is, in essence, a movement driven by fears of hypothetical harm that might be visited upon the Internet given a highly unlikely set of circumstances. Given that 1.4 billion people use the Internet every day, and that the actual instances of harmful discrimination by ISPs can be counted on one hand (and pale in comparison to the harm caused by malicious software and deliberate bandwidth hogging in any case), Ofcom’s stance is the only one that makes any sense: keep an eye on things, and don’t act without provocation. This position would have kept us out of Iraq, BTW.

Yet we have lawmakers in the US drafting bills full of nebulous language and undefined terms aimed at stemming this invisible menace.

Are Americans that much less educated than Brits, or are we just stupid? Neither: we have a net neutrality movement in the US simply because we have some well-funded interests manipulating a gullible public and a system of government that responds to emotion.

A good example of these forces at work is the freshly released suite of network test tools running on some of Google’s servers. Measurement Lab checks how quickly interested users can reach Google’s complex in Mountain View, breaking the process down into hops. As far as I can tell, it’s essentially a dolled-up version of the Unix “traceroute”, one that speculates about link congestion and takes a very long time to run.
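For the curious, the mechanism traceroute uses is easy to sketch: send probes with steadily increasing TTL values and listen for the ICMP Time Exceeded message each router along the path sends back when it drops an expired packet. Here’s a minimal version in Python using Scapy; this is my own illustration of the general technique, not Measurement Lab’s code, and it needs root privileges to send raw packets.

```python
# A bare-bones traceroute: probe with TTL 1, 2, 3, ... and record which
# router reports each expiry. Requires scapy (pip install scapy) and root.
from scapy.all import IP, ICMP, sr1

def trace(dest: str, max_hops: int = 30) -> None:
    for ttl in range(1, max_hops + 1):
        reply = sr1(IP(dst=dest, ttl=ttl) / ICMP(), timeout=2, verbose=0)
        if reply is None:
            print(f"{ttl:2d}  *")                    # hop didn't answer
        elif reply.type == 11:                       # ICMP Time Exceeded
            print(f"{ttl:2d}  {reply.src}")          # a router along the path
        else:                                        # echo reply: we arrived
            print(f"{ttl:2d}  {reply.src}  (destination)")
            break

trace("8.8.8.8")
```

Timing the gaps between hops is where the speculation about congestion comes in: a slow reply might mean a congested link, or just a router that deprioritizes ICMP.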

The speed, latency, and consistency of access to Google are certainly an important part of the Internet experience, but they’re hardly definitive regarding who’s doing what to whom. The tech press loves this sort of thing, though, because it’s just mysterious enough in its operation to invite speculation and sweeping enough in its conclusions to get users excited. It’s early days for Measurement Lab, but I don’t have high expectations for its validity.

Doubts about Broadband Stimulus

The New York Times has a front page story today on the broadband stimulus bill which features an extensive quote from Brett:

Critics like Mr. Glass say the legislation being developed in Congress is flawed in various ways that could mean much of the money is wasted, or potentially not spent at all — arguably just as bad an outcome given that the most immediate goal of the stimulus measure is to pump new spending into the economy.

An “open access” requirement in the bill might discourage some companies from applying for grants because any investments in broadband infrastructure could benefit competitors who would gain access to the network down the line.

Meeting minimum speed requirements set forth in the House version could force overly costly investments by essentially providing Cadillac service where an economy car would be just as useful. And some worry that government may pay for technology that will be obsolete even before the work is completed.

“Really the devil is in the details,” Mr. Glass said. “Yes, there is $9 billion worth of good that we can do, but the bill doesn’t target the funds toward those needs.”

The bill is still very rough. Some critics cite its preference for grants to large incumbents; others highlight the amorphous “open access” provisions and the arbitrary speed requirements as weaknesses. The only interest groups that appear altogether happy with it are Google’s boosters, such as Ben Scott of Free Press. This is a flip-flop for Free Press, which only last week was urging members to call Congress and ask that the bill be killed.

A particularly odd reaction comes from friend of the blog Jeff Jarvis, who took time out from pitching What Would Google Do?, his love letter to Google, to tear into the article’s sourcing:

I found myself irritated by today’s story in the New York Times that asks whether putting money from the bailout toward broadband would be a waste. The question was its own answer. So was the placement of the story atop page one. The reporter creates generic groups of experts to say what he wants to say (I know the trick; I used to be a reporter): “But experts warn…. Other critics say…. Other supporters said…”

I wish that every time he did that, the words “experts,” “critics,” and “supporters” were hyperlinked to a page that listed three of each.

It’s an obvious case of a story with an agenda: ‘I’m going to set out to poke a hole in this.’

The odd bit is that five people are named and quoted, and the terms “expert” and “critic” clearly refer to these named sources. It’s boring to repeat names over and over, so the writer simply uses these terms to avoid the tedium. It’s clear that Brett and Craig Settles are the critics and experts. Jeff seems not to have read the article carefully and simply goes off on his defensive tirade without any basis.

It’s a given in Google’s world that massive government subsidies for broadband are a good thing because they will inevitably lead to more searches, more ad sales, and more revenue for the Big G. But while that’s clearly the case, it doesn’t automatically follow that what’s good for Google is good for America, so it behooves our policy makers to ensure that the money is spent wisely, without too many gimmicks in favor of one technology over another or too many strings attached that don’t benefit the average citizen.

Raising questions about pending legislation and trying to improve it is as American as baseball, and the article in the Times is a step in the right direction. It may not be what Google would do, but it’s good journalism.

I want to make sure that the broadband money is spent efficiently, so I would bag the open access requirement (nobody knows what it means anyway) and give credit for all improvements in infrastructure that increase speed and reduce latency.

The bill needs to support all technologies that have utility in the Internet access space – wireless, coax, and fiber – but should encourage the laying of new fiber where it’s appropriate, and high-speed wireless in less-populated areas. Eventually, homes and businesses are pretty much all going to have fiber at the doorstep, but that doesn’t need to happen overnight.

Welcome Brett Glass

The following post is from our new co-blogger, Brett Glass. Brett and I first crossed paths when we were working on the “Skywalker” token-ring project at Texas Instruments in the early 80s. Brett was part of the team in Houston doing the chipset, and I worked on a team in Austin doing a terminal server application for it. We both spoke at an ITIF event in Washington, DC, last spring on network management. He’s been a valuable commenter here for a while, and I’m very happy to have him contributing posts as well.

Professional Complainers Blast Cox

Cox Cable announced plans yesterday to test a new traffic management system intended to improve the Internet experience of most of its customers, and the reaction from the network neutrality lobby came fast and furious. The system will separate latency-sensitive traffic from bulk data transfers and adjust priorities appropriately, which is the sort of thing that Internet fans should cheer. In its essence, the Internet is a resource contention system that should, in most cases, resolve competing demands for bandwidth in favor of customer perception and experience. When I testified at the FCC’s first hearing on network management practices last February, I spent half my time on this point, and all the other witnesses agreed with me: applications have diverse needs, and the network should do its best to meet all of them. That’s what we expect from a “multi-purpose network”, after all.
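Cox hasn’t published implementation details, but the general shape of such a system is easy to sketch. Here’s a minimal two-queue strict-priority scheduler in Python; the classification by application tag is my own illustrative assumption, since a real middlebox would classify on ports, DSCP marks, or packet inspection.

```python
# A toy two-class scheduler: latency-sensitive packets always leave
# before bulk transfers, so a VoIP frame never waits behind a P2P burst.
from collections import deque

LATENCY_SENSITIVE = {"voip", "gaming", "dns"}   # assumed classification

class TwoClassScheduler:
    def __init__(self):
        self.fast = deque()   # VoIP, gaming: delay hurts these
        self.bulk = deque()   # file transfers, P2P: only total time matters

    def enqueue(self, packet: dict) -> None:
        queue = self.fast if packet["app"] in LATENCY_SENSITIVE else self.bulk
        queue.append(packet)

    def dequeue(self):
        # Strict priority: bulk traffic moves only when the fast queue is empty.
        if self.fast:
            return self.fast.popleft()
        return self.bulk.popleft() if self.bulk else None

sched = TwoClassScheduler()
sched.enqueue({"app": "bittorrent", "data": "chunk"})
sched.enqueue({"app": "voip", "data": "frame"})
print(sched.dequeue()["app"])   # "voip" jumps ahead of the earlier chunk
```

The bulk transfer still completes; it just yields the odd millisecond to traffic that actually cares about milliseconds.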

So now that Cox wants to raise the priority of VoIP and gaming traffic over background file transfers, everybody should be happy. The neutralists have always said in public fora that they support boosting VoIP’s priority over P2P, and Kevin Martin’s press release about the Comcast order said he was OK with special treatment for VoIP. And in fact the failure of the new Comcast system to provide such special treatment is at the root of the FCC’s recent investigation of Comcast, which was praised by the neuts.

So how is it that the very people who complain about Comcast’s failure to boost VoIP priority are now complaining about Cox? Free Press’s general-purpose gadfly Ben Scott is practically jumping up and down pounding the table over it:

Consumer advocates certainly aren’t impressed. “The information provided by Cox gives little indication about how its new practices will impact Internet users, or if they comply with the FCC’s Internet Policy Statement,” says consumer advocacy firm Free Press in a statement. “As a general rule, we’re concerned about any cable or phone company picking winners and losers online.”

“Picking winners and losers” is bad, and failing to pick winners and losers is also bad. The only thread of consistency in the complaints against cable, DSL, and FTTH providers is a lack of consistency.

Make up your mind, Ben Scott: do you want an Internet in which Vuze can step all over Skype, or don’t you?

UPDATE: For a little back-and-forth, see Cade Metz’s article on this at The Register, the world’s finest tech site. Cade quotes EFF’s Peter Eckersley to the effect that Cox is “presuming to know what users want.” They are, but it’s not that hard to figure out that VoIP users want good-quality phone calls: a three-year-old knows that much.


What recession?

So here’s your recession-proof business, ladies and gentlemen:

Netflix, the company which mails out DVD rentals and also offers streamed programming via the internet, saw a 45% jump in profits and 26% rise in consumers to 9.4 million in the fourth quarter.

This was the quarter in which Netflix released Watch Instantly on non-PC platforms. It’s so ubiquitous now that I have it on three platforms: a home theater PC, a TiVo HD, and a Samsung BD-P2500 Blu-ray player. It looks best on the Samsung, thanks to its HQV video enhancement chip.

Internet Myths

Among my missions in this life is the chore of explaining networking in general and the Internet in particular to policy makers and other citizens who don’t build network technology for a living. This is enjoyable because it combines so many of the things that make me feel good: gadgetry, technology, public policy, writing, talking, and education. It’s not easy, of course, because there are a lot of things to know and many ways to frame the issues. But it’s possible to simplify the subject matter in a way that doesn’t do too much violence to the truth.

As I see it, the Internet is different from the other networks that we’re accustomed to in a couple of important ways. For one, it allows a machine to connect simultaneously to a number of other machines. This is useful for web surfing, because it makes it possible to build a web page that draws information from other sources. So a blog can reference pictures, video streams, and even text from around the Internet and put it in one place where it can be updated in more-or-less real time. It enables aggregation, in other words.

Another thing that’s unique about the Internet is that the underlying transport system can deliver information at very high speed for short periods of time. The connection between a machine and the Internet’s infrastructure is idle most of the time, but when it’s active it can get its information transferred very, very quickly. This is a big contrast to the telephone network, where information is constrained by call setup delays and a very narrow pipe.
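To make the first point concrete, here’s a minimal sketch of the aggregation pattern in Python: one client fans out to several sources at once and assembles the results, the way a blog page pulls in images, video, and text from around the web. The URLs are placeholders, not real endpoints.

```python
# Fetch from several sources simultaneously and aggregate the results.
# Each request is its own TCP connection; they proceed in parallel,
# unlike a phone call, which ties up one circuit end to end.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

SOURCES = [
    "http://example.com/pictures",   # hypothetical image host
    "http://example.com/video",      # hypothetical stream source
    "http://example.com/text",       # hypothetical article feed
]

def fetch(url: str):
    try:
        with urlopen(url, timeout=5) as response:
            return url, len(response.read())
    except OSError as err:           # placeholder URLs may well fail
        return url, f"error: {err}"

with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
    for url, result in pool.map(fetch, SOURCES):
        print(url, result)
```

The second point, bursty high-speed transfer over a mostly idle link, is what makes this kind of fan-out feel instantaneous to the person loading the page.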