No good deed goes unpunished

Ayaan Hirsi Ali is a hero to many who want to see the war on Islamic terrorism end in a victory for the West. But every hero has her critics, and Hirsi Ali is no exception. For some of the most mind-bendingly perverse logic you’ve ever seen, check the Newsweek article attacking Hirsi Ali from the pro-Muslim and pro-feminist (!) point of view:

Other Muslim women interested in reform aren’t exactly in step with Hirsi Ali. “I wish people had been nicer to her,” says Muslim author and feminist Asra Nomani. “But I don’t blame Islam. I blame really messed-up people who’ve used religion to justify their misogyny.” As staunchly patriarchal strains of Wahhabi Islam infiltrate Muslim cultures outside the gulf region, many modern female followers are wondering how to embrace their religion without succumbing to its more sexist demands. And they’re coming up with answers that don’t require them to abandon either their religion or their culture. In the Middle East and South Asia, a strong majority of Muslim women recently polled by Gallup believed they should have the right to work outside the home and serve in the highest levels of government. Here in the United States, dozens of scholars like Ithaca College’s Asma Barlas, Harvard’s Leila Ahmed and Notre Dame’s Asma Afsaruddin have challenged widely accepted interpretations of the Qur’an. “They are Islam’s Martina Luthers,” jokes Nomani. “They are my heroes.”

It’s not clear what “being nicer” would have meant: no clitorectomy, fewer beatings, a better arranged marriage, or not being disowned? Some people are just so hard to please.

Linklove to Roger L. Simon.

Why the Kyoto Protocol won’t work

China and India are exempt from the international agreement to limit greenhouse gases. With China set to pass the U.S. as the world’s top generator of greenhouse gases, that’s pretty much a joke:

Far more than previously acknowledged, the battle against global warming will be won or lost in China, even more so than in the West, new data show.

A report released last week by Beijing authorities indicated that as its economy continues to expand at a red-hot pace, China is highly likely to overtake the United States this year or in 2008 as the world’s largest emitter of greenhouse gases.

This information, along with data from the International Energy Agency, the Paris-based alliance of oil importing nations, also revealed that China’s greenhouse gas emissions have recently been growing by a total amount much greater than that of all industrialized nations put together.

“The magnitude of what’s happening in China threatens to wipe out what’s happening internationally,” said David Fridley, leader of the China Energy Group at Lawrence Berkeley National Laboratory.

Back to the drawing board, fellow global-warming fanatics.

Open Patent Office

This is a promising application of Wiki technology for a knowledgeable group of real people:

The government is about to start opening up the process of reviewing patents to the modern font of wisdom: the Internet.

The Patent and Trademark Office is starting a pilot project that will not only post patent applications on the Web and invite comments but also use a community rating system designed to push the most respected comments to the top of the file, for serious consideration by the agency’s examiners. A first for the federal government, the system resembles the one used by Wikipedia, the popular user-created online encyclopedia.

“For the first time in history, it allows the patent-office examiners to open up their cubicles and get access to a whole world of technical experts,” said David J. Kappos, vice president and assistant general counsel at IBM.

This will be good if and only if the citizen reviewers are expert and accountable. Under those conditions I’m enthusiastically for it, especially since the agency’s professional examiners have the last word.
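
The article doesn’t spell out the mechanics, but the general shape of a community rating system is simple enough. Here’s a minimal sketch (my guess at the idea, with hypothetical names and data, not the Patent Office’s actual system) of comments being ranked by reviewer votes so examiners see the most respected ones first:

```python
# A minimal sketch of a community rating scheme: registered reviewers vote
# comments up or down, and the highest-rated prior-art comments rise to
# the top of the file for the examiner. Names and data are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    votes: list = field(default_factory=list)   # +1 / -1 from registered reviewers

    def score(self) -> int:
        return sum(self.votes)

def top_comments(comments, limit=10):
    """Return the highest-rated comments, the ones the examiner sees first."""
    return sorted(comments, key=lambda c: c.score(), reverse=True)[:limit]

# Hypothetical comment file for one patent application:
comments = [
    Comment("reviewer_a", "Claim 1 reads on well-known prior art from 1998.", [1, 1, 1]),
    Comment("reviewer_b", "This seems obvious to anyone in the field.", [1, -1]),
    Comment("reviewer_c", "Off-topic complaint about software patents.", [-1, -1]),
]
for c in top_comments(comments):
    print(f"{c.score():+d}  {c.author}: {c.text}")
```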

The Wikipedia Scandal Continues

Nick Carr reports on the latest twist in the Wikipedia phony credentials scandal:

Head Wikipedian Jimmy Wales, having previously defended the Wikipedian administrator Ryan Jordan, who faked an elaborate online identity – “Essjay” – as a distinguished religion scholar, has this morning asked his beleaguered colleague to resign, saying that his “past support of EssJay in this matter was fully based on a lack of knowledge about what has been going on.”

Seth Finkelstein highlights the core of Jimbo’s belated reaction:

It doesn’t matter that Essjay lied to the New Yorker reporter about his credentials, making Wikipedia look good to the media – a matter Wales has known about for weeks. No mention of the dishonesty of using degree falsification to endorse Wikipedia in a letter to a professor. That’s lying to those outside The Family.

But he used his false credentials in content disputes. That’s serious! It’s an IN-WORLD offense! It’s inside The Family.

It all started with a Wikipedia official lying to the New Yorker:

Essjay was recommended to Ms. Schiff as a source by a member of Wikipedia’s management team because of his respected position within the Wikipedia community. He was willing to describe his work as a Wikipedia administrator but would not identify himself other than by confirming the biographical details that appeared on his user page. At the time of publication, neither we nor Wikipedia knew Essjay’s real name. Essjay’s entire Wikipedia life was conducted with only a user name; anonymity is common for Wikipedia administrators and contributors, and he says that he feared personal retribution from those he had ruled against online. Essjay now says that his real name is Ryan Jordan, that he is twenty-four and holds no advanced degrees, and that he has never taught. He was recently hired by Wikia—a for-profit company affiliated with Wikipedia—as a “community manager”; he continues to hold his Wikipedia positions. He did not answer a message we sent to him; Jimmy Wales, the co-founder of Wikia and of Wikipedia, said of Essjay’s invented persona, “I regard it as a pseudonym and I don’t really have a problem with it.”

Wikipedia fans claim Ryan Jordan is an exception and that most of the paid staff and volunteer editors are honest. My experience in Wikipedia editing, including the inevitable content disputes, administrative blocks, and arbitration requests, leads me to believe that Ryan Jordans are more the rule than the exception in Wikipedia land. It’s a project that’s built on unpaid, anonymous labor, and the only thing its contributors can possibly be getting out of it is emotional payback (read: a fantasy life).

Jordan, like countless other Wikipedians, created a persona for himself that represented what he wished himself to be, and he stomped through Wikipedia pretending it was real for so long he became deluded enough to believe it.

Someday I’ll write about what goes on behind the scenes at Wikipedia, and it won’t be pretty.

Teaching the hive mind to discriminate

Writing on his blog, Chicago law professor Cass Sunstein invokes the name of the sainted Hayek to endorse the decentralized nature of Wikipedia and other peer-production exercises:

Developing one of the most important ideas of the 20th century, Nobel Prize-winning economist Friedrich Hayek attacked socialist planning on the grounds that no planner could possibly obtain the “dispersed bits” of information held by individual members of society. Hayek insisted that the knowledge of individuals, taken as a whole, is far greater than that of any commission or board, however diligent and expert. The magic of the system of prices and of economic markets is that they incorporate a great deal of diffuse knowledge.

Sunstein fails to appreciate that markets are a special case in group dynamics, where knowledge is maximally distributed. So I pointed that out to him:

Wikipedia is all about process, and because its process is so different from Britannica’s, it’s not really accurate to describe it as an “encyclopedia”. Wikipedia is actually the world’s largest collection of trivia, gossip, received wisdom, rumor, and innuendo. It’s valuable because any large collection of information is valuable, but not in the same way that the verified, expert summaries in an encyclopedia are valuable.

If it’s true that the “knowledge of individuals, taken as a whole, is far greater than that of any commission or board,” it’s also true that the sum of their prejudice, mistaken beliefs, wishful thinking, and conformance to tradition is greater.

All of this is to say that group endeavors like Wikipedia produce breadth but not depth. For some endeavors depth is important, but for all others it’s fine to consult the rabble.

Marketing, for example, can gain much by mining the dark corners of Wikipedia; engineering and medicine, not so much, since knowledge is not as widely dispersed at the depths as it is at the surface.

Which brings us back to Hayek. Markets do a great job of bringing information about the wishes of buyers to bear on the consciousness of sellers. Everybody who participates in a market is an expert on the subject of his own wishes or his own product. But when you leave the realm of buying and selling, expertise is not as widely dispersed as participation, and then the decentralized model falls down.

And then we’ve got a little back-and-forth with Tim Wu on Tech Lib, where Wu says:

So it’s obviously true that decentralized and centralized systems are better for different things, as RB points out.

One thing I think is interesting, and don’t quite understand, is how often humans tend to underestimate the potential of decentralized solutions.

That’s what Hayek was getting at in his paper — there’s no question that if you put a perfect, planned economy next to an unplanned economy, the planned economy will win. Hands down.

But we aren’t good at knowing when information problems will cripple what would have been the better system.

So maybe we’re overcompensating, as RB suggests, in the direction of decentralized systems, but I happen to think we have to fight a perfectionist instinct that drives us to over-centralization.

Just ask Napoleon III.

Here’s the essential issue, as I see it: It’s undeniably true that information exists nearly everywhere, hence the potential information present in a large group is greater than that in a small group, and that’s why markets allocate resources better than committees. But it’s also true that misinformation exists nearly everywhere, so there’s also a huge potential for large groups to be misguided.

So the real question about information and group scaling is this: are there procedures for separating good information from false information (“discrimination”) that are effective enough to allow groups to be scaled indefinitely without a loss of information quality? It’s an article of faith in the Wikipedia “community” that such procedures exist, and that they’re essentially self-operative. That’s the mythos of “emergence”: the notion that systems, including human systems, automatically self-organize in such a way as to reward good behavior and good information and purge bad information. This seems to rest on the underlying assumption that because people are basically good, the good will always prevail in any group.

I see no reason to believe that groups have this property, even if one accepts as given the fundamental goodness of the individual. And even if some groups have this property, does it follow that self-selecting groups do? Polling, for example, seems to be pretty accurate when it’s done by random sample. But self-selected polling is notoriously inaccurate. If a web site puts up a presidential preference poll and supporters of one candidate or another urge each other to vote, the results are skewed.
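
To see how badly self-selection can skew things, here’s a tiny Monte Carlo sketch (my illustration, not part of the original argument, and the response rates are made up) comparing a random-sample poll to a self-selected web poll of the same evenly split population:

```python
# A toy Monte Carlo comparison (illustrative only; all rates are invented)
# of random-sample polling versus a self-selected web poll when the
# population is split 50/50 but one side is more motivated to respond.
import random

TRUE_SUPPORT_A = 0.50    # candidate A's actual share of the population
RESPONSE_RATE_A = 0.30   # A's supporters click the web poll 30% of the time
RESPONSE_RATE_B = 0.10   # B's supporters click only 10% of the time
POPULATION = 100_000

def random_sample_poll(n=1_000):
    """Random sampling: every person is equally likely to be polled."""
    votes_a = sum(random.random() < TRUE_SUPPORT_A for _ in range(n))
    return votes_a / n

def self_selected_poll():
    """Self-selection: only the people motivated enough to click get counted."""
    votes_a = votes_b = 0
    for _ in range(POPULATION):
        if random.random() < TRUE_SUPPORT_A:
            votes_a += random.random() < RESPONSE_RATE_A
        else:
            votes_b += random.random() < RESPONSE_RATE_B
    return votes_a / (votes_a + votes_b)

print("true support for A:    ", TRUE_SUPPORT_A)
print("random-sample estimate:", round(random_sample_poll(), 3))   # about 0.50
print("self-selected web poll:", round(self_selected_poll(), 3))   # about 0.75
```

Same electorate, wildly different answer: the random sample lands near the true 50 percent, while the self-selected poll reports a landslide for the more motivated camp.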

This is what happens in Wikipedia and many open source projects: participation is limited to people with an interest in a particular outcome, and they distort the process to get the desired result. Participation is not automatically tailored to align with expertise, as it is in markets.

The methods we have for separating fact from fiction, such as expert review, the scientific method, and random polling, don’t scale to arbitrarily large groups.

Hence the work of large groups is suspect.

Berners-Lee backpedals on net neutrality

We’ve previously observed that Sir Timmy has taken a very nuanced approach to net neutrality by endorsing the concept but defining it in a way that differs radically from the actual legislation. He continued that approach in a Congressional hearing today, speaking platitudes about a content-neutral Web but refusing to endorse any bill:

Although he has previously voiced support for Net neutrality, Berners-Lee on Thursday stopped short of taking a position on the various bills on that topic proposed in Congress in the past year.

“I can say I feel that a nondiscriminatory Internet is very important for a society based on the World Wide Web,” he said. “I think that the communications medium is so important to society that we have to give it a special treatment.”

Proponents of Net neutrality define the concept as prohibiting network operators, such as Verizon and Comcast, from being allowed to charge content companies like Google and Amazon.com extra fees for prioritization. Rep. Edward Markey (D-Mass.), who arranged the hearing, was among the chief sponsors of a legislative proposal last year that would put that mandate into law.

Perhaps in a nod to the issue’s divisiveness, with Republicans tending to reject the idea of new laws, Markey on Thursday issued a disclaimer to his colleagues. “Before end of year, we’re going to hear from all sides on that issue so that everyone’s perspective is heard,” he said.

What we have here is a man who stumbled into a fight and now wants to get out of the middle of it without offending anyone. He knows that the content of the Markey bill is ridiculous (and I know that he knows this because I brought it to his attention personally). But to support peace and freedom is to support net neutrality, so he can’t say that he’s against it.

It’s my personal opinion that Berners-Lee took a position without fully understanding it. That probably sounds weird to anybody who doesn’t live and breathe packets and routes, but the fact is that Sir Tim’s expertise is in a wholly different part of the Internet than the part that’s affected by forwarding priorities, peering arrangements, and packet queues.

He’s an application guy, and his deal is images, fonts, links, document styles, and data types. In fact, the design of his invention, HTTP 1.0, was naive about Internet traffic. It insisted on chunking information into tiny pieces roughly one third the optimum size for Internet traffic management, and it slowed them down immensely by not using TCP sockets correctly (every object had its own socket, and hence suffered from Slow Start). No traffic guy would make such a mistake, and the folks who came after cleaned up the mess. So here’s a guy trying to do the right thing and largely failing because he moved too soon and can’t admit he made a mistake.
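
To make the Slow Start point concrete, here’s a minimal sketch (my illustration, with a hypothetical host and object paths) contrasting the HTTP/1.0 habit of opening a fresh TCP connection for every object with the later persistent-connection fix, which only pays the slow-start ramp-up once:

```python
# A minimal sketch contrasting HTTP/1.0-style fetching (a new TCP
# connection, and a new slow-start ramp, for every object) with an
# HTTP/1.1 persistent connection. The host and paths are hypothetical.
import http.client
import time

HOST = "www.example.com"
PATHS = ["/", "/style.css", "/logo.png", "/script.js"]

def fetch_one_connection_per_object():
    """HTTP/1.0 behavior: tear the connection down after each object."""
    start = time.time()
    for path in PATHS:
        conn = http.client.HTTPConnection(HOST, timeout=10)
        conn.request("GET", path, headers={"Connection": "close"})
        conn.getresponse().read()
        conn.close()
    return time.time() - start

def fetch_persistent_connection():
    """HTTP/1.1 behavior: reuse one connection, so the TCP congestion
    window only has to grow from its slow-start minimum once."""
    start = time.time()
    conn = http.client.HTTPConnection(HOST, timeout=10)
    for path in PATHS:
        conn.request("GET", path)
        conn.getresponse().read()   # drain the body before reusing the socket
    conn.close()
    return time.time() - start

if __name__ == "__main__":
    print("per-object connections:", round(fetch_one_connection_per_object(), 3), "s")
    print("persistent connection: ", round(fetch_persistent_connection(), 3), "s")
```

On a real page with dozens of small objects, the per-object version pays connection setup and slow start over and over, which is exactly the inefficiency the folks who came after cleaned up.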

Bob Kahn did it the right way: he sat back and listened until he understood what the debate was about, and then came down on the right side of the question, against the new regulations. That’s the kind of guy who invents an Internet.

Many of the Internet’s great heroes have turned out to be one-trick ponies. There are some guys, like Kahn, David Clark, Van Jacobson, and Jon Postel, who managed to make important contributions year after year. Clark was the main author of the “End-to-End Arguments in System Design” paper, but he was also one of the main men behind DiffServ, twenty years later. And then you have guys who pop up once with a good idea but never have another one, and that makes me wonder if the idea was really original.

I think the serial innovators are the ones to heed.

Lightspeed ahead

Now TV viewers have a choice of cable providers in a few markets, thanks to the roll-out of the AT&T Lightspeed project, sold as “U-verse”:

AT&T’s advanced broadband services – voice, high-speed data and video – are sold under the “U-verse” brand name. The service is currently available in 13 markets in five states. Lightspeed was announced at a splashy press conference in late 2004. At the time, AT&T said it expected to spend $4 billion to $6 billion to make a menu of broadband services available to 18 million homes by the end of 2007.

AT&T started making some revisions to its targets in 2005. One called for Lightspeed to reach 18 million homes by 2008, giving itself a one-year extension on that total. In a recent 10-K filing, AT&T again revised its plan, raising the 2008 goal to 19 million households. In that filing, AT&T says nothing about the original 2007 targets.

The San Antonio-based communications giant has also updated its cost estimate. AT&T now says its spending on Lightspeed from 2006 through 2008 will add up to $4.6 billion. The total expenditure from 2004 through 2008: $5.1 billion.

This offering is the reason AT&T sought nationwide video franchising from Congress last year, only to lose after net neutrality activists twisted the product into a bizarre caricature. But that didn’t slow the phone company down, as states have proved willing to enact statewide video franchising measures that allow deployment as fast as AT&T can deliver it anyway.

So what is it about a second supplier of Triple-Play that’s so threatening to populist Democrats and consumer rights lobbyists? Nothing really, but they’ve been tripped up by their own rhetoric. This service uses IPTV to deliver TV programming, and the consumer people have made the unfortunate mistake of believing that all network traffic framed in IP is “the Internet”. IPTV is a service that’s confined to a private network, and it never touches the public Internet. That’s annoying to Internet-based companies like Google and Netflix, which would like to compete with cable TV over these private networks as well, but it doesn’t follow that the U-verse network should be opened up to them for free.

And that was the point that AT&T CEO Ed Whitacre was making when he said Google wouldn’t be using his pipes for free: Internet service, fine; IPTV service, not so fine.

Is that so hard to understand?

Aruba’s nervous breakdown

It’s come to our attention that Aruba Networks, the wireless LAN company that recently filed for an IPO, is terrified by the new architecture of the Trapeze wireless LAN system. To summarize the issues, Trapeze and Aruba both build enterprise-class wireless switches, consisting in both cases of wireless Access Points and back-end Ethernet switches. Both systems present a control point on the Ethernet side, and both switch traffic between the wire and the air.

But the new Trapeze architecture has a wrinkle that makes it much faster, more resilient, and more scalable than the Aruba system: local switching. In the Aruba system, all traffic originating on the air has to go back to a Big Ethernet switch before it can be decrypted and delivered to its final destination. But the Trapeze system, with local switching enabled, makes forwarding decisions at the edge of the wired network, not in a big switch that can become a traffic bottleneck.

Hence the Trapeze system can handle larger numbers of users at lower latency with no loss of management flexibility: you manage it as if it were a Big Fat Switch system, and it right-sizes its forwarding functions according to traffic needs, not the blinders of a mediocre group of system architects.
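
For illustration only, here’s a toy model (mine, with invented numbers; it isn’t either vendor’s actual design) of the latency argument: when every frame has to cross the backhaul to a central controller, the controller’s queue grows with aggregate load, while edge forwarding keeps the per-frame cost flat.

```python
# A toy latency model (invented numbers, not either vendor's design) of
# centralized versus local WLAN switching. Centralized: every frame is
# tunneled from the AP to one controller, which decrypts and forwards it,
# so queueing grows as the controller saturates. Local: the forwarding
# decision is made at the edge, so no single box sees the aggregate load.

AP_TO_CONTROLLER_MS = 2.0      # hypothetical backhaul delay, AP to controller
FORWARD_AT_EDGE_MS = 0.5       # hypothetical delay to forward at the edge
CONTROLLER_CAPACITY = 10_000   # frames/sec the central controller can handle

def centralized_latency_ms(frames_per_sec):
    """Backhaul hop plus a simple M/M/1-style queueing term that blows up
    as offered load approaches the controller's capacity."""
    utilization = frames_per_sec / CONTROLLER_CAPACITY
    if utilization >= 1.0:
        return float("inf")                      # controller saturated
    service_ms = 1000.0 / CONTROLLER_CAPACITY    # time to handle one frame
    queueing_ms = service_ms / (1.0 - utilization)
    return AP_TO_CONTROLLER_MS + queueing_ms + FORWARD_AT_EDGE_MS

def local_latency_ms(frames_per_sec):
    """Edge forwarding: no controller trip, latency independent of load."""
    return FORWARD_AT_EDGE_MS

for load in (2_000, 5_000, 9_000, 9_900, 12_000):
    print(f"{load:>6} fps   central: {centralized_latency_ms(load):8.3f} ms"
          f"   local: {local_latency_ms(load):5.3f} ms")
```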

This has Aruba running scared, so they’re in full FUD mode as the e-mail below indicates. I’ve interspersed the Aruba message with a fisking from Trapeze.

Enjoy.

From: Alan Emory [mailto:[email protected]]
Subject: Trapeze Takes A Step Back – Selling Fat APs

We need to start with the subject of the message. Trapeze actually has taken a big step forward by combining the best of fat and thin APs in a single comprehensive solution. Aruba and Cisco force you to make a choice…one size fits all. Only Trapeze allows you to use the right tool for the right environment. It is very important to note that the customer can run the entire Trapeze system in a completely thin, centralized way if they so choose. Smart Mobile provides more flexibility in case that isn’t the right answer for your environment. Aruba? If the only tool you have is a hammer, everything looks like a nail.

Net Neutrality is a Delusion

Scott Cleland mentions that His Eminence, Sir Timothy Berners-Lee will testify before Congressman Ed Markey’s House subcommittee on Telecommunications, the Internet, and Shameless Pandering to the Conspiracy Nuts Thursday. Markey has an ambitious agenda:

In a wide-ranging conversation yesterday, Markey laid out a broad telecom agenda that could pit him against the telephone and cable companies — expressing interest in “paranoia-inducing alternatives” like municipal broadband projects and wireless carriers that could pose a competitive threat to cable and telephone companies and push them to innovate.

He stressed that network neutrality — an initiative to ensure that the Internet does not become a two-tiered system in which some companies pay fees for priority access –will likely dominate the discussion over the next two years.

Innovations such as the Web browser, search engines, and the Internet did not emerge from large established companies, and forcing firms to pay more to reach users would stifle creativity, he said.

It’s a position that puts him at odds with major industry players.

Primarily, this position puts him at odds with reality. Was the Web browser actually an innovation that didn’t come from a large established company? Well, given a web created by interconnecting hyperlinked documents on the Internet, the browser was more a requirement than an innovation, and hyperlinks were actually first implemented in research labs funded by large enterprises, many private and some public. The first fully functional web browser was produced by Microsoft, so that’s one error. The first search engine that was worth spit was AltaVista, produced by Digital Equipment Corporation, so that’s two. And the Internet itself was produced by contractors working for the biggest enterprise of all, the United States government, so there’s error number three.

And where did the key technologies upon which the Internet was based come from? The transistor, the high speed data link, the modem, the digital modem and the Unix operating system were all produced by researchers at Bell Labs, part of the world’s most evil monopoly, so oops again. And the personal computers that made the Internet necessary were created by IBM and Intel, using technologies developed by Xerox. So where is this yarn of the virtuous little guys innovating like crazy while the dinosaurs slept really coming from? It’s nothing more than a cheap fantasy.

Now I don’t really expect politicians to be historians of technology, and to actually understand the things they regulate. But they do have people on staff who are supposed to keep them from saying stupid things, and it’s abundantly clear that Markey’s aren’t cutting it.

The hearing will be a real knee-slapper if Markey’s people can’t keep his mouth in check, and history suggests they’re bound to fail.

But we’ll see what tomorrow brings.

Antidote to Neutrino Drool

Neutrinos are touting two new drooling videos on the regulations they’re trying to pass, one that makes Telcos out to be space aliens and the other that makes them out to be parasites on the networks they’ve built. And they’re getting rave reviews from the confidence men who’ve conjured the net neutrality issue out of thin air and the million morons who’ve been taken in by them. Here’s a little bit of an antidote:

I don’t know who produced it, but it’s sharp. The fundamental question you have to ask any neutrino who claims the Internet is under attack by Telcos who want to censor blogs is simply this: “Where’s the proof? Just because politicians and professional scare artists say something will happen some day doesn’t mean a damn thing.”

The reality is that phone companies want to compete with cable to bring TV Programming into your home. They make money from Internet access as well, and they want to sell that to you too, just as they always have. The only new issue is about TV, not the Internet.

Neutrinos try to make their case using videos and songs because it can’t be made in rational, clear, verifiable prose.

Net neutrality is a con game.