Netroots Legislative Agenda

I like a good fight, no matter who’s fighting. Matt Stoller, the MyDD blogger who’s wasted so many electrons on the dubious cause of net neutrality, wrote a post immediately after the recent election in which he declared that the “netroots” legislative agenda begins and ends with his pet cause. A somewhat more serious thinker, Bob Fertik, quickly listed 140 agenda items and asked his readers to vote on them; his list includes things like raising the minimum wage, signing Kyoto, restoring habeas corpus, and all that sort of trivia. Net neutrality came in at number 14. Here’s the explanation:

Bloggers who work mainly with text and photos (and that’s most political blogs) could blog without net neutrality; it would mainly affect video bloggers since they consume far more bandwidth, and that’s what the monopoly gatekeepers want to tax.

But Bloggers couldn’t do what we do without the First Amendment…

Now that seems awfully sensible, especially for somebody who drinks the Kool-Aid. Why is it that Stoller has such a hard time keeping things in perspective?

What does Tim think?

According to reports from BBC and The Guardian, web inventor Tim Berners-Lee thinks his baby’s in danger. BBC News:

He told the BBC: “If we don’t have the ability to understand the web as it’s now emerging, we will end up with things that are very bad.

“Certain undemocratic things could emerge and misinformation will start spreading over the web.

“Studying these forces and the way they’re affected by the underlying technology is one of the things that we think is really important,” he said.

And The Guardian:

The creator of the world wide web told the Guardian last night that the internet is in danger of being corrupted by fraudsters, liars and cheats. Sir Tim Berners-Lee, the Briton who founded the web in the early 1990s, says that if the internet is left to develop unchecked, “bad phenomena” will erode its usefulness.

His creation has transformed the way millions of people work, do business, and entertain themselves.

But he warns that “there is a great danger that it becomes a place where untruths start to spread more than truths, or it becomes a place which becomes increasingly unfair in some way”. He singles out the rise of blogging as one of the most difficult areas for the continuing development of the web, because of the risks associated with inaccurate, defamatory and uncheckable information.

But Tim says he was misquoted both times, and the web is really in fine shape:

A great example of course is the blogging world. Blogs provide a gently evolving network of pointers of interest. As do FOAF files. I’ve always thought that FOAF could be extended to provide a trust infrastructure for (e.g.) spam filtering and OpenID-style single sign-on, and it’s good to see things happening in that space.

In a recent interview with the Guardian, alas, my attempt to explain this was turned upside down into a “blogging is one of the biggest perils” message. Sigh. I think they took their lead from an unfortunate BBC article, which for some reason stressed concerns about the web rather than excitement, failure modes rather than opportunities. (This happens, because when you launch a Web Science Research Initiative, people ask what the opportunities are and what the dangers are for the future. And some editors are tempted to just edit out the opportunities and headline the fears to get the eyeballs, which is old and boring newspaper practice. We expect better from the Guardian and BBC, generally very reputable sources.)

So what’s going on here: was the venerable scientist misquoted by a sensationalist press? I think not, as both the BBC and The Guardian are well known for the sobriety of their analysis of technical subjects. At this stage in his career, Berners-Lee is more a politician than a scientist, and he needs to learn the politician’s skill of talking to journalists so they can understand what, if anything, he thinks. He tends to speak out of both sides of his mouth, as he’s done on network neutrality: he claims to support the principle while endorsing commercial arrangements that happen to be forbidden by proposed neutrality laws, and that’s hard to dance around.

The web, like any number of things, is a mixture of good and bad, and the challenge is always to maximize the one while minimizing the other. That’s not too hard to express, is it?

Oh joy

The Citizen Journalist meets the Citizen Engineer and soon we’ll be drowning in data:

The new NewAssignment.net site launches today, and Tom Evslin writes about a very real networked journalism project to find out whether smoking guns of network (non)neutrality are lurking in our ISP wires.

We’ve already seen network neutrality discrimination claims made by Craig Newmark that turned out to be caused by the odd configuration of his own equipment, discrimination claims that turned out to be temporary service outages, and, in Canada, discrimination claims that turned out to be service offerings. When the citizen engineer/jour-analyst starts looking at packet delay data, no doubt every traffic-related variation in delivery times will be linked to the latest Evangelical gay sex scandal, Saddam’s WMD program, Ed Whitacre’s sexual preferences, and the price of soybean futures.

The trouble with citizen efforts at skilled professions isn’t a dearth of data, it’s the inability to interpret the data according to rational standards.

This is going to be fun to watch.

Techdirt reader explains the Internet

Finally, after all these years, I understand the Internet thanks to a comment on Techdirt:

Woot! First! by Rstr5105 on Nov 2nd, 2006 @ 8:00pm

This appears to be yet another case of the telcos trying to tell us how the internet is supposed to be withot bothering to take a second to trace the roots of the net.

For those of us that don’t know, the internet started as a way for universities to transmit data back and forth faster than the ol’ sneaker net method. This worked well so DARPA signed on and funded it for a while. Eventually the DoD built it’s own net, and DARPA funding ceased.

It was at this point that AT&T (as well as a few others) signed on and formed the W3C (World Wide Web Consortium(Don’t quote me on the consortium part) The W3C stated very clearly that the internet was to be used specifically for non-commercial gain. (IE Even E-Bay would not be allowed to operate under the original paramaters of the W3C.)

Then the Internet went public, I believe, although I’m not sure if this is correct, it started with a few professors and business men saying something along the lines of “Hey, this is a good thing, now if only I could connect to my computer at work from my computer at home”. It spiraled out from there.

I don’t know what caused the massive build up of the web that we saw in the nineties, but now everyone is “On Line” and looking to make a few bucks. It seems to me that although we have this powerful tool at our disposal, we are corrupting it by allowing it to remain in the hands of the telco’s.

It also seems to me, that under the terms of the original W3C, (I don’t know what it’s current rules are) the telco’s weren’t allowed to charge for the ability to connect to the net. YES, they had to run the cables to feed it, YES they have to run the servers we all log into and NO i don’t have a problem paying them to be able to connect to the net, but it seems against what the net started as for them to be able to say, “Unless you pay this much a month you’re going to be limited to seeing websites at a slower speed than somebody who pays $XX.YY a month.”

Okay sorry for the long post, but it’s my two (four?) cents on this issue.

Don’t quote me on that, of course, because none of it is true. This comment illustrates how net neutrality became a political issue in the US in this election year: a bunch of drooling morons have been empowered to run around spouting spew, and not enough people are shooting them down. Where would you even start?

Deregulator’s Essay

The Progress and Freedom Foundation has published an essay based on the comment that the great Alfred Kahn originally left on their blog. It’s eminently worth reading, as we’ve said before, and here’s the conclusion:

Why all the hysteria? There is nothing “liberal” about the government rushing in to regulate these wonderfully promising turbulent developments. Liberals of both 18th and 20th–and I hope 21st–century varieties should and will put their trust in competition, reinforced by the antitrust laws–and direct regulation only when those institutions prove inadequate to protect the public.

There is no need to rush in and start regulating the Internet based on nothing but suspicion that bad things are in the offing. When and if we see some actual bad practices on the part of the telcos (or on the part of Google and Yahoo, let’s be fair) Congress can take appropriate action, whatever that is. Acting on the basis of suspicion, and with a heavy regulatory hand, will only harm the Internet. And we don’t want to do that, right? So chill, people.

The great deregulator speaks on net neut

Alfred Kahn deregulated airlines and trucking in the US, and he’s not feeling the love for net neutrality regulations:

Some 25 years ago, I thought it was logical to try to prevent cable television companies, as beneficiaries of exclusive territorial franchises, from discriminating against unaffiliated suppliers of programming in favor of their own by prohibiting broadcasters holding a financial interest in the programs they carried. I eventually recognized, however, the public benefits from the especial incentives of the several broadcasters to produce programming of their own, as well as to bid for independent programming, in competition with one another; and that that competition sufficiently protects independent providers from discrimination or exploitation. If Google and eBay depend upon the telephone and cable companies for reaching their audiences, that dependence is mutual: what would happen to the willingness of subscribers to sign up for DSL or cable modem service if one or the other of those suppliers decided not to carry Google or eBay?

Demonstrably, those broadband facilities have to be created by investments — especially huge ones by the telephone companies — and applications requiring priority transmission can entail lower priority transmission of others. Except as broadband service is subsidized by governments — a possibility I do not exclude — those costs must be collected from users — subscribers to broadband services, on the one side, providers of programming or content on the other, or some combination of the two — just as in the case of newspapers or television stations.

Why all the hysteria? There is nothing “liberal” about the government rushing in to regulate these wonderfully promising turbulent developments.

If you’re interested in the Internet’s future, read the whole thing; it’s a comment on the Progress and Freedom Foundation’s blog.

Microsoft out of It’s Our Net, for now

Broadcasting and Cable has this statement from Microsoft about the company’s withdrawal from the ironically named “It’s Our Net, Not Yours” regulatory coalition:

“Microsoft has withdrawn its name from the It’s Our Net website for the pendency of the AT&T-Bellsouth merger proceeding based on a company decision not to engage the proceeding,” the company said in a statement. “However, we continue to support and will pursue other opportunities to obtain meaningful Network Neutrality policies.”

Google and its minions are trying to use the Justice Department to advance their anti-democratic net neutrality program, and even for Microsoft that’s going too far. Let’s hope they never re-join.

Scott Cleland and PFF had noticed Microsoft’s name was gone from the It’s Our Net website, and this is why.

Does the Internet need saving?

Doc Searls is writing a follow-up on last year’s Saving the Net piece and he wants your suggestions:

So I just decided I’ll run a first anniversary follow-up on the piece, over at Linux Journal. But first I’d like to hear from the rest of y’all. Tag your posts savingthenet and I’ll find them.

Mine is simple: what makes us think the Internet needs saving? All the empirical measures say it’s thriving: there are more users than ever before, more web sites, more blogs, more broadband, lower prices, and more ways to get broadband thanks to EVDO, public WiFi, and WiMax (coming soon to an ISP near you).

The one real threat to the Internet is the misguided attempt to regulate ISPs in order to prevent an imaginary threat to the imaginary principle of net neutrality, but it’s unlikely to go anywhere, even if the Dems take back the Senate.

I’d be looking at things like terrorist and criminal uses of the Internet, including spam and phishing, because we’re more likely to see a real encroachment on personal freedom of expression over the Internet in response to the real abuses of bad actors than for any other reason.

But the bottom line is that the Internet is fundamentally healthy, and anybody who tells you otherwise probably has a personal agenda, because the only way to sustain the “Internet at Risk” argument is to give more weight to an imagined future than to the present. We’ve been hearing “Internet at Risk” arguments for ten years (if not longer) and nothing of the sort has come to pass; it’s simply crying wolf at this point. So get back to me when you have evidence of harm and not just imagination.

A system of exploitation

I wish I’d said what Nicholas Carr said about Web 2.0:

Web 2.0’s economic system has turned out to be, in effect if not intent, a system of exploitation rather than a system of emancipation. By putting the means of production into the hands of the masses but withholding from those same masses any ownership over the product of their work, Web 2.0 provides an incredibly efficient mechanism to harvest the economic value of the free labor provided by the very, very many and concentrate it into the hands of the very, very few.

Damn that’s good.

But what do I know? Professor Lessig says Carr is stuck in the 20th century, which sounds sort of painful.

Dirty Money

The Guardian reports that Google has set up a PAC and a high-dollar lobbying arm to protect its network subsidies:

While Google would not be hit directly by a two-tier net, its recently acquired online video site YouTube would, and Google fears that splitting the internet could hamper the creation of other innovative businesses.

“Net neutrality is the most obvious issue for us,” says Reyes, who worked at the US state department before joining Google. “But … Congress and the government are going to take on a whole range of issues that affect us on technological fronts, on legal fronts. This is our effort to play in that game.”

Google has an impressive list of players on its team. As well as counting Al Gore among its senior advisers, Google’s Washington office was set up about a year and a half ago by Alan Davidson. A well-known Democrat sympathiser, he served for eight years as associate director of the Centre for Democracy and Technology, a thinktank that opposes government and industry control of the web. Alongside him is Robert Boorstin, a former Clinton foreign policy aide from the Centre for American Progress, as Google’s communications chief in the capital.

Google’s PAC will be run by a five-person board of directors who will be guided by the recommendations of an advisory committee made up of Google employees. It will raise its funds through voluntary donations from staff.

But judging from the fact that in the past Google employees have been involved with leftwing groups such as MoveOn.org, it will be very interesting to see where that cash is headed.

Towards the end, they tick off some of the search monopolist’s more dubious practices. Is this “don’t be evil”?

Dirty corporate money is a cancer on our political process, no matter who it comes from, folks. Google’s influence-buying operation is bad for freedom.