The future of P2P

Nate Anderson writes an interesting blurb on the P2P-Next research project in the Netherlands. The researchers hope to build a platform suitable for live TV delivery over the Internet:

Dutch academic Dr. Johan Pouwelse knows BitTorrent well, having spent a year of his life examining its inner workings. Now, as the scientific director of the EU-funded P2P-Next team, Pouwelse and his researchers have been entrusted with €19 million from the EU and various partners, and what they want in return is nothing less than a “4th-generation” peer-to-peer system that will one day be tasked with replacing over-the-air television broadcasts.

P2P-Next is the largest publicly-funded team in the world working on such technology (though plenty of researchers at Microsoft, IBM, and countless tiny startups are also racing to deliver a better P2P experience), and today the team launched a trial program designed to test its progress to date.

What sets the project apart from the traditional BitTorrent architecture is its focus not on downloadable video, but on live streaming. Current BitTorrent implementations, focused as they are on offering easy access to downloadable content, aren’t well suited to delivering live streaming TV across the Internet, but Pouwelse is convinced that this is the future. There’s “no doubt that TV will come through the Internet in a few years,” he told Ars earlier this week. Obviously, deployment of such a system depends on consumer electronics firms and broadcasters, but Pouwelse’s job is to make sure that the technology is ready when they are.

P2P has a number of problems as a delivery vehicle for live TV, so I don’t think this is a good approach, but a system that caches popular content in numerous places has the potential to distribute large, popular files with little redundant delivery. The important feature of such a system is its caching capability, however, not its “peer-to-peerness.”
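The caching point is easy to quantify with a toy model (illustrative Python, not any particular system): if a neighborhood cache holds on to each video after its first request, the origin data center only serves each distinct title once, no matter how many viewers want it.

```python
import random

# Toy model: 1,000 viewers in a neighborhood each request one of 50
# popular videos. Without a local cache, every request travels all the
# way to the origin data center; with a cache, only the first request
# for each distinct video does. (Illustrative numbers, not a real workload.)
random.seed(42)
requests = [random.randrange(50) for _ in range(1000)]

origin_fetches_no_cache = len(requests)         # every request goes end-to-end
origin_fetches_with_cache = len(set(requests))  # one fetch per distinct video

print(origin_fetches_no_cache, origin_fetches_with_cache)
```

The redundant-delivery savings come entirely from the cache, which is the point: a CDN node, a P2P swarm, or a set top box all get the same win the same way.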

See Torrent Freak for many more details.

TiVo rolling out YouTube support

Another sign of the ongoing convergence is TiVo’s new software, which enables Series 3 and HD customers to play YouTube videos directly from their TiVo:

As I’d suspected, TiVo support for YouTube is indeed hidden within the 9.4 software update. Series 3 and TiVo HD subscribers should start seeing the application show up as early as tomorrow (Thursday), though the rollout will be completed over the next few weeks. And in some form of meta-irony, I’ve shot a brief video of YouTube on TiVo… on YouTube.

Switched digital video and TCP remote control are also part of this release. TiVo is evolving into a bit of a nano data center, albeit a very limited one.

Public Knowledge’s new star off the rails

Public Knowledge and Free Press have apparently hired file-sharing enthusiast Robb Topolski in some lofty-sounding role, and he feels compelled to expound on network theory that’s way over his head. I’m trying to correct some of his misunderstandings, but it’s not going well. Here’s what I told him at his new employer’s blog: Continue reading “Public Knowledge’s new star off the rails”

Network World on Martin’s rash order

Network World’s Brad Reed has a pretty good news piece on the order FCC chairman Kevin Martin is trying to sell to the Commission’s Democrats. He quotes one of my favorite people, me:

Network architect and inventor Richard Bennett, who has long been critical of net neutrality advocates, says he has some concerns about the precedent the FCC sets if it votes to affirm Martin’s recommendation. In particular, he worries that the principles in the FCC’s policy statement are far too broadly defined and will be used to encumber traffic management practices that are necessary for ISPs to keep their QoS high for the majority of their customers. Bennett says that while ISPs should be barred from engaging in anticompetitive behavior by actively discriminating against rival online content, they should be allowed to slow or even stop transfers that are degrading the Web experience for other users.

“Even in this case where the FCC has banned the use of application-based discrimination, it’s perfectly reasonable for ISPs to discriminate against applications on behalf of a particular user,” he says. “Say you’ve got two customers, and one is using VoIP and the other is using BitTorrent. You’re going to need to give VoIP traffic preference over BitTorrent in order to ensure quality of service.”

I actually said something a little different. I want the ISP to allocate bandwidth fairly among users of a given service tier, and then prioritize within each account. So if the same user is running BitTorrent and Vonage at the same time, I want the Vonage traffic to have priority. Martin’s order would ban that practice, and that would be a Bad Thing.

The fact that Martin is proposing to do just that tells you that the FCC is not ready to impose regulations on ISPs yet. More study is needed, and some public comment on the proposed rules.

Kind of like, you know, a formal rule-making procedure. Hell of an idea, eh?

David Sohn of CDT makes the right points

Commenting on the pending FCC action against Comcast, the CDT’s David Sohn says most of what needs to be said:

In order to engage in enforcement, there needs to be either:

(1) An existing, articulated rule or standard against which to judge behavior;
or
(2) Authority for the enforcement body to adjudicate and issue rulings based on general notions of fairness/equity.

It is difficult to argue that number (1) is present here. The FCC expressly stated that its broadband principles are not rules. If they are not rules, then it is hard to see how the FCC can turn around and try to police violations of them as if they were . . . well . . . rules. Doing so would put the FCC on perilously shaky legal ground.

As for number (2), CDT believes that everyone with a stake in the Internet — which at the end of the day is pretty much everyone, period — should be extremely wary of any assertion of open-ended and highly discretionary FCC jurisdiction over broadband Internet service. Even those who may like what the FCC proposes regarding the Comcast question should consider that they may be far less happy with what some future FCC may do, once the door to largely unguided regulatory action is open. CDT believes that the FCC neither has nor should have open-ended authority to craft policies for the Internet out of whole cloth.

This is the problem with suggesting, as some commentators have, that Internet neutrality concerns could be addressed via case-by-case adjudication and enforcement rather than ex ante rules. You can’t adjudicate and gradually build up a body of common law unless there is some underlying standard to adjudicate against — or unless you have broad authority to make law from scratch. That’s why CDT continues to call for legislation in this area. Having the FCC initiate and craft the entire legal framework, without Congress setting the parameters, cedes too much authority to the agency.

It will be interesting to see how an eventual FCC order, if there is one, addresses the murky legal status of the FCC’s Policy Statement and what legal hook the agency tries to hang its action on.

One other thing I’d add is this: an ideal residential Internet access system needs to be managed in two different but equally important phases:

1) Allocate bandwidth fairly among competing accounts; and then

2) Prioritize streams within each account according to application requirements.

Phase 1 keeps you from being swamped by your neighbor, and keeps you from swamping him, and Phase 2 prevents your VoIP session from being swamped by your BitTorrent session.
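The two phases can be sketched in a few lines. This is a hypothetical scheduler of my own, not any shipping ISP gear: round-robin across accounts for Phase 1, strict priority within each account for Phase 2, with illustrative names throughout.

```python
from collections import deque

class Account:
    def __init__(self, name):
        self.name = name
        self.high = deque()  # latency-sensitive streams, e.g. VoIP
        self.bulk = deque()  # background transfers, e.g. BitTorrent

    def enqueue(self, packet, priority):
        (self.high if priority == "high" else self.bulk).append(packet)

    def dequeue(self):
        # Phase 2: a user's VoIP packet always leaves before his BitTorrent packet.
        if self.high:
            return self.high.popleft()
        if self.bulk:
            return self.bulk.popleft()
        return None

def schedule(accounts, slots):
    """Phase 1: hand out link slots one account at a time, round-robin,
    so no account can swamp its neighbors."""
    sent = []
    while len(sent) < slots and any(a.high or a.bulk for a in accounts):
        for account in accounts:
            if len(sent) >= slots:
                break
            packet = account.dequeue()
            if packet is not None:
                sent.append((account.name, packet))
    return sent

alice, bob = Account("alice"), Account("bob")
alice.enqueue("voip-1", "high")
alice.enqueue("torrent-1", "low")
bob.enqueue("torrent-2", "low")
# Alice's VoIP goes out first, but Bob still gets his fair slot before
# Alice's bulk transfer takes a second one.
print(schedule([alice, bob], 3))  # [('alice', 'voip-1'), ('bob', 'torrent-2'), ('alice', 'torrent-1')]
```

Note that the BitTorrent traffic isn’t blocked, only deferred within its owner’s fair share, which is exactly the distinction the Sandvine system and the Martin order both miss, from opposite directions.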

The problem with the Comcast Sandvine system is that it skips phase 1 and simply does phase 2, application-level traffic shaping. And the problem with the FCC order that Chairman Martin is floating about is that it makes phase 2 shaping illegal. It’s incredibly useful to manage streams for each user as he would want them managed if he had direct control over them. I think future home gateways will empower users to do this, but in the meantime it’s desirable for the ISP to manage sessions appropriately.

The first rule of regulation should be “do no harm,” and on that basis Martin’s prescription is bad medicine.

FCC Hearing at Carnegie Mellon

Here’s the witness list for the July 21st FCC hearing at CMU:

4:00 p.m. Welcome/Opening Remarks
4:30 p.m. Panel Discussion 1 – The Future of Digital Media
Panelists:

Mark Cuban, Chairman & Co-Founder, HDNet; Owner, Dallas Mavericks
Jon Peha, Professor, Department of Engineering and Public Policy, and Department of Electrical and Computer Engineering, Carnegie Mellon University
Mark Cavicchia, CEO, Founder & Director, WhereverTV
Matthew Polka, President & CEO, American Cable Association
Jake Witherell, Sim Ops Studios
John Heffner, Conviva
Representative, YouTube

5:30 p.m. Panel Discussion 2 – The Broadband of Tomorrow
Panelists:

David Farber, Distinguished Career Professor of Computer Science and Public Policy, School of Computer Science, Carnegie Mellon University
Rahul Tongia, Senior Systems Scientist, Program on Computation, Organizations, and Society, School of Computer Science, Carnegie Mellon University
Robert W. Quinn, Jr., Senior Vice President – Federal Regulatory, AT&T, Inc.
Rey Ramsey, Chairman & CEO, One Economy Corporation
Rendall Harper, Board Member, Wireless Neighborhoods
Scott Wallsten, Vice President for Research and Senior Fellow, Technology Policy Institute
Marge Krueger, Administrative Director, Communications Workers of America District 13

6:30 p.m. Public Comment Period
8:30 p.m. Adjournment

A live web cast of the hearing will be available to the public on the FCC’s website at: http://www.fcc.gov/realaudio/#jul21 — you may also go to “FCC Meetings” from the homepage and then click on FCC Audio/Video events to access the web cast.

————

One significant detail: Google is breaking its silence on Net Neutrality Phase II by having its YouTube division speak. Another interesting thing is that Prof. Jon Peha gets a second bite at the apple. He’s the guy who gave significantly false testimony at the Stanford hearing on the relationship between TCP Resets and BitTorrent transactions. I hope he corrects his former misstatements of fact.

Mark Cuban is always entertaining, but I imagine Prof. Farber will show the most insight.

House Anti-Trust Task Force Hearing on Google

C-Span has the archived video of the Conyers hearing on Google’s proposed ad deal with Yahoo:

House Judiciary Committee Hearing on Internet Competition
Recently, a number of transactions and potential transactions have raised anti-competitive and privacy concerns in the field of online advertising, online search, and web platform interoperability. Rep. John Conyers (D-MI) chairs a House Judiciary Antitrust & Competition Policy Task Force hearing to examine the state of competition with respect to various online markets.

It’s quite long but as a bonus it’s also quite boring. Google maintains there will be no price-fixing because ads are sold in auctions; Microsoft points out that the auctions have a floor price and a subjective quality index.
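For readers unfamiliar with the mechanism being argued over, here’s a simplified sketch of a second-price auction with a floor (reserve) price. The function and numbers are illustrative, not Google’s actual algorithm, and the quality index Microsoft mentions is omitted.

```python
def clearing_price(bids, floor):
    """Simplified second-price auction with a reserve (floor) price:
    the highest bidder wins and pays max(second-highest bid, floor);
    no sale occurs if nobody meets the floor."""
    eligible = sorted((b for b in bids if b >= floor), reverse=True)
    if not eligible:
        return None  # no bid meets the floor
    runner_up = eligible[1] if len(eligible) > 1 else floor
    return max(runner_up, floor)

print(clearing_price([2.00, 1.50, 0.40], floor=0.50))  # competition sets the price: 1.5
print(clearing_price([2.00, 0.40], floor=0.50))        # the floor binds: 0.5
```

Microsoft’s point is visible in the second call: when competition is thin, the floor, which the auctioneer sets unilaterally, is what determines the price.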

The smoking gun was produced: Google proposed this deal to Yahoo the day after Microsoft made their tender offer.

Google’s girl, Zoe Lofgren, tried to spin the old “two guys in a garage can take Google down” myth, but I doubt anyone with a room temperature IQ is buying that nonsense.

There was one wild card on the panel, the Ask The Builder guy who seemed overly fond of the sound of his own voice.

Of all the members, Issa gets it the best. And he should, because he actually started and built a successful technology business before going to Washington.

Lofgren and Conyers – what can I say without being rude?

Let’s make data centers obsolete

We currently get most of our Internet content, especially video, from large data centers. The high cost of these data centers, and of their data comm lines, is a huge barrier to entry for new content providers. This is why 20% of the Internet’s traffic today comes from a single source. So what options do we network architects have to bring about a shift in the Internet’s content architecture such that a few large companies don’t monopolize content?

One is the approach taken by NADA in Europe to create a universal network of P2P-enabled Nano Data Centers:

NADA is seeking to leverage advancements in Peer-to-Peer technology to connect the Nano Data Centers to enable them to work together to provide services to end users.

The set top box would essentially be split in two – one half facing the end user with all the typical functionality and services, while the other half acts as the Peer, or Nano Data Center.

“They isolate it using virtualization technologies, and that secure compartment is now talking to all the other set top boxes, co-ordinating and shifting stuff around. Each of the set top boxes has plenty of storage in it so we can put them together and build a massive data store for all those YouTube videos, Flickr pictures or whatever. We’re using Peer-to-Peer under the hood to provide a service,” Dr Ott said.

This approach, or something like it, has tremendous promise.

The server farm replacement needs to be an always-on device, separate from display machines like PCs and TV sets, inexpensive, easily expandable, and easily manageable. The devices that most resemble it today are home gateways and set top boxes, and the home gateway is actually a better leverage point than the set top box we have today.
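One way a swarm of such boxes could agree on where a given video lives, with no central directory, is rendezvous hashing. This sketch is my own illustration of the idea, not the NADA design; all names are hypothetical.

```python
import hashlib

def stable_hash(key):
    # Deterministic across machines, unlike Python's built-in hash().
    return int(hashlib.sha256(key.encode()).hexdigest(), 16)

def place(video_id, boxes, replicas=3):
    """Choose `replicas` boxes to hold a copy of `video_id` by ranking
    every box on a per-video hash (rendezvous hashing). Every box runs
    the same computation and gets the same answer, so no central
    directory is needed, and losing one box only reshuffles the
    videos that box held."""
    ranked = sorted(boxes, key=lambda b: stable_hash(f"{video_id}:{b}"),
                    reverse=True)
    return ranked[:replicas]

boxes = [f"stb-{i}" for i in range(10)]  # ten hypothetical set top boxes
print(place("youtube-video-123", boxes))
```

A scheme like this is what lets the “massive data store” Dr Ott describes behave as one system even though every component sits in somebody’s living room.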

I think I’ll build a prototype and see what happens.

IT Examiner coverage of Innovation ’08

John Oram of IT Examiner does a fair write-up of the Innovation ’08 panel:

Richard Bennett said he is opposed to Net Neutrality regulations because they shut down engineering options that are going to be needed for the Internet to become the one, true, general-purpose network. Today on his blog, Richard adds “Google has invested hundreds of millions of dollars in server farms to put its content, chiefly YouTube, in an Internet fast lane, and it fought for the first incarnation in order to protect its high-priority access to your ISP.”

Richard continued: “Now that we’re in a second phase that’s all about empowering P2P, Google has been much less vocal, because it can only lose in this fight. Good P2P takes Google out of the video game, as there’s no way for them to insert advertising into P2P streams. So this is why they want P2P to suck. The new tools will simply try to convince consumers to stick with Google and leave that raunchy old P2P to the pirates.”

It’s much more balanced and diligent coverage than the article in The Register.

Sweetness and Light

Cade Metz reminds us that Google is the most virtuous collection of people on Earth in this love-letter in The Register:

“This side of the argument said: We were pretty well known on the internet. We were pretty popular. We had some funds available. We could essentially buy prioritization that would ensure we would be the search engine used by everybody. We would come out fine – a non-neutral world would be a good world for us.”

But then that Google idealism kicked in.

Continue reading “Sweetness and Light”