Jacques Chester from Clubtroppo started an interesting discussion the other day with his proposal to use Bittorrent as a remedy against the Digg effect. Chester's idea: Add an X-Torrent header to HTTP that points to a torrent of the page in question.

This would enable web browsers with Bittorrent support or Bittorrent plug-ins to download a page from other users if it becomes really popular on Digg. Overnight popularity on sites like Digg or Reddit can often knock out smaller sites and blogs, and a distributed approach like the X-Torrent header could help avoid this so-called Digg effect.
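To make the idea a bit more concrete, here is a minimal sketch of what a server emitting such a header might look like, built on Python's standard http.server. The header name and the torrent URL are assumptions based on the proposal; there is no formal specification.

    # Minimal sketch: serve a page with a hypothetical X-Torrent header that
    # points a Bittorrent-capable browser at a torrent of this very page.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class XTorrentHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"<html><body>Hello, Digg visitors.</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            # Assumed header format: a URL to a torrent of the page itself,
            # so peers can take over delivery during a traffic spike.
            self.send_header("X-Torrent", "http://example.com/torrents/this-page.torrent")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), XTorrentHandler).serve_forever()

A Bittorrent-aware browser could fall back to the swarm whenever the origin server is slow or unreachable, while ordinary browsers would simply ignore the unknown header.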

However, the Pirate Bay's Fredrik Neij, aka Tiamo, thinks the numbers don't add up. Neij did a quick test of the proposed method, downloading a number of web pages and gzipping them to get an idea of how much data we are actually talking about. It turns out that an article from Torrentfreak, including lots of comments and various images, is a little less than 400k, whereas this P2P Blog post adds up to about 24k compressed. From Neij's test results:
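Reproducing this kind of size estimate for the HTML alone (Neij's figures also include images) takes only a few lines; the URL below is a placeholder.

    # Rough size check: how many bytes of a page would actually travel over
    # Bittorrent once compressed. The URL is a placeholder.
    import gzip
    import urllib.request

    url = "http://example.com/some-article"
    html = urllib.request.urlopen(url).read()
    compressed = gzip.compress(html)
    print(f"{url}: {len(html)} bytes raw, {len(compressed)} bytes gzipped")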

"An average announce request will result in a ~500 bytes response, and to complete a download you will contact the tracker at least twice. If you leave the browser open and keep seeding for 2 hours you will talk with the tracker about 6-10 times more per bittorrent file on that tracker. So lets say 10 times on average and that will result in 5kb of data from the tracker. This would result in ~6kb per page (...)."

That doesn't sound like too much data overhead, especially when you're talking about bigger pages like the one from Torrentfreak. However, the whole scenario looks much worse when you consider latency. Users would not only need to connect to the original web server, but also to a server supplying the torrent file, to a tracker, and to other peers before the browser could display the page in question. Neij thinks that's way too complicated:

"This will cause at best a 2-3 second delay, and in worst case minute long waits."

For the record, I suggested a different method to offload Digg traffic to Bittorrent. My idea was to bypass the original server altogether and establish a third-party service that scours Digg for popular URLs, downloads the documents in question, generates torrents and serves these to Bittorrent-equipped browsers. The advantage of such a setup would be that it would still work even if the dugg server went down, but the latency problem would essentially be the same. Neij's comment on both proposals:

"So in the words of my favorite show, As for using a x-torrent header to prevent the digg effect: BUSTED. However, as a way of offloading servers for distribution of large static files i think it's a excellent feature."

I agree: there's definitely some potential in combining browsers and Bittorrent to deal with high-traffic situations, and it does make more sense for large files than for HTML pages.

I still think there is some viability in a third-party approach, and PPLive's PPVA video accelerator seems to prove that it can work, but that doesn't mean that server-based P2P integration doesn't make sense as well. I guess we'll just have to wait and see what people come up with once P2P browser plug-ins are more widely established.
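For the curious, here is a rough sketch of the third-party idea mentioned above. The tracker URL, piece size and target page are placeholders of my own, and a real service would still need to crawl Digg's popular pages, seed the downloaded data and hand the .torrent files to browsers; this only shows the core step of turning a fetched page into torrent metainfo.

    # Sketch: download a popular page and build single-file torrent metainfo
    # for it, so Bittorrent-equipped browsers could fetch the page from peers.
    # Tracker URL, piece size and target URL are placeholder assumptions.
    import hashlib
    import urllib.request

    PIECE_SIZE = 256 * 1024                            # assumed piece length
    TRACKER = "http://tracker.example.com/announce"    # hypothetical tracker

    def bencode(value):
        """Encode ints, strings, bytes, lists and dicts in bencoding."""
        if isinstance(value, int):
            return b"i%de" % value
        if isinstance(value, str):
            value = value.encode("utf-8")
        if isinstance(value, bytes):
            return b"%d:%s" % (len(value), value)
        if isinstance(value, list):
            return b"l" + b"".join(bencode(v) for v in value) + b"e"
        if isinstance(value, dict):
            items = sorted(value.items())
            return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
        raise TypeError("cannot bencode %r" % type(value))

    def make_torrent(url, name):
        """Fetch a document and return .torrent metainfo bytes for it."""
        data = urllib.request.urlopen(url).read()
        pieces = b"".join(
            hashlib.sha1(data[i:i + PIECE_SIZE]).digest()
            for i in range(0, len(data), PIECE_SIZE)
        )
        info = {"name": name, "length": len(data),
                "piece length": PIECE_SIZE, "pieces": pieces}
        return bencode({"announce": TRACKER, "info": info})

    if __name__ == "__main__":
        torrent = make_torrent("http://example.com/popular-article",
                               "popular-article.html")
        with open("popular-article.torrent", "wb") as f:
            f.write(torrent)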
