Google's new web protocol SPDY

They shouldn't make SSL mandatory. Other than that it sounds nice, but isn't this really a solution looking for a problem to fix? :) I can't imagine people suffer all that much today from websites loading slowly, unless you're on dial-up. Websites you visit more than once get cached anyway to a greater or lesser extent, cutting down on the waiting even more, and for a first-time visit surely you can afford to wait a few seconds for everything to load?

I'm on ADSL2+, about 1.5Mbit upstream, 12ish Mbit downstream, which is pretty fast but not really close to the fastest you can get, and I can't say I suffer. Heck, I didn't suffer even in the past when I was on standard ADSL with 2Mbit downstream, or even 128kbit up/512kbit down. :)
 
A protocol like that should allow a front-end webserver to handle more concurrent sessions. Google probably cares about something like that.
 
Why not just use .mht files (MIME Encapsulation of Aggregate Documents) if this is an issue?
It's even already supported by a couple of browsers: Opera and IE do it natively, and Firefox can use it via a plugin.
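
For what it's worth, .mht is plain MIME under the hood (RFC 2557, multipart/related), so any MIME library can produce one. A rough Python sketch; the file names and contents are made up for illustration:

[CODE]
# Minimal sketch of what an .mht file is under the hood: a single
# multipart/related MIME message bundling the page and its resources.
# "logo.png" and the URLs are hypothetical.
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

page = MIMEMultipart("related", type="text/html")
page["Content-Location"] = "http://example.com/"

# The root document, followed by each resource it references.
html = MIMEText('<html><body><img src="logo.png"></body></html>', "html")
html["Content-Location"] = "http://example.com/"
page.attach(html)

with open("logo.png", "rb") as f:
    img = MIMEImage(f.read(), "png")
img["Content-Location"] = "http://example.com/logo.png"
page.attach(img)

with open("page.mht", "w") as f:
    f.write(page.as_string())
[/CODE]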

And regarding the name SPDY - the first thing I thought of wasn't "speedy" but "spyed" :devilish:
 
Websites you visit more than once get cached anyway to a greater or lesser extent, cutting down on the waiting even more

Unless care is taken by the server to mark an expiry date on all objects, every cached object still needs to query the server for updates. IE happily ignores this part of HTTP if you set "check for updates" to "automatic", which results in faster, but often incorrect, page loads.
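
To make that round trip concrete, here's roughly what a browser does for such a cached object (a sketch only; www.example.com is just a stand-in host, and the validator headers depend on the server):

[CODE]
# Sketch of the revalidation round trip for a cached object with no
# expiry information.
import http.client

conn = http.client.HTTPConnection("www.example.com")

# First fetch: a full 200 response carrying a validator.
conn.request("GET", "/")
resp = conn.getresponse()
resp.read()  # drain the body before reusing the connection
last_modified = resp.getheader("Last-Modified")

# Later "cache hit": the browser still has to ask whether its copy is
# stale. A 304 Not Modified is tiny, but it costs a round trip per object.
if last_modified:
    conn.request("GET", "/", headers={"If-Modified-Since": last_modified})
    resp = conn.getresponse()
    resp.read()
    print(resp.status)  # 304 if unchanged, 200 with a fresh body otherwise
conn.close()
[/CODE]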

It's not uncommon for pages to have an object count past 30 (JavaScript files, style sheets, images), which results in a massive number of requests. First, requests can only be issued as objects are discovered: some only after parsing the document HTML, some after parsing style sheets (images embedded in styles), and some after parsing and executing scripts (Web 2.0 dynamically loaded content). Second, browsers default to a fixed number of concurrent requests to the same server, so these requests are made more or less sequentially.
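
To put rough numbers on those two effects, here's a back-of-envelope model (a sketch only; the object counts, the 2-connection limit and the latency are illustrative assumptions, not measurements):

[CODE]
# Toy model: resources are discovered in dependency rounds (HTML ->
# styles/scripts -> images referenced by styles), and each round is
# squeezed through a fixed number of connections, one request per RTT.
# Transfer time and TCP handshakes are ignored, so this is a lower bound.
import math

RTT = 0.200        # 200 ms round-trip latency to the server
CONNECTIONS = 2    # classic HTTP/1.1 per-host connection limit

rounds = [("document HTML", 1),
          ("style sheets + scripts", 8),
          ("images discovered in styles/scripts", 24)]

total = 0.0
for name, count in rounds:
    batches = math.ceil(count / CONNECTIONS)  # requests queue behind each other
    total += batches * RTT
    print(f"{name}: {count} objects -> {batches} round trips")

print(f"lower bound on load time: {total:.1f} s")  # ~3.4 s before transfer time
[/CODE]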

If you have low latency to the server you probably won't notice, but if you have 200 ms latency, load time can easily exceed 5 seconds. You can up the number of concurrent requests, but this puts more strain on the servers because more sessions are created and destroyed, and thus it's an unpopular solution. One of Yahoo's guidelines for high-performance webpages is to pack all style sheets into one file (which is contrary to the concept of cascading styles), similarly with script files, and to host images on a number of subdomains, to minimize the total number of requests and maximize requests per second.
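
The packing step itself can be as simple as concatenation. A toy sketch (the css/ directory layout is hypothetical):

[CODE]
# Toy version of the "one style sheet" guideline: merge every CSS file
# into a single bundle so it costs one request instead of many. The sort
# matters: later rules override earlier ones in the cascade, which is
# exactly the caveat mentioned above.
from pathlib import Path

bundle = "\n".join(p.read_text() for p in sorted(Path("css").glob("*.css")))
Path("all.css").write_text(bundle)
[/CODE]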

Encapsulating an entire page into one request is the optimal solution, from both a client and a server perspective. You would then only need separate requests for objects external to the website (like ads).

EDIT: .mht indeed seems like the correct solution.

Cheers
 