
The science of increasing the speed of your site

By Brian Moloney

February 19, 2008

We have been helping some of our larger clients decrease the time it takes for their Web pages to download. Many of these sites are visually rich and complex, and a significant amount of data must be downloaded before the page appears properly.

While it is true that the modem is all but dead and that broadband has taken over, that does not reduce the importance of lean development and "fast-as-possible" downloading. In fact, broadband opens the door to bad code because it makes it less obvious.

And no, this isn't another "optimize your graphics" post. We'll assume that you already know that a 3MB thumbnail image is bad technique. It's also not a "preload images with JavaScript/CSS" post either.

This post provides some of the more technical resources to really get your Web pages cooking.

Each Web page is a conglomeration of many individual files. Every image is its own file (even tiny spacer images sometimes used for fine-tuning layouts). Every "include" is its own file. If you mouseover a graphical menu item and it changes colors, that's two images per menu item. Web pages can have dozens and dozens (and dozens) of individual files that need to be downloaded to the browser.
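To get a feel for how quickly these files add up, here is a minimal sketch in Python that tallies the separate downloads referenced by a page's HTML, using only the standard library's `html.parser`. The sample markup and file names are hypothetical, and a real audit would also catch CSS background images and other resources this simple parser misses:

```python
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    """Tallies the separate files a browser must fetch for one page."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Images and scripts are fetched via src; stylesheets via href.
        if tag in ("img", "script") and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.resources.append(attrs["href"])

# Hypothetical page: one stylesheet, one script, three images.
page = """
<html><head>
  <link rel="stylesheet" href="site.css">
  <script src="menu.js"></script>
</head><body>
  <img src="logo.gif"><img src="spacer.gif"><img src="nav-home.gif">
</body></html>
"""

counter = ResourceCounter()
counter.feed(page)
print(len(counter.resources), "separate downloads besides the HTML itself")
```

Even this toy page needs five extra round trips; a real graphical menu with rollover states multiplies that quickly.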

It turns out that many browsers limit the number of concurrent downloads per domain name - some to as few as two. In essence, this means that only two files can download at once while the rest wait their turn in a double-file line. Adding just one additional domain name to serve files from can cut page download times in half. The Yahoo! User Interface Blog has a great article about it.
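As a rough illustration of why that second domain helps, here is a back-of-envelope model in Python. The file count, the per-file time, and the two-connection limit are all hypothetical round numbers - real timings depend on latency, file sizes, and the browser:

```python
import math

def download_time(num_files, domains, per_domain_limit=2, secs_per_file=0.1):
    """Crude queueing model: files wait behind a fixed pool of connections.

    Assumes every file takes the same time and connections are always busy;
    real browsers and networks are messier than this.
    """
    connections = domains * per_domain_limit
    # Each "wave" downloads `connections` files in parallel.
    waves = math.ceil(num_files / connections)
    return round(waves * secs_per_file, 3)

# One domain:  40 files / 2 connections = 20 waves -> 2.0 s
print(download_time(40, domains=1))
# Two domains: 40 files / 4 connections = 10 waves -> 1.0 s
print(download_time(40, domains=2))
```

Under these assumptions, doubling the domains doubles the connection pool and halves the total time - which matches the rule of thumb above.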

If you're hardcore technical and want a thorough resource covering all the techniques to really get your pages through the pipes quickly (including keepalives, sprites, gzip compression, and obfuscators), visit Optimizing Page Load Time at
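To see why gzip compression is on that list, here is a quick demonstration using Python's standard `gzip` module. The markup is a made-up example; the point is that repetitive HTML compresses extremely well:

```python
import gzip

# A typical chunk of repetitive HTML markup (hypothetical example).
html = ('<div class="menu-item"><a href="/page"><img src="btn.gif"></a></div>\n'
        * 50).encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

On the server, the same win comes from enabling gzip in the Web server's configuration rather than compressing by hand, so every HTML, CSS, and JavaScript response shrinks before it hits the wire.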


Categories: Web News