Slowing down the web cause we can

A slow web site will tell you how patient you are.

A couple of times a month, I play Call of Duty online. It is fascinating that when you are really into the game, you forget that the people you are playing with are nowhere near you. They could be thousands of miles away, yet playing the game feels instantaneous.

The internet is fast. We complain about slow internet on mobile devices, but even that is over an order of magnitude faster than the dial-up experience. If a connection lags for a second, the gaming experience is ruined. Yet sometimes, when I really need some information, I put up with a website taking over a minute to load it. The importance of the information trumps the amount of time we are willing to wait.

So today, let's see what we can do to slow down a website. There are many stages of communication from the time the user types our website's name in the address bar to when the page is fully loaded and ready to be consumed. Let's start from the first stage and see where we can add fluff to slow things down.

DNS Resolution

DNS resolution is the first thing the browser does before anything else. It takes the name of the website you want to access and sends it to a local DNS nameserver, which responds with the IP address associated with that name. This process is so fast that there is not much we can do to slow it down.

To make this process even faster, modern browsers pre-fetch DNS names as you type the address into the URL bar. So our chances of slowing things down at this level are minimal.

TCP connection with server

When the browser has the IP address of our website, it initiates a TCP connection and sends an HTTP request. If our server is next door, this will be super fast. If the server is very far away, it will take some time for the request to reach us.

To reach us, the client's packets will go through hops. Hops are routers that forward data packets from one to the next until they reach their destination. The fewer the hops, the faster packets reach us. Increasing the number of hops will certainly make us harder to reach. We can increase the number of hops by adding servers of our own that act as routers, similar to the way proxy servers work. We are only limited by how many routing servers we can afford to put along the way.

It doesn't hurt to host these servers in different countries very far from each other.

If you want to see how much this can affect the speed of a website, try to access a popular website through a slow proxy server.
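As a sketch of the idea, here is what one of those extra hops might look like at the application level: a toy PHP relay that forwards a request to the next server in the chain. The function name, the `$nextHop` address, and the per-hop delay are all made up for illustration; this is a sketch of the proxy-chain idea, not a real router.

```php
<?php
// A toy application-level "hop": fetch the page from the next server
// in the chain and pass the bytes along, adding its own latency.
// In practice $nextHop would point at another relay like this one,
// or eventually at the real origin server.
function relay(string $nextHop, int $extraDelaySeconds = 1): string
{
    sleep($extraDelaySeconds);           // every hop costs a little time
    return file_get_contents($nextHop);  // forward the request onward
}

// echo relay('http://next-hop.example.com/page.html'); // hypothetical URL
```

Chain enough of these together, each hosted in a different country, and even a tiny page takes the scenic route.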

Stalling the Response

When a browser sends a request, it expects a response. The ideal server processes the request and issues a response as fast as possible. At this point, we have full control. We can decide to do nothing at all. We can even add this to our code if we want to:

<?php sleep(rand(5, 15)); ?>

We don't want to stall indefinitely, however, or the browser will think there is something wrong with our server and simply time out the connection. Timeout values vary from browser to browser and depend on user settings. So I would say a random delay of 5 to 15 seconds could be annoying enough.

Finally Replying

Even for websites like The Verge, which serves a whopping 12 megabytes, the main content to download is a mere 360KB. Well, that's still a lot. We can make that content much larger without ruining the HTML by simply adding some gibberish as an HTML comment:

<?php echo $content; ?>
<!-- <?php echo $twoMegPictureOfaCatBitmap; ?> -->
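As a minimal, self-contained sketch of that padding idea (the function name and the filler character are made up, assuming we are free to append junk after the real markup):

```php
<?php
// Sketch: pad a page with junk bytes inside an HTML comment.
// The page renders exactly the same; the download just gets heavier.
function pad_with_gibberish(string $html, int $junkBytes): string
{
    $junk = str_repeat('z', $junkBytes); // stand-in for real gibberish
    return $html . "\n<!-- " . $junk . " -->";
}
```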

The content can also be printed in chunks, with a sleep timer on every iteration.
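A sketch of what that could look like; the function name, chunk size, and delay are arbitrary choices, not anything a framework gives you:

```php
<?php
// Sketch: stream the response in small chunks, pausing between writes
// so the page trickles in.
function drip_feed(string $content, int $chunkSize = 1024, int $delayMicroseconds = 250000): void
{
    foreach (str_split($content, $chunkSize) as $chunk) {
        echo $chunk;
        flush();                     // push this chunk out to the client
        usleep($delayMicroseconds);  // ...then make everyone wait
    }
}
```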


Most website makers do their best to make the above processes as fast as possible. On a fast home connection, most websites are reached almost instantly. The real opportunity comes once the content is present on the page. This is where we can be most creative. Because the main content of the website is already there, we can bother the user while they are trying to consume it.

We can add interaction to the website; for that we need JavaScript, and we need lots of it. Loading scripts from a CDN certainly improves speed. But a CDN is useless if you still end up with 30 different scripts.

jQuery, Bootstrap, Angular, Yii, Optimizely: all of those have some use. But let's be careful and add all of them to the page, just in case we need them. We'll make use of small bits of their features here and there, just so we are not too wasteful.

bootstrap.js will handle toggling elements, Angular will handle that search form, jQuery is great for selecting elements, and so on. Remember that we will add these to the head of the document, because we want to make sure they are loaded before anything else. We can't afford to let the user see the content before it is functional.
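The head of our document might end up looking something like this (the CDN URLs are placeholders, not real addresses):

```html
<head>
  <!-- all render-blocking, all fetched before the user sees anything -->
  <script src="https://cdn.example.com/jquery.min.js"></script>
  <script src="https://cdn.example.com/angular.min.js"></script>
  <script src="https://cdn.example.com/bootstrap.min.js"></script>
  <!-- ...and 27 more where those came from -->
</head>
```

Without `defer` or `async`, each of those scripts blocks rendering until it is downloaded and executed.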

CSS and fonts should also be loaded at the top. The new trend is to use large images, so let's do the same, and serve them as PNGs, because they are more faithful to the originals.

Not understanding image formats is one of the reasons the web is slow. Even though we learned to move on from bitmaps, some people simply replaced them with PNGs and called it a day.


Advertising is a core part of a website, even when it doesn't generate much money. A small blog that gets 300 to 400 visitors a day doesn't need ads; you will make chump change. But since we are in the business of slowing things down, it doesn't hurt to add a few.

The good thing here is that we don't have to do much. Advertisers seem to have fallen asleep while the web was evolving. Most ads still use document.write to add content to the page, which makes it impossible to load them dynamically at a later time. Even those who decide to move away from it still make use of awful things like nested iframes.

Ads are truly a blessing when it comes to slowing down a page. They do their best not only to slow things down, but also to ruin the experience for the user. Ads that play videos or audio are the worst of them all... I just had to say that.

This might seem like satire to you, but these are things that truly slow down the web. Working as a developer, I have seen some of them implemented on real websites. For example, I have seen a web request made during a web request to fetch static data from a third party; why not just save that data locally? I have seen sleep being used in a web-facing script because "it wasn't OK to tie up all the CPU resources on the machine".

These things are usually done with the best of intentions. Sometimes a developer knows the problems are there but doesn't have the authority to fix them. Sometimes the web is slow not because we want to make it slow, not because we can't make it faster, but because some day in the future, we will get to it.

