Website Optimization - A Practical Approach
by Ascendro, Nov 13, 2013, updated Jan 27, 2014
Even though bandwidth keeps increasing, website optimization is a bigger issue than ever. Mobile devices don't benefit from modern fiber lines, and web applications grow more complex, requiring more and more resources to load.
Additionally, loaded resources may in turn load further resources which couldn't be requested earlier (because they were unknown), stretching the load time even more.
There are plenty of books, articles and tutorials out there describing possible improvements, as well as tools to analyze them, but going through all the items takes time.
Two examples: http://gtmetrix.com/ and https://developers.google.com/speed/.
Most of the items are obvious, some may involve technical features you didn't know - but one thing is for sure: it will take a good amount of time to implement all of them. Besides requiring know-how across multiple fields, you need a good set of analysis tools and patience.
For the analysis you shouldn't rely only on web services like gtmetrix.com and Google's PageSpeed Insights, but also use your own browser's tools like Firebug (additionally, PageSpeed Insights is available as a browser plugin, giving valuable recommendations). Web services let you test performance from different locations, but only in a local environment can you simulate different scenarios - for example disabling features like gzip compression or limiting your own bandwidth.
The easy way
If you have access to the web server, there is one simple thing you can do: install mod_pagespeed.
The basic installation can be described in four steps:
- Download and install the module
- Check the basic configuration
- Configure the PageSpeed filters
- Fine-tune the filters, checking which ones work and which may even decrease your performance
This module is very easy to set up and implements most of the recommended standard approaches for improving performance. Our website took 2.23 seconds to load and only 888 ms after the module was installed and configured - a great improvement without much effort!
Clearly visible is the critical path of dependent items and how it gets flattened out, as well as the drastic decrease in file sizes.
The manual way
If you can't install custom modules, don't want to add such a tool to your application's complexity, or want everything optimized from the beginning (rather than relying on real-time generation and caching), you need to implement everything manually.
To stay efficient, you should start with the most promising items:
1. Enabling gzip transfer:
This can be done via the VServer configuration or an .htaccess file:
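A minimal .htaccess sketch using Apache's mod_deflate (the MIME-type list is an assumption - adjust it to the content types your site actually serves):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the client;
  # binary formats like JPG/PNG are already compressed and are skipped here.
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```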
2. Using browser caching:
By defining expiry dates in your VServer/.htaccess configuration, the browser will save resources in its cache (note: modern browsers will do some of that anyway). This prevents resources from being reloaded on every visit.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresDefault "access plus 2 days"
</IfModule>
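To cache rarely changing static assets longer than HTML pages, per-type rules can be layered on top; the durations below are illustrative, not a recommendation from the original setup:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images change rarely - cache them for a month
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  # Stylesheets and scripts change more often - one week
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```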
3. Improve your HTML:
Referencing CSS stylesheets before JS gives the browser the chance to apply the styles while the JS is still loading.
Putting small scripts and CSS inline in your HTML document, or combining them with other stylesheets or JS files, reduces the number of requests needed.
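A sketch of both recommendations in one page skeleton (the file names are placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example</title>
  <!-- CSS first, so rendering can start before any script arrives -->
  <link rel="stylesheet" href="combined.min.css">
  <!-- tiny page-specific rules can be inlined to save a request -->
  <style>.hero { margin: 0 auto; }</style>
</head>
<body>
  <p>Content renders before the script below finishes loading.</p>
  <!-- combined JS at the end of the body -->
  <script src="combined.min.js"></script>
</body>
</html>
```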
4. Minifying content
If you already use tools to automatically deploy your project to the production environment, this step is a valuable one-time investment. If not, it is still something you should seriously consider.
Images can also be improved drastically by choosing a better format and compression. "Save as PNG" is not everything you need to know - there are different settings to reduce sizes further, even if you already chose the best image format.
Two valuable programs are OptiPNG and jpegoptim, which provide lossless optimisation for the PNG and JPG formats (also helpful for quick optimisation on the go: http://tinypng.org/).
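A small shell sketch of how the two tools could be run over an image directory; the function name and option values are our assumptions (both tools must be installed, and optimization levels are a trade-off between CPU time and size):

```shell
#!/bin/sh
# Losslessly optimize all PNG and JPG files below a directory.
# optipng -o5 picks a medium optimization level; jpegoptim --strip-all
# removes metadata without recompressing the pixel data.
optimize_images() {
  dir="$1"
  find "$dir" -name '*.png' -exec optipng -o5 {} \;
  find "$dir" \( -name '*.jpg' -o -name '*.jpeg' \) -exec jpegoptim --strip-all {} \;
}
```

Hooking such a script into an automated deployment keeps the optimization a one-time setup cost rather than a recurring manual task.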
More and more
The list could go on for a long time, but it isn't the scope of this article to explain how to implement every possible optimisation. The listed items are the easiest to implement and should give you the most bang for the buck.
If you want a more detailed list, you should read the Best Practices on Google Developers.
I did everything but my site is still not fast enough!
We have already applied the simplest and most common optimisations but are still not satisfied. Before blindly working through a huge list of possible performance improvements, we should analyze why our page is slow. This prevents us from wasting time on items that may not improve anything - or may even increase loading times!
As an example we use our blog, which also got a good boost from the Apache module:
Even though there is a nice improvement, the load times are still unsatisfying.
To find out what causes these times, we need to look at the complete loading tree and figure out which items depend on each other (generating too long a loading path) and which items could be improved or loaded differently.
You can clearly see that the base page loads pretty fast - only additional items, not really important for the page content itself, blow up the load time:
- Piwik - our statistics tool we use to evaluate our campaigns
- Twitter Buttons - for spreading our blog items in the Web2.0
- Facebook Like buttons - same as the Twitter buttons
Piwik already offers a special asynchronous version of its tracking code. (See here)
The Facebook buttons were already implemented to load after the initial page load (thanks to a Yii plugin we used: faceplugs). The Twitter buttons, on the other hand, were copied directly from Twitter's example code, and each one contained its own request to load widget.js.
By simply deleting these duplicated requests and placing a single one at the bottom of the page, widget.js is now loaded only once, which already reduces the page load time.
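The same idea can be sketched generically: a tiny loader (the function and variable names are ours, not Twitter's) that injects a third-party script asynchronously and guarantees it is requested only once, no matter how many buttons the page contains:

```javascript
// Cache of script URLs that have already been injected
var loadedScripts = {};

// Append <script async src="..."> to the page, but only on the first call per URL
function loadScriptOnce(src) {
  if (loadedScripts[src]) {
    return; // already requested - skip the duplicate
  }
  loadedScripts[src] = true;
  var s = document.createElement('script');
  s.async = true; // don't block parsing of the rest of the page
  s.src = src;
  document.body.appendChild(s);
}
```

Each button's markup then stays in place, and a single `loadScriptOnce('https://platform.twitter.com/widgets.js')` call at the bottom of the page fetches the widget script once.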
The results were very satisfying:
Even though all requests still take 4 seconds to complete, the page itself loads within 1 second and is usable (without the Facebook and Twitter buttons at first).
Before going into an expensive in-depth analysis of how to improve website performance, a simple set of possible improvements can be applied immediately. After that, a good analysis prevents you from wasting time on specific items that may not result in an improvement - or may even increase loading time.
Deferring third-party content like this may increase the total measured load time, but it decreases the subjective load time felt by the user - which is the important measurement here.