Ascendro Blog


Website Optimization - A practical approach

Even though bandwidth keeps increasing, website optimization is a bigger issue than ever. Mobile devices don't benefit from modern fiber lines, and web applications are getting more complex and require more and more resources to load.

In previous articles we already talked about performance improvements in PHP and JavaScript, but what if they are already fast?

It is very common, especially on simple sites, that most of the time is spent not on processing on the server but on the data transfer itself: connecting to the server, downloading the HTML, downloading JavaScript and CSS resources, downloading images...
Additionally, loaded resources may load further resources that could not be requested earlier (because they were unknown), making the load take even longer.

There are a bunch of books, articles and tutorials out there describing possible improvements, as well as tools for analyzing them, but going through all the items takes time.

Two examples: http://gtmetrix.com/ and https://developers.google.com/speed/.

Most of the items are obvious; some may involve technical features you didn't know about. But one thing is for sure: it will take a good amount of time to implement all of them. Besides know-how across multiple fields, you need a good set of analysis tools and patience.

For the analysis you shouldn't rely only on web services like gtmetrix.com and Google's PageSpeed Insights but also use your own browser's tools like Firebug (PageSpeed Insights is additionally available as a browser plugin, giving valuable recommendations). Web services give you the possibility to test performance from different locations, but only in a local environment can you simulate different scenarios, for example disabling features like gzip compression or limiting your own bandwidth.

The easy way

If you have access to the web server, there is one simple thing you can do: install mod_pagespeed.

The basic installation can be described in four steps:

Depending on your web application, not all filters can be applied. Some will break your code, and others simply have a negative impact on your performance. Since our own website (ascendro.de) requires JavaScript in order to work, we had to disable the deferring of JavaScript loading to get better performance.
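As a sketch of how switching off such a filter can look in the Apache configuration (directive and filter names as documented for mod_pagespeed; adjust to your own setup):

```apache
<IfModule pagespeed_module>
    # Enable mod_pagespeed with its default filter set ...
    ModPagespeed on
    # ... but leave JavaScript loading untouched, since the page depends on it
    ModPagespeedDisableFilters defer_javascript
</IfModule>
```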

This module is very easy to set up and implements most of the recommended standard approaches for improving your performance. Our website took 2.23 seconds to load and only 888 ms after the module was installed and configured: a great improvement without much effort!

Performance optimization through mod_pagespeed

Clearly visible are the critical path of dependent items, how it gets flattened out, and the drastic decrease in file sizes.

The manual way

If you cannot install custom modules, if you don't want to use such a tool because it adds to the complexity of your application, or if you want everything optimized from the beginning (without real-time generation and caching), you need to implement everything manually.

To stay efficient here as well, you should start with the most promising items:

1. Enabling gzip transfer:

Doable via the (virtual) server configuration or a .htaccess file:

<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript text/javascript
</IfModule>

More information

2. Using browser caching:

By defining expiry dates in your server/.htaccess configuration, the browser will keep resources in its cache (note: modern browsers do some caching anyway). This prevents resources from being reloaded.

<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresDefault "access plus 2 days"
</IfModule>

More information, or even more
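A single default for everything is a blunt instrument; mod_expires also allows per-type lifetimes, so rarely changing assets can be cached longer than dynamic pages. A sketch (the concrete lifetimes are illustrative, not a recommendation):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # long-lived static assets
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    # everything else falls back to a short default
    ExpiresDefault "access plus 2 days"
</IfModule>
```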

3. Improve your HTML:

Loading CSS styles before you load JS gives the browser the possibility to apply that CSS before the JS has finished loading.

Putting small scripts and CSS inline in your HTML document, or combining them with other stylesheets or JS files, will reduce the number of parallel requests needed.

More information

4. Minifying contents

If you already use tools for automatically deploying your project to the production environment, this step is a valuable one-time investment. If not, it is still something you should seriously consider.

Removing unused parts of JavaScript and CSS, as well as minifying the contents, will drastically decrease the number of bytes that need to be sent over the network.
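To illustrate what minification does, here is the same function before and after (hand-minified for the example, not actual closure-compiler output); the behavior is identical, only the bytes on the wire shrink:

```javascript
// Before: readable source, roughly 170 bytes
function calculateTotalPrice(unitPrice, quantity) {
    // multiply and round to two decimal places
    var total = unitPrice * quantity;
    return Math.round(total * 100) / 100;
}

// After: minified equivalent, under 50 bytes; comments, whitespace and
// long identifier names are gone, the computation is unchanged
function c(a,b){return Math.round(a*b*100)/100}
```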

For minifying and combining JavaScript we can recommend the open-source tool closure-compiler. As a Java program with a simple command-line interface, it can easily be included in any process.

Similar to the JS side, there is closure-stylesheets for CSS, though we haven't used it yet, as our designers are currently researching other CSS improvement tools (they wrote about it here).

Pictures can also be improved drastically by using a better format and compression. "Save as PNG" is not everything you need to know: there are various settings to reduce sizes, even if you have already chosen the best image format.
Two valuable programs are OptiPNG and jpegoptim, which provide lossless optimization for the PNG and JPG formats (also helpful for quick optimization on the run: http://tinypng.org/).

More and more

The list could go on for a long time, but it is not the scope of this article to explain how to implement every optimization. The listed items are the easiest to implement and should give you the most bang for the buck.

If you want a more detailed list, you should read the Best Practices on Google Developers.

I did everything but my site is still not fast enough!

We have already applied the simple and most common optimizations but are still not satisfied. Before we now go blindly through a huge list of possible performance improvements, we should analyze why our page is slow. This prevents us from wasting time on items which may not improve anything or may even increase loading times!

As an example we use our blog, which also got a good boost from the Apache module:

Even though there is a nice improvement, the load times are still unsatisfying.

To find out what causes these times, we need to look at the complete loading tree and figure out which items depend on each other (generating a too-long loading path) and which items could be improved or loaded differently.

You can clearly see that the base page is loaded pretty fast; only additional items that are not really important for the page content itself blow up the load time.

These are:

  • Piwik - the statistics tool we use to evaluate our campaigns
  • Twitter buttons - for spreading our blog posts on the web
  • Facebook Like buttons - same as the Twitter buttons

All of them are JavaScript that requests more and more resources after it has loaded. Remember that mod_pagespeed could defer JavaScript until after the page load in order to make the page itself faster; we disabled that functionality because our page depends heavily on its JavaScript.

But now we can see what exactly is inflating the page load time on our blog so drastically. The solution: defer unimportant JavaScript like social buttons and statistics.

Piwik already offers a special asynchronous version of its tracking code (see here).
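The key idea of that asynchronous code is a global command queue: tracking calls are pushed into a plain array before piwik.js has even loaded, and the tracker replays them once it arrives. A trimmed sketch (the tracker URL and site id are placeholders for your own installation):

```javascript
// Commands are queued in the global _paq array; piwik.js replays them later.
var _paq = _paq || [];
_paq.push(['setTrackerUrl', '//example.com/piwik/piwik.php']); // placeholder URL
_paq.push(['setSiteId', 1]);                                   // placeholder site id
_paq.push(['trackPageView']);
// piwik.js itself is then injected via an async <script> element (see the
// code linked above), so tracking never blocks the initial page render.
```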

The Facebook buttons were already implemented to load after the initial page load (thanks to a Yii plugin we used: faceplugs). The Twitter buttons, on the other hand, were copied directly from Twitter's example code, and each contained its own request to load widget.js.

By simply deleting these requests and placing a single one at the bottom of the page, we now load widget.js only once, which already decreases the page load.

Using template code from Google Developers, we then proceeded to put all non-essential JavaScript calls into an asynchronous call executed after the page itself has loaded.
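The pattern boils down to injecting a script element from the window's load handler. A minimal sketch of the idea (the URL is a placeholder, and the document object is passed in explicitly so the helper can be exercised outside a browser):

```javascript
// Inject a <script> element so the resource loads without blocking rendering.
function loadScriptDeferred(doc, src) {
    var script = doc.createElement('script');
    script.async = true;
    script.src = src;
    doc.body.appendChild(script);
    return script;
}

// In the browser: run only after the page (including images and CSS) has
// finished loading, so the deferred script never delays the first render.
if (typeof window !== 'undefined') {
    window.addEventListener('load', function () {
        loadScriptDeferred(document, '//platform.twitter.com/widgets.js');
    });
}
```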

The results were very satisfying:

Even though the requests still take 4 seconds to be handled in total, the page itself is loaded within 1 second and can be used (without the Facebook and Twitter buttons at that point, though).

Summary

Before going into an expensive in-depth analysis of how to improve website performance, a simple set of possible improvements can be applied immediately. After that, a good analysis can prevent you from wasting time on specific items which may not result in an improvement or may even increase loading times.

Deferred JavaScript triggers the onload event earlier, but the JavaScript resources still need to be loaded and the related actions performed. If JavaScript is a major part of the page's appearance, it should not be deferred.

It will increase your total page load time but decrease the subjective load time felt by the user, which is the important measurement here.

Comments

Michael says:

Yes, sure: I talked about this mod in the first section of this article.

chrlvclaudiu says:

Nice article! One question though: have you ever tried Google's pagespeed_mod module for Apache? It would be nice to see some benchmarks with this module.