Channel: Rick Huby dot com » Rick Huby

8 Quick wins for improving page speed


It’s been stated many times before that you only have a small window of opportunity to capture a user’s attention when they visit your website. One factor in grabbing that user and holding on to them is ensuring the page they land on loads quickly.

Page download speed is also a factor in Google’s ranking algorithm, so good website performance is now more important than ever.

Most websites can be improved with a few minor tweaks and changes. To this end I’ve compiled a small list of quick fixes to improve your website’s performance.

To help with testing your website I would recommend using Firefox and installing the YSlow and Google Page Speed tools.

List of website performance improvements

  1. Combine CSS and JavaScript files
  2. Minify HTML page and CSS and JavaScript files
  3. GZip/Deflate plain text files
  4. Ensure JavaScript is at the bottom of the page
  5. Remove inline CSS and JavaScript
  6. Minimise external lookups (DNS)
  7. Reduce image sizes (without reducing image quality)
  8. Use image sprites where possible

1. Combine CSS and JavaScript files

What’s the issue?

Every file the user downloads comes with a performance overhead: each one requires a separate HTTP request, which takes time, and if your website uses cookies they are all sent along with every request.

How do I improve this?

Most CSS files can be combined into one master CSS document, as can most JavaScript files.

This can be done manually by copying and pasting the contents of your CSS or JavaScript files into one master file for each (master.css, master.js).

You may prefer to keep some files separate for ease of long term maintenance. If so, you can install tools such as minify (PHP), Combres (.NET) or JAWR (Java) which can be configured to automatically combine your resources from multiple sources. Alternatively you can build deploy scripts that combine the resources as part of a deployment process.
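The simplest version of such a deploy step is just concatenation. A minimal sketch (the file names here are placeholders, and the input files are generated inline purely so the example is self-contained):

```shell
# Create two small sample stylesheets standing in for real source files.
printf 'body { margin: 0; padding: 0; }\n' > reset.css
printf 'h1 { font-size: 2em; }\n' > layout.css

# Combine them into one master file: one HTTP request instead of two.
# Order matters, as later CSS rules override earlier ones.
cat reset.css layout.css > master.css
```

The same approach works for JavaScript files, though you should check that each file ends cleanly (e.g. with a semicolon) before relying on plain concatenation.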


2. Minify HTML page and CSS and JavaScript files

What’s the issue?

HTML, JavaScript and CSS files written by hand include a lot of white space to keep them readable and maintainable by humans.

Computers do not need most of this white space to interpret the code, so it is redundant and just adds to the size of the file being downloaded.

How do I improve this?

The previously mentioned Minify tool does have a facility for minifying the HTML page, as well as CSS and JavaScript files prior to output. This is only available for PHP.

For minifying CSS and JavaScript files on the .NET platform you can use Combres. There is also a port of the YUI Compressor library. Java has the JAWR tool which is very similar to Combres.
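As a small illustration (hand-minified here, not the exact output of any particular tool), here is the same rule before and after minification:

```css
/* Hand-written, readable: */
.header {
    color: #333333;
    margin: 0 auto;
}

/* Minified: white space (and redundancy in the colour value) removed */
.header{color:#333;margin:0 auto}
```

Across a whole stylesheet those savings add up, and they stack with the compression described in the next point.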


3. Compress plain text files for download

What’s the issue?

By default, HTML, JavaScript and CSS files are uncompressed. The pure text that is written is what is downloaded.

Plain text can, however, typically be compressed to around 30% of its original size (a saving of roughly 70%) and then decompressed on the browser side.

How do I improve this?

There are two common ways of compressing this content: GZip and Deflate.

Both do essentially the same job. In the examples below I have restricted compression to text-based files, as compressing items that are already compressed (typically video, music and images) is a waste of time and server resources.
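You can see the scale of the saving for yourself with the standard gzip command. A rough sketch (page.html is generated inline here purely as sample data):

```shell
# Build a sample HTML file out of repetitive markup, the kind of
# content that compresses extremely well.
for i in $(seq 1 100); do
  echo '<p class="example">Repetitive markup like this compresses very well.</p>'
done > page.html

# Compare the raw size with the gzipped size.
raw=$(wc -c < page.html)
gzipped=$(gzip -c page.html | wc -c)
echo "raw: ${raw} bytes, gzipped: ${gzipped} bytes"
```

Real pages are less repetitive than this sample, so expect a smaller (but still substantial) saving in practice.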

Deflate in Apache

Assuming that mod_deflate is enabled, add the following line to your .htaccess file or in the httpd.conf.

AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/x-javascript text/xml

If mod_deflate is not enabled and you have root access to your server you can try this resource: http://www.howtoforge.com/apache2_mod_deflate

GZip in Apache

For an authoritative look at GZip in Apache I’d recommend Uncle Google, as I’ve never looked at this on Apache myself; however, from reading around the subject I believe that Deflate is marginally quicker.

GZip/Deflate in IIS

If you use Combres (mentioned in point 1), it can deal with the compression of CSS and JavaScript files for you.

If you want to manually deal with this then you are going to need to change some IIS config files. The Small Workarounds website has a good article on GZip in IIS.

Many minification tools such as Minify, Combres and JAWR can be configured to GZIP the contents themselves instead of getting the server to do it.


4. Ensure JavaScript is at the bottom of the page

What’s the issue?

Browsers request more than one file at a time; by the HTTP 1.1 specification, two at a time per hostname. You can potentially speed up your page load by splitting images, CSS and other resources across multiple domains (e.g. requesting images from images.example.com and CSS from files.example.com, but the page from www.example.com).

However, when JavaScript files are downloading, nothing else is brought down in parallel (even on different hostnames).

How do I improve this?

Simply putting all your JavaScript requests at the bottom of the HTML document, directly before the closing </body> tag, will minimise the impact, as all of your content can load ahead of the JavaScript functionality.

Some scripts may need to be loaded earlier, but this will need to be a judgement call on the part of the developer.
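A minimal skeleton of this layout (the file names are placeholders):

```html
<html>
<head>
    <title>Example page</title>
    <!-- Stylesheets in the head so the page can render as it loads -->
    <link rel="stylesheet" type="text/css" href="master.css" />
</head>
<body>
    <p>All of the page content downloads and renders first.</p>

    <!-- Scripts last, directly before the closing body tag -->
    <script type="text/javascript" src="master.js"></script>
</body>
</html>
```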


5. Remove inline CSS and JavaScript

What’s the issue?

CSS and JavaScript can be embedded within the actual HTML page (as opposed to being linked to an external file).

This bloats the HTML page size and reduces the browser’s ability to cache common style and script information, as the CSS and JavaScript will be downloaded with each page request. Using external files allows the browser to download the information once and reuse it on every page that needs it.

How do I improve this?

Simply ensure that all JavaScript and CSS content is stored within external files.

Ensure you check any third party libraries and widgets which you may include in your page as these often have inline CSS and JavaScript snippets to be included in the page. These can normally be refactored into external files with a little CSS and JavaScript knowledge.
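As a small sketch of the refactor (the class name and file name are made up for illustration):

```html
<!-- Before: the style travels with every page download -->
<div style="color: #c00; font-weight: bold;">Special offer!</div>

<!-- After: the rule lives in a cached external stylesheet -->
<div class="offer">Special offer!</div>
```

with `.offer { color: #c00; font-weight: bold; }` moved into master.css, which the browser downloads once and caches.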


6. Minimise external lookups (DNS)

What’s the issue?

There is a time overhead in looking up DNS entries which can add significantly to the page load time. Every new hostname encountered requires a lookup to resolve it to an IP address; according to Yahoo!, this adds 20 to 120 milliseconds per hostname to the page load.

How do I improve this?

Simply reducing the number of domains referenced for page resources (images, CSS, JavaScript, etc.) will improve your score here. There is a caveat, though: referencing files from different domains allows the browser to download more files concurrently, while each extra domain adds a lookup overhead. It really is a balancing act that should be reviewed for each project.

An additional topic to consider is Content Delivery Networks (CDNs), but that’s probably for another post.


7. Reduce image file size (without reducing quality)

What’s the issue?

Some image formats, especially PNG, retain a lot of information that is not required to render the image itself.

This data can be removed, leaving the image file smaller but the image exactly the same as it was (also known as lossless compression).

How do I improve this?

YSlow itself comes with a handy link to the Smush.it website, where you can compress your images in this lossless fashion.

There are also tools for doing this on your local machine, some of which you can configure as part of a deployment script.

For WordPress, Joomla and Drupal websites there are also plugins which can be installed to do this for all uploaded images.


8. Use image sprites where possible

What’s the issue?

As mentioned previously, every file request comes with a performance overhead which adds up to a slower web page. Most pages are made up of many images which means there can be a significant overhead in requests. Cookies are sent along with every single request, so for any website with lots of cookies and lots of images you can quickly add a lot of data to each page load.

Sprites are derived from computer game animation: one image file contains all the individual frames of an animation, a visible area displays part of the overall image, and the image is moved to create the animation effect.

In web terms we are less concerned with the animation (though this technique is used for some animations) than with reducing multiple images to one request.

How do I improve this?

CSS allows us to use background images; in many cases these background images can be combined into a single file and positioned so that only a small portion of the overall image shows.

By reducing a page’s image requests down to one or two, you will speed up performance.

Montage of sprited elements from travelsupermarket.com, all taken from one sprited image.

This example of a sprite reduced the number of requests from 36 to 2 requests and improved performance significantly.

You may ask why this was reduced to 2 requests and not 1. This is because there is a second image file used on the website: a 1px square transparent gif. This is used in a number of places as an HTML image; the sprite is then applied as a background image on the transparent gif.

Example CSS definitions

/* Every sprited element shares the one background image */
.sprite {
	background: url('sprite.png') no-repeat;
}

/* Shift the background so the relevant part of sprite.png shows through */
.someItem {
	background-position: 0 -100px;
}

.someOtherItem {
	background-position: -50px -200px;
}

Example HTML usage
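Using the classes defined above with a hypothetical transparent gif (trans.gif standing in for the real file name), the markup might look like this:

```html
<!-- The transparent gif is the visible element; the sprite shows
     through as its background, positioned by the second class -->
<img src="trans.gif" class="sprite someItem" width="50" height="100" alt="Some item" />
<img src="trans.gif" class="sprite someOtherItem" width="50" height="100" alt="Some other item" />
```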

Using this method we could technically reduce an entire website’s image content down to 2 image requests (the sprite and the transparent gif). In practice this is not realistic, as maintenance and the overall file size of the sprite would eventually render it impractical, but if you always think in terms of sprites you can probably eliminate a large number of requests.

My personal practice is to group things into ‘sprite groups’: for example, all the buttons in one sprite, the navigation in another, one for icons and so on. I would also recommend keeping a Photoshop (or similar) file for each sprite, as it will make maintenance much easier in the future.


In Summary

There are further improvements that can be made, though some require a little more work than the suggestions here. In my experience, the items above tend to be easy to achieve and have a major impact on page performance. If you want to read around the subject, the following topics may be of interest:

  • Use of multiple domains
  • CDN/Content Delivery Networks
  • Caching strategies for your website
