23rd February 2010
The widespread availability of broadband internet connections creates a strong temptation to become less concerned about site load times, relying on the fact that with a 2Mb+ connection a site will still perform adequately almost regardless of how it's designed. This is a slippery path, though, and avoiding it not only makes for fast sites but also gives a degree of insurance against high traffic loads and server weaknesses.
We use a mixed approach to getting sites to load as quickly as possible. Our techniques fall into two categories: optimised design and sensible caching.
When we're constructing web pages we try to keep the file sizes of all the elements as small as possible. This includes resampling and resizing images so they are a perfect fit and optimised for screen display (as a JPG, GIF or PNG). This means 72dpi and pixel-perfect dimensions for the available space.
We also break up the site content into page templates, design files and content files. This means we can re-use the same files again and again without re-loading them (caching - see below). It also happens to make for very efficient maintenance practices. Each individual file is then small and perfectly formed.
Sometimes when we're using big script libraries or long CSS files we will "minify" them before making them available over the web. This means stripping out the padding, line breaks and other formatting that helps a human to read the code but serves no purpose as far as a web browser is concerned. This process can cut file sizes dramatically; some recent projects have seen savings of 40-60%.
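The idea can be sketched in a few lines of Python. This is a deliberately crude illustration of what a minifier does, not the tool we actually use; the `minify_css` helper and the sample stylesheet are made up for the example.

```python
import re

def minify_css(css: str) -> str:
    """Crude CSS minifier: strips comments, line breaks and the
    surplus whitespace that browsers don't need."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                        # collapse all whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)          # trim around punctuation
    return css.strip()

stylesheet = """
/* main layout styles */
body {
    margin: 0;
    font-family: Arial, sans-serif;
}
"""
small = minify_css(stylesheet)
print(small)  # body{margin:0;font-family:Arial,sans-serif;}
print(f"{len(stylesheet)} bytes -> {len(small)} bytes")
```

Real minifiers are far more careful (strings, URLs and pseudo-selectors all need special handling), but the principle is exactly this: the formatting a human needs is dead weight to the browser.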
Another approach we use to minimise file sizes, particularly on sites with large files and hosting packages that allow it, is a "gzip" process to compress the files before they are transmitted over the web. The files are deflated into much smaller parcels of data which are then unzipped on the viewer's computer. Pages can load 40% faster or more.
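In practice this compression is usually switched on in the web server's configuration rather than in code, but the size saving is easy to demonstrate with Python's standard gzip module. The HTML fragment below is a made-up stand-in for a typical page:

```python
import gzip

# A repetitive HTML fragment, standing in for a typical page:
# markup compresses extremely well because it repeats so much.
page = ("<div class='news-item'><h2>Story</h2><p>Body text...</p></div>\n" * 50).encode("utf-8")

compressed = gzip.compress(page)
saving = 100 * (1 - len(compressed) / len(page))
print(f"{len(page)} bytes -> {len(compressed)} bytes ({saving:.0f}% smaller)")
```

The browser announces that it can accept gzipped content with an `Accept-Encoding: gzip` request header, the server compresses the response, and the browser unzips it transparently, so the viewer never sees anything but a faster page.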
The second part of our optimisation is to use the inbuilt facility within (almost) all browsers to store files on the viewer's computer for re-use later. This is called caching. Wherever we have files that are used repeatedly on the site (e.g. images, stylesheets, script libraries) we tell the viewer's browser to make use of its cache to store and use a local copy of those files. We can set the browser to store these files for months or years. Viewers may choose to clear their cache at any time, but for the duration of a visit this can reduce page load times by up to 90% on sites with lots of reused images like logos and page backgrounds.
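The instruction to the browser is carried in HTTP response headers. How those headers get set depends on the server, but the idea can be sketched in Python; `long_cache_headers` is a hypothetical helper showing the headers a server would send alongside a static file:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def long_cache_headers(days: int = 365) -> dict:
    """Build HTTP headers telling the browser it may reuse its cached
    copy of a static file (image, stylesheet, script) for `days` days."""
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        "Cache-Control": f"public, max-age={days * 86400}",  # lifetime in seconds
        "Expires": format_datetime(expires, usegmt=True),    # HTTP-date fallback
    }

print(long_cache_headers())
```

With headers like these in place, a returning browser doesn't even ask the server for the file again until the lifetime runs out - which is why reused images, stylesheets and scripts cost almost nothing after the first page view.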
The other type of caching we do uses "session variables". These are little bits of information held in the server's memory for the duration of a visit (keyed to a cookie on the viewer's computer) which can be re-used on the site wherever needed. A good example is a site with a long list of database-generated content - recent news, for example. In this case the chances are that the news stories won't be updated while the visitor is looking at the site, so we can interrogate the database once, get the current stories and hold them in the session memory. Whenever this information is needed again we just use the session copy, instead of going back and re-interrogating the database. The savings in time and server performance can be dramatic, especially where a site has lots of database-driven content (like this one, for example). [On this site we use a session variable for the left- and right-hand columns to speed up its performance, saving upwards of 60% on page load times once you've visited more than one page.]
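The pattern looks roughly like this in Python. The names here are all hypothetical: `session` stands in for the per-visitor store a web framework provides, and `fetch_recent_news` stands in for the expensive database query.

```python
def fetch_recent_news():
    # Imagine an expensive database query here.
    return ["Story one", "Story two", "Story three"]

def recent_news(session: dict):
    """Return the news list, hitting the database only on the first
    request of a visit; later pages reuse the session copy."""
    if "recent_news" not in session:
        session["recent_news"] = fetch_recent_news()  # query once, store
    return session["recent_news"]                     # reuse thereafter

session = {}
first = recent_news(session)   # queries the "database"
second = recent_news(session)  # served from the session
print(first is second)         # True: the same cached list both times
```

The trade-off is staleness: the visitor sees the stories as they stood at the start of the visit, which is fine for content that changes slowly but wouldn't suit, say, live stock levels.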
Sometimes hosting providers don't allow us to use some of these approaches (like the gzip functionality), and sometimes viewers' computers aren't configured to make use of them (if caching has been disabled, for example), but on the whole all sites benefit from these approaches to building high-performance websites.
We test our optimisation using Firefox's Firebug and Developer Toolbars with Google's Page Speed analyser. We also use a number of other websites that test performance and suggest additional improvements.
Because many of our sites are complicated mixtures of content management systems, events diaries, galleries and other functions, they demand a lot of the server (database interrogation and data processing, for instance) and have a lot of components to download, so we are always on the lookout for ways to optimise their performance.
The benefits of this performance-tested approach aren't limited to giving the visitor a fast-loading site, but also include reduced bandwidth, reduced server loads, more easily scalable designs, easier maintenance, and a considerable degree of "future-proofing".
If you'd like to talk to us about making your site perform more efficiently then please ring me on 01300 320076 or email firstname.lastname@example.org.