Two weeks ago I wrote about just how important speed is when it comes to your website, and hopefully shared some useful advice on how to build fast sites that users and search engines love, and that deliver on business metrics as a result.

Of course I am not the only person in the world who feels this way. That is one of the reasons why so many online ‘speed test’ tools exist that can run the rule over any given site and give you an idea of how it performs and what might be improved. A few of the more popular examples are Pingdom, Google PageSpeed Insights, GTmetrix and WebPageTest, but there really are dozens to choose from.

There’s nothing intrinsically ‘wrong’ with these tools. At Kooba we work with them every day. But the results they give need to be treated with a decent amount of caution, and certainly need to be interpreted by somebody with a clear understanding of what they can and cannot tell us. Hopefully this short piece will help provide some of that understanding.

The limits of automated speed testing

It doesn’t take long to realise that there are limits to the accuracy and reliability of these tools once you start using them. For one thing, they can deliver wildly divergent results for the same site when accessed from the same location.

Some of this is an issue with definitions. Is a site fully loaded when every last file has been retrieved, or when there’s enough visible on screen for the user to start reading and navigating? Some tools take the former approach, some the latter (I side with the latter camp, but that’s beside the point).

But even allowing for these philosophical differences, the divergence in results can be so significant as to render the headline number almost meaningless. To give one example, I tested aerlingus.com yesterday on Pingdom, WebPageTest and GTmetrix. The results were 3.4, 14.9 and 23.2 seconds respectively.

In a situation like that, I think it’s fair to say you’d get to the ‘truth’ faster by running some real-world tests yourself. Even aside from the issue of consistency between tools, it is worth remembering that speed on paper means nothing if you are not seeing the same results in real life. Field testing always has a role to play in establishing the true user experience when a site is loaded for the first time.
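
For anyone who wants to try that, modern browsers expose real load timings through the standard Navigation Timing API, so a basic field test can be as simple as opening the console on a freshly loaded page. A minimal sketch (the metrics are standard; how you judge the numbers is up to you):

    // Run in the browser console once the page has finished loading.
    // Times are in milliseconds from the start of navigation.
    const [nav] = performance.getEntriesByType('navigation');
    if (nav) {
      console.log('DOM content loaded:', Math.round(nav.domContentLoadedEventEnd), 'ms');
      console.log('Fully loaded:', Math.round(nav.loadEventEnd), 'ms');
    }

Note how this mirrors the definitional split above: ‘DOM content loaded’ is closer to ‘usable on screen’, while ‘fully loaded’ waits for every last file.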

Interpreting the results

Where these tools can help is in providing clear guidance as to what might be slowing down a site, and recommendations for fixing any issues you might have.

Using Pingdom as an example, the report gives some very clear high-level numbers for the total size of the site and the size of images, JavaScript and so on: elements that are often poorly executed and can slow any given site down considerably.

Similarly, Google PageSpeed Insights provides a detailed checklist of potential improvements along with its speed breakdown.

Before getting into the inevitable caveats, it’s important for me to state that these lists and recommendations are useful. I would always recommend that anyone interested in the work of an agency runs some of their sample sites through these engines and takes a quick look at the results. You can learn a lot about how technically competent and sophisticated an organisation is by the quality of its code, which is ultimately what is being measured in most cases.

But at the same time, it has to be understood that the desire for a sea of green ticks can be misguided. It’s vital to realise that these tools all view websites through one particular lens: speed. I don’t have an issue with that; it’s their job, after all. But it’s not the sum total of our job. If speed was all that mattered, websites would look very different indeed.

In other words, if we followed every single piece of advice given to us by Google PageSpeed Insights our clients would desert us in droves. Each individual recommendation has to be considered in the round, and we need to understand what we are giving up when we act on it.

Where to focus our efforts

Online tools are not humans. The advice they give us is based on algorithms and automated tests. It is largely devoid of context. We need to provide that context ourselves. Here’s a little to get you started.

I spoke earlier about how image size and bloated JavaScript are two common challenges when it comes to speed. If a tool like Pingdom suggests an issue in this area, you should usually take it at its word and see what you can do to fix the problem.

In the case of image optimisation, this can be a relatively easy fix. In the case of JavaScript it may require a deeper dive and (unfortunately) might say something about the quality of work on the site to date.
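
As a rough illustration on the image side, you can often spot the worst offenders without any external tool at all. This console sketch flags images whose source file is much wider than the size they are actually displayed at (the 2x threshold is an arbitrary choice for illustration, not an industry rule):

    // Flag images whose natural width far exceeds their rendered width.
    document.querySelectorAll('img').forEach((img) => {
      const rendered = img.getBoundingClientRect().width;
      if (rendered > 0 && img.naturalWidth > rendered * 2) {
        console.log('Oversized:', img.src,
          `(${img.naturalWidth}px natural vs ${Math.round(rendered)}px rendered)`);
      }
    });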

Similarly, if PageSpeed Insights reports an issue with “Server Response Time”, this is also worth following up. It can often indicate an issue with your hosting platform. Resolving this might mean spending a little more money, but if there’s a material speed impact from your current approach then chances are it will be money well spent.
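
Server response time is also easy to sanity-check yourself. A quick sketch using the same Navigation Timing API as above: the gap between the request being sent and the first byte arriving is roughly what these tools are reporting (the 600 ms threshold below is purely illustrative, not an official cut-off):

    // Approximate the 'server response time' figure speed tools report:
    // the delay between sending the request and receiving the first byte.
    const [nav] = performance.getEntriesByType('navigation');
    if (nav) {
      const ttfb = nav.responseStart - nav.requestStart;
      console.log('Server response time:', Math.round(ttfb), 'ms',
        ttfb > 600 ? '(worth investigating)' : '(looks fine)');
    }

Run it a few times and at different times of day; a single slow reading proves very little.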

On the other hand, some common errors can be ignored (as long as you know why you are ignoring them). For example, almost every modern website will fail Google PageSpeed Insights on “Eliminate render-blocking resources”, which essentially means JavaScript or CSS being fetched before above-the-fold content has been delivered. Most organisations rightly judge that the marginal speed gain is not worth the reduction in UX quality.
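
Where you decide a given script genuinely can wait, the usual fix is to load it only once the page has rendered, so it can no longer block first paint. A minimal sketch (the script path is a placeholder, not a real file):

    // Load a non-critical script after the page's load event,
    // so it cannot block the initial render.
    window.addEventListener('load', () => {
      const script = document.createElement('script');
      script.src = '/js/widget.js'; // placeholder path
      script.async = true;
      document.body.appendChild(script);
    });

The same effect can often be achieved declaratively with the defer attribute on the script tag; either way, clearing the warning should be a deliberate choice rather than a reflex.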

Similarly, the use of third-party marketing platforms will usually trigger a host of ‘issues’ on most site speed analysis tools. This perfectly illustrates the need to monitor your site on an ongoing basis while learning to ignore what needs to be ignored.

Some of these tools will be used every day and are non-negotiable. However, many fall out of use but are still gumming up site performance. Use automated tools to run an occasional audit, and apply your own judgement to remove anything no longer in use.
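
The Resource Timing API gives you a quick first pass at such an audit: it lists everything the page actually fetched, so third-party hosts nobody recognises any more tend to stand out. A rough console sketch:

    // Count third-party resources by host, as a starting point for
    // deciding which marketing and analytics tags still earn their keep.
    const counts = {};
    performance.getEntriesByType('resource')
      .map((r) => new URL(r.name).host)
      .filter((host) => host && host !== location.host)
      .forEach((host) => { counts[host] = (counts[host] || 0) + 1; });
    console.table(counts);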

Adopting that general approach will help ensure your site is both fast and functional!

