Page speed is now a (small) ranking factor in Google’s mobile-first index, and thus impacts SEO. It affects organic performance and, more importantly, user experience. Many people overlook it because most websites are already decent in terms of speed; however, it is considered one of the top ten SEO ranking factors.

Speed is important not just for SEO, but also for UX. According to Kissmetrics, if a website takes more than 3 seconds to load, most users will simply leave your site and not look back.

There are many steps you can take to improve page speed (many people search Google for “How can I increase my website speed?”), but a few factors deserve particular attention, especially the cases where speed affects UX but not SEO.


Time to first byte (TTFB)

TTFB (the time between the browser requesting a page and the first byte of the response arriving) is one of the most important speed signals for Google and users alike. If a visitor clicks through to a website and it takes more than a few seconds even to respond, they’ll simply go away.

Additionally, it affects organic performance: a slow-responding website takes time away from Google’s crawling, which it does not appreciate.
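As an illustration, TTFB can be measured by timing how long a server takes to send the first byte of its response. This is a minimal sketch (the throwaway local server and the `measure_ttfb` helper are ours, to keep the example self-contained; in practice you would point this at your own site):

```python
import http.server
import socket
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Time from sending a request to receiving the first response byte."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=10) as sock:
        request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        sock.sendall(request.encode())
        sock.recv(1)  # blocks until the first byte of the response arrives
        elapsed = time.monotonic() - start
    return elapsed

# Spin up a throwaway local server so the example runs anywhere.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```

Against a local server this reads near-zero; against a real site it captures DNS, connection and server processing time combined.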


HTTP/2

HTTP/2 is the successor to the widely used HTTP/1.1. The technology utilises multiplexing, meaning that a webpage’s resources load simultaneously over a single connection rather than consecutively.

This applies to every website, and it massively helps sites that load a lot of separate resources (e.g. lots of images on a webpage), ultimately cutting total load time.

We’ve seen websites cut their total load time by more than 40% with this technology. Given how easy it is to implement and how much faster it can make your website, there’s little reason not to adopt it.

A website can deploy HTTP/2 off-the-shelf with the use of an advanced CDN such as Cloudflare, but even without, it is relatively easy to implement on most web servers.
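On nginx, for example, enabling HTTP/2 is typically a one-word change on the TLS listen directive (the server name and certificate paths below are placeholders):

```nginx
server {
    listen 443 ssl http2;   # "http2" switches TLS connections to HTTP/2
    server_name example.com;

    ssl_certificate     /etc/ssl/certs/example.com.pem;   # placeholder path
    ssl_certificate_key /etc/ssl/private/example.com.key; # placeholder path
}
```

Note that recent nginx releases (1.25+) deprecate this form in favour of a separate `http2 on;` directive inside the server block.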


Content delivery networks (CDN)

People typically ask things like “does Cloudflare speed up my website?”.

Content delivery networks (CDNs) are globally distributed networks of servers that you can put in front of your website.

These massively help speed simply by caching your static assets on their many global servers.

The CDN determines each user’s location and directs them to the closest edge server available.

CDNs can be free, offer many different options covering speed and security amongst others, and there is a wide range to choose from, from AWS to Alibaba.

We recommend Cloudflare, for which we have developed our SEO tool Sloth, which allows you to carry out A/B testing and much more for free (still in closed sign-ups).

Image compression + formats

Many websites now use visuals to deliver content to users, providing a better UX and making the website more captivating.

However, with more visuals usually come more images, especially large ones, and these can leave your webpages with very high load times.

The fix is to compress these images before serving them. Most CMSs provide a tool that does this automatically, meaning you don’t have to do anything yourself. Some more advanced tools convert images to modern formats like WebP, which utilise greater compression technology, meaning smaller file sizes.
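As a sketch of what such a tool does under the hood, the widely used Pillow library can re-encode an image as WebP (the file names are illustrative, and the gradient image stands in for a real photo so the example needs no input file):

```python
from PIL import Image  # third-party library: pip install Pillow
import os

# Build a gradient image in code, in place of a real photo.
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) // 2) for y in range(256) for x in range(256)])

img.save("sample.png")                       # lossless PNG baseline
img.save("sample.webp", "WEBP", quality=80)  # lossy WebP re-encode

png_size = os.path.getsize("sample.png")
webp_size = os.path.getsize("sample.webp")
print(png_size, webp_size)
```

On typical photographic content the WebP output is considerably smaller than the PNG or JPEG original at comparable visual quality.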

Content Compression Technologies

Static files like CSS and JS can be compressed by your server software before being transferred over the internet, using technologies such as GZIP or Google’s Brotli.

Under this compression (assuming the files have also been minified, i.e. stripped of unnecessary whitespace, line breaks and comments), file sizes can shrink by up to 90%, helping reduce total load time.

These techniques only apply to text files like CSS, JS, HTML, SVG etc. and unfortunately do not touch images.
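The effect is easy to demonstrate with Python’s standard gzip module (the CSS snippet below is made up for illustration):

```python
import gzip

# A repetitive CSS-like snippet: text like this compresses extremely well.
css = ".button { color: #fff; background: #0af; padding: 8px; }\n" * 200

raw = css.encode()
compressed = gzip.compress(raw)

ratio = 1 - len(compressed) / len(raw)
print(f"{len(raw)} -> {len(compressed)} bytes ({ratio:.0%} smaller)")
```

Real-world stylesheets and scripts are less repetitive than this, but reductions of 70–90% are still common, which is why servers apply this on the fly.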

DNS prefetch

Before it can start transferring files, the browser must resolve each domain name to an IP address by performing a DNS lookup. This means that every additional domain your pages reference simply adds more time to the total load time.

To combat this, you can try to reduce the number of hosts used, or add a DNS prefetch rule in your website’s <head> to pre-emptively resolve the domain, helping reduce load time.
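A prefetch hint is a single line per external host (the domain below is a placeholder):

```html
<head>
  <!-- Resolve the DNS for an external host before any resource from it is requested -->
  <link rel="dns-prefetch" href="//fonts.example.com">
  <!-- Optionally also open the TCP/TLS connection early -->
  <link rel="preconnect" href="https://fonts.example.com" crossorigin>
</head>
```

`dns-prefetch` only performs the lookup; `preconnect` goes further and establishes the connection, which is worth it for hosts you will definitely fetch from.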

Request number

Each time a new webpage loads, your browser makes a number of requests to different servers: one request for each file loaded.

Having many of these (possibly over 100) can affect the crawlability of your website.

Of course, if your page has a lot of content and uses a lot of external elements for good reason, then that is acceptable.

However, if the problem is sitewide, then search engines may not render all of the requests, which can affect the value of the page.

Additionally, the more requests a page makes, the slower it loads, negatively affecting technical performance.

Reviewing the requests made on your top landing pages is a good place to start. There may be some redundant requests made by the browser, or there could be JS or CSS files which can be combined.
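A rough way to audit this is a static count of resource-fetching tags in a page’s HTML. This simplified sketch ignores CSS imports, fonts and script-initiated requests, and the sample markup is invented:

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Count tags that typically trigger an extra HTTP request."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script", "iframe") and "src" in attrs:
            self.count += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.count += 1

sample_html = """
<html><head>
  <link rel="stylesheet" href="a.css">
  <script src="app.js"></script>
</head><body>
  <img src="hero.jpg"><img src="logo.png">
</body></html>
"""

counter = RequestCounter()
counter.feed(sample_html)
print(counter.count)  # 4 resource requests in this sample page
```

For a real audit, the Network panel in your browser’s developer tools gives the full picture, including requests triggered by scripts.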

User Journey Speed

Speed, as mentioned above, can massively affect UX without even touching SEO.

This, however, only happens in certain circumstances: where the slow requests are triggered by actions that human users perform but Google’s user-agents never do.

For example, a user may want to compare two houses on a house comparison website.

To Googlebot, the entire page has loaded and everything is fine; for the user, however, the API must still return the comparison results, which in some cases can take over a minute.

A wait of this length can severely increase the exit rate and decrease conversions on the website.

Additionally, a user may want to book a hotel but must wait for the API to return the necessary booking information.

There can sometimes be a good chance that the API will take a long time to respond, perhaps around 30 seconds.

Thanks to the impatience the internet has bred in its users, they will most likely get frustrated and simply exit the page.

Ultimately, although your organic performance won’t be affected by slow APIs, your conversion rate will.