How does page speed impact SEO?
Google has been pushing for a faster Web for years. While the company says page speed has long been a factor in desktop rankings, it released the “Speed Update” in 2018, making speed a ranking factor for mobile searches as well. Professionals still debate its direct impact, but whatever your goal is, improving page speed is definitely a great idea for SEO, UX and conversions.
When we talk about page speed, several questions come up. Are we talking about speed for your visitors or for the bots that scan your pages? How can you monitor loading times and improve them? And why do better response times improve both the user and the bot experience?
Slow pages impact conversions
Various studies show that slow pages change the way users interact with a website, and there are many examples. Amazon, for instance, stated that “every 100ms of latency cost 1% in sales”, while Google calculated that “if their search results take just four tenths of a second more they could lose 8 million searches per day”.
Back in 2016, Google released a study about the way mobile latency impacts publishers’ revenues, and the findings are striking for slow platforms:
- 53% of mobile site visits are abandoned if pages take longer than 3 seconds to load.
- The average load time for mobile sites is 19 seconds over 3G connections.
- Mobile sites that load in 5 seconds earn up to 2x more mobile ad revenue than sites that load in 19 seconds.
- 1 out of 2 people expect a page to load in less than 2 seconds.
In 2019, Unbounce also produced a report with great insights into how much less likely people are to engage with a website that takes ages to load: 45.4% of respondents said they would probably not make a purchase, and 36.8% that they would probably not visit the site again.
As you can see, improving your loading times by making your pages lighter will improve the user experience as well as your conversion rate. But what about your rankings on Google?
And influence the “Crawl Budget”
What is it?
Whenever Google scans your website, it sets a “limit” on the number of pages it can crawl per day. For instance, Googlebot, the bot the search engine uses to browse the Web, will spend less time on Get Clicks than on Amazon, and it makes sense: if a website is large, popular and frequently updated, Googlebot scans it more often to provide fresher results to users. That “limit” is called the “Crawl Budget”, and page speed impacts it as well.
In order to provide results to the users, search engines need to (mainly) perform three actions:
- Crawl: scan pages.
- Index: save the quality ones in a database.
- Serve: provide them to users when they search for something.
It’s a simplified view of how search engines work, but if a site has blocking points that prevent Google from crawling it (step 1), rankings suffer. It’s a domino effect. Crawling the web is expensive and, like any company, a search engine needs to be profitable, so it allocates the right crawl budget to the right website.
How does Google allocate a crawl budget?
A search engine like Google uses several algorithms to build an index and to rank pages. One of the most well-known algorithms is PageRank, which aims to measure the importance of web pages by calculating the probability that an Internet user lands on them. This algorithm helps Google rank pages, but the main takeaway here is that Google tries to estimate the importance and relevancy of a website, and then allocates a crawl budget accordingly.
Stone Temple published an interview with Matt Cutts (former head of the webspam team at Google) a few years ago, in which he explained that crawl budget was mainly allocated based on PageRank, and therefore on external links.
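To make the idea concrete, here is a minimal sketch of the PageRank calculation on a hypothetical four-page site. The link graph, the damping factor and the iteration count are illustrative assumptions, not Google’s actual parameters.

```python
# Minimal PageRank sketch on a made-up four-page link graph.
# damping=0.85 is the value from the original PageRank paper;
# Google's production values are not public.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Hypothetical site: "home" receives links from every other page,
# while "orphan" receives none.
graph = {
    "home": ["products", "blog"],
    "products": ["home"],
    "blog": ["home", "products"],
    "orphan": ["home"],
}
scores = pagerank(graph)
# Pages with more inbound links end up with a higher score.
```

Under this model, the heavily linked “home” page ends up far more “important” than the orphan page, which is exactly the signal Cutts described feeding into crawl-budget allocation.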
Other factors that can impact the crawling
Other factors can also impact the way Google scans your pages.
- Accessibility: if Google cannot see the content of your pages, or if you prevent it from scanning them, it cannot crawl your site properly.
- Crawl Errors: if you have too many errors (server errors, pages not found, redirect loops…), search engines tend to reduce the crawl budget, or even stop scanning your site, considering it low-value for their users.
- Duplicate Content: if similar pages are accessible from various URLs or websites, search engines will consider them a form of duplicate content and might ignore some of them to focus on the ones most useful to their users.
- And of course, “Loading Times”.
Benefits of the loading time on SEO
Having a faster website has positive impacts on user experience as well as on performance in search results, even though other SEO signals have a stronger direct impact on rankings. That said, how can loading times impact your organic performance?
If you remember the domino effect discussed earlier, you know that if Google has trouble scanning the content of your pages, it might not index them, and therefore they won’t generate organic traffic; slow response times are one of these blocking points. The example below comes from Google Search Console: starting from August, there was a great improvement in the time spent downloading a page, which is quite visible on the charts below.
In July, the time Google spent downloading a page was around 12 seconds (!) while the average number of pages crawled per day was around 20K. In August, the time dropped to 3 seconds, which is still not perfect, yet the number of pages crawled per day rose to 40K.
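A back-of-the-envelope model helps explain those numbers: if the crawler spends a roughly fixed amount of time on your site per day, the number of pages it can fetch scales inversely with the average download time. The daily budget below is a made-up figure for illustration, not anything published by Google, and real crawling is parallel and adaptive rather than strictly sequential.

```python
# Back-of-the-envelope sketch: pages crawled per day under a fixed
# (hypothetical) daily crawl-time budget, as a function of the
# average per-page download time.

def pages_per_day(crawl_seconds_per_day, avg_download_seconds):
    return int(crawl_seconds_per_day / avg_download_seconds)

DAILY_BUDGET = 120_000  # hypothetical seconds of crawl time per day

slow = pages_per_day(DAILY_BUDGET, 12)  # ~12 s per page, as in July
fast = pages_per_day(DAILY_BUDGET, 3)   # ~3 s per page, as in August
# Cutting the download time from 12 s to 3 s quadruples throughput
# under this simplistic model.
```

The real-world figures above (2x more pages for 4x faster downloads) don’t follow the model exactly, which is expected: Google adjusts the budget itself, not just the time per page.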
Good. But what are the benefits?
- More pages are crawled per day.
- Faster indexing of the fresh content.
- Users find fresh content more easily.
- Better allocation of the crawl budget to scan deeper pages.
- Google should better “score” the pages.
- Which should benefit rankings, since the overall quality of the site is better.
How to measure the loading time?
There are several tools to measure loading times; however, only Google Search Console or a log analysis will give you the exact time Google spends crawling a page. If you don’t have access to these tools or data, the sites below will give you a rough idea of the site’s speed for users, along with insights on key improvements to implement.
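Before reaching for any tool, you can get a crude client-side measurement yourself. The sketch below uses only Python’s standard library and spins up a throwaway local server so the example is self-contained; real-world numbers will of course depend on the network and on your actual server.

```python
# Rough client-side timing sketch: measure the wall-clock time needed
# to download a page. This approximates, but is not identical to, what
# crawlers or Search Console report.
import http.server
import threading
import time
import urllib.request

# Serve a small page locally so the example works without internet access.
server = http.server.HTTPServer(
    ("127.0.0.1", 0),  # port 0 = pick any free port
    http.server.SimpleHTTPRequestHandler,
)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

start = time.perf_counter()
with urllib.request.urlopen(url) as response:
    body = response.read()
elapsed = time.perf_counter() - start

print(f"Fetched {len(body)} bytes in {elapsed * 1000:.1f} ms")
server.shutdown()
```

To test a real page, replace `url` with the address you care about and run the measurement several times, since single samples are noisy.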
Google PageSpeed Insights
Google PageSpeed Insights is really useful even if it does not provide the exact loading time. It does provide a score for your mobile and desktop versions, as well as insights on the optimisations that should be implemented. Google Lighthouse, also available as a Google Chrome extension, can be quite useful here too.
PageSpeed Insights analyses several areas, such as the code, page caching and image compression, and lists the tests your page has passed successfully. You might also want to look at the PageSpeed guidelines, under “Loading Performance” in the left-side menu, for details about the analysed data and the way to fix the tests you haven’t passed.
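If you want to automate these checks, PageSpeed Insights also exposes an HTTP API (v5 at the time of writing). Treat the sketch below as an assumption: the endpoint and the response shape reflect the public v5 documentation as I understand it, so double-check them before building on this.

```python
# Sketch of querying the PageSpeed Insights v5 API. The response-parsing
# helper is separated out so it can also be run on saved JSON payloads.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile"):
    """Build the request URL; strategy is 'mobile' or 'desktop'."""
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def performance_score(payload):
    """Extract the 0-100 Lighthouse performance score from a PSI payload
    (assumed path: lighthouseResult -> categories -> performance)."""
    score = payload["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)

def fetch_score(page_url, strategy="mobile"):
    with urllib.request.urlopen(build_psi_url(page_url, strategy)) as resp:
        return performance_score(json.load(resp))

# fetch_score("https://example.com")  # requires network access

# Offline demonstration on a hand-made payload shaped like a real response:
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.87}}}}
mobile_score = performance_score(sample)
```

Running the same script for both strategies gives you the mobile/desktop pair of scores the web interface shows.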
A crawler like Screaming Frog
A crawler is software that simulates the way a bot visits and scans pages. Several tools are available on the market, but the most popular for SEO is probably Screaming Frog. It’s paid software, but the free version lets you scan up to 500 pages, which can be enough for small websites.
As you can see, it gives you the response times on your site as a breakdown of URL counts per second. You can also dig into the tool’s data to spot the pages you should improve first.
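You can approximate that per-second breakdown yourself if you already collect response times, for instance from your own crawl or your server logs. The URLs and timings below are made-up sample data.

```python
# Sketch of the kind of response-time breakdown a crawler like
# Screaming Frog produces: bucket each URL by whole seconds of
# response time, then rank the slowest pages first.
from collections import Counter

def response_time_breakdown(timings):
    """timings: dict of URL -> response time in seconds.
    Returns a Counter mapping whole-second buckets to URL counts."""
    return Counter(int(seconds) for seconds in timings.values())

timings = {
    "/": 0.4,
    "/blog": 0.9,
    "/products": 1.2,
    "/search": 3.8,  # a slow page worth investigating first
}
breakdown = response_time_breakdown(timings)
slowest_first = sorted(timings, key=timings.get, reverse=True)
```

Sorting the slowest pages first mirrors the workflow described above: fix the worst offenders, then re-crawl to measure progress.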
Developer Tools from your browser
Your browser (Chrome, Firefox…) also provides data about the loading time of a page.
Right-click on a page and select “Inspect” to open the console, then click on “Network” and load the page to access a breakdown of all the resources that were loaded. The console also shows the number of requests, the amount of data transferred and the loading time at the bottom. Of course, there is much more information available, and you can refer to the excellent guide provided by Google to learn more about this tool.
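The Network panel can also export everything it recorded as a HAR (HTTP Archive) file, which you can then summarise offline. The sketch below reproduces the console’s bottom-line summary (requests, bytes, time) from a hand-made sample shaped like a real HAR export.

```python
# Summarise a HAR (HTTP Archive) export from the browser's Network panel.
# HAR stores one entry per request, with its duration in milliseconds
# and the response body size in bytes.

def summarize_har(har):
    entries = har["log"]["entries"]
    requests = len(entries)
    transferred = sum(e["response"]["bodySize"] for e in entries)
    # Note: summing per-request times overstates total load time,
    # since browsers fetch resources in parallel.
    total_ms = sum(e["time"] for e in entries)
    return requests, transferred, total_ms

# Hand-made sample mimicking a tiny three-resource page load.
sample_har = {
    "log": {
        "entries": [
            {"time": 310.0, "response": {"bodySize": 14_200}},  # HTML
            {"time": 120.0, "response": {"bodySize": 48_300}},  # CSS
            {"time": 450.0, "response": {"bodySize": 96_500}},  # image
        ]
    }
}
requests, transferred, total_ms = summarize_har(sample_har)
```

To use this on a real page, export the HAR from DevTools (right-click inside the Network panel), load it with `json.load`, and pass it to `summarize_har`.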
Online tools like Web Page Test
There are also plenty of online tools you can use to test your site. Web Page Test is free and lets you select a test location, which helps you understand how the experience differs for users in different cities.
Here the test was run from a Hong Kong server; results might differ from another location.
Web Page Test also provides insights about the loaded resources, just as the Developer Tools do. The great thing is that it gives an easy-to-read overview of the key metrics, like the loading time, the time to first byte, when the page starts to render… and a comparison between the “First View”, which represents a user visiting your site for the first time (cookie-free), and the “Repeat View”, which represents a user who has already visited your page. The site is pretty useful for running tests from a user perspective and has several features to dig into the data. You can also look at its documentation for details on how to use the tool.
It’s time to load faster
As you can see, page speed is a broad topic and technical SEO remains important. Search engines keep pushing for a faster Web by providing tools and projects like AMP (Accelerated Mobile Pages) from Google or MIP (Mobile Instant Pages) from Baidu.
Fast-loading pages are definitely an asset, since they influence both the user experience and your performance in search results, especially the freshness and the share of indexed pages. In a mobile-first world where connections can be slow, avoiding latency should also improve user engagement (bounce rate, average time per visit…).