How does page speed impact SEO?
If you search the web for the impact of loading time on SEO, you will find different points of view. Some tests state that it has a direct impact; others say the opposite. Whatever your goal is, working on your site’s speed is definitely a great idea for SEO, UX and conversions.
Table of contents
- 1 A key for users and bots
- 2 Crawl budget
- 3 Benefits of the loading time on SEO
- 4 How to measure the loading time?
- 5 It’s time to load faster
A key for users and bots
When we talk about a site’s speed, there are a few sub-topics that can be discussed. Are we talking about speed for users? For bots? Over Wi-Fi on a desktop or over 3G on a smartphone? In this article I will try to give different insights and examples about those different scenarios, but let’s first have a look at some data.
Many studies are available on the web, but DoubleClick released a paper a few days ago about the way mobile latency impacts publisher revenue. Here are some takeaways:
- 53% of mobile site visits are abandoned if pages take longer than 3 seconds to load
- The average load time for mobile sites is 19 seconds over 3G connections
- Mobile sites that load in 5 seconds earn up to 2x more mobile ad revenue than sites that load in 19 seconds
- 1 out of 2 people expect a page to load in less than 2 seconds
A few years ago, Amazon stated that “every 100ms of latency cost 1% in sales” while Google calculated that “if their search results take just four tenths of a second more they could lose 8 million searches per day”.
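To make Amazon’s rule of thumb concrete, here is a back-of-envelope sketch of what a linear latency-to-sales model implies. The function name and the linear assumption are my own illustration, not Amazon’s actual methodology:

```python
def revenue_loss_pct(latency_added_ms, loss_pct_per_100ms=1.0):
    """Estimate the percentage of sales lost from added latency,
    assuming the linear rule of thumb: 100 ms of latency ~ 1% of sales."""
    return (latency_added_ms / 100.0) * loss_pct_per_100ms

# Under this (simplistic) model, 500 ms of extra latency costs ~5% of sales
loss = revenue_loss_pct(500)
```

Real elasticity is of course not linear over large latency ranges, but the sketch shows why even sub-second regressions get taken seriously.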
A site’s speed will also have an impact on your SEO strategy, and a direct one on the crawl budget.
What is it?
Whenever a search engine crawls your site, it allocates a crawl budget. It means that Google sets limits, for each site, on how many pages it will scan. This is a key step because a search engine basically performs 3 actions:
- Crawl: scan the pages of your site
- Index: save them in a database
- Serve: provide them to users when they search for something
It’s a simplified vision of how search engines work, but if a site has blocking points that prevent Google from crawling it (step 1), then it impacts the rankings. It’s like a domino effect.
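The three steps can be sketched with a toy example. The in-memory “web”, the function names and the word-level index below are hypothetical simplifications, nothing like a real search engine’s implementation:

```python
# A toy "web": URL -> page content, standing in for real HTTP fetches
FAKE_WEB = {
    "/home": "welcome to our store",
    "/shoes": "running shoes and trail shoes",
    "/contact": "contact our store team",
}

def crawl(urls):
    """Step 1 - Crawl: scan the pages and collect their content."""
    return {url: FAKE_WEB[url] for url in urls}

def index(pages):
    """Step 2 - Index: save pages in a database (here, a word -> URLs map)."""
    inverted = {}
    for url, text in pages.items():
        for word in text.split():
            inverted.setdefault(word, set()).add(url)
    return inverted

def serve(inverted, query):
    """Step 3 - Serve: return the pages matching a user's query."""
    return sorted(inverted.get(query, set()))

pages = crawl(FAKE_WEB)
idx = index(pages)
results = serve(idx, "shoes")
```

The domino effect is visible even here: a URL that never makes it through `crawl` can never appear in `serve`, no matter how good its content is.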
Google can’t spend the same amount of time crawling a site like Wikipedia, with billions of pages, and a site with a few hundred pages. Besides giving priority to websites that are widely used, crawling costs money and, like every company, a search engine tries to be profitable, so it’s important to allocate the right crawl budget to the right site.
How does Google allocate a crawl budget?
A search engine like Google uses several algorithms to create an index and to rank pages.
One of the most well-known algorithms is PageRank, which aims at understanding the importance of webpages by calculating the probability that an Internet user lands on them. This algorithm helps Google to rank pages, but the main takeaway here is that it tries to calculate the importance of a site, its relevancy, and then allocates a crawl budget.
Stone Temple released an interview with Matt Cutts (former head of the web spam team at Google) a few years ago where he explained that crawl budget was mainly allocated based on PageRank, and therefore on external links.
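As a rough illustration of the idea behind PageRank, here is a minimal power-iteration sketch over a hypothetical three-page site. The graph, damping factor and iteration count are made up for the example; Google’s production algorithm is obviously far more involved:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict: page -> list of outgoing links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outgoing:
                    new[q] += damping * rank[p] / len(outgoing)
        rank = new
    return rank

# A tiny hypothetical site: "blog" links out but receives no links
graph = {"home": ["about"], "about": ["home"], "blog": ["home", "about"]}
ranks = pagerank(graph)
```

The page with no incoming links ends up with the lowest score, which is the intuition behind allocating more crawl budget to well-linked sites.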
Other factors that can impact the crawling
There are also other factors that can impact the way search engines scan pages.
- Accessibility of the pages for search engines: if Google cannot see the content of your pages, or if you prevent it from scanning a page, it cannot crawl your site properly.
- Errors (server errors, not found pages, redirect loops…): if you have too many errors, search engines tend to decrease the crawl budget, or to stop scanning your site.
- Duplicate content: why would a search engine keep on crawling the same content accessible at various URLs?
Loading time can also be a reason and that’s what we will see below.
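To spot the error factor in practice, you can triage the status codes from a crawl export or your server logs. A minimal sketch, assuming a hypothetical list of (url, status) pairs:

```python
def triage(results):
    """Bucket crawled URLs by the kind of problem their status code signals."""
    buckets = {"ok": [], "redirect": [], "not_found": [], "server_error": []}
    for url, status in results:
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)
        elif status == 404:
            buckets["not_found"].append(url)
        elif status >= 500:
            buckets["server_error"].append(url)
    return buckets

# Hypothetical crawl results
report = triage([("/a", 200), ("/old", 301), ("/gone", 404), ("/api", 500)])
```

Large `not_found` or `server_error` buckets are exactly the kind of signal that makes search engines slow down on a site.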
Benefits of the loading time on SEO
I do believe that having a faster website will have positive impacts on user experience as well as on performance in search results, but I would say that there are other SEO signals with a stronger direct impact on rankings.
That said, loading time has a direct effect on crawling. Remember the domino effect I was talking about earlier?
The example below comes from Google Search Console data and, as you can see, starting from August there was a great improvement in terms of time spent downloading a page. It’s quite visible on the charts below.
In July, the time spent downloading a page by Google was around 12 seconds per page (!) while the average number of pages crawled per day was around 20K. In August, the time spent dropped to 3 seconds, which is not perfect yet, but still, the pages crawled per day increased to 40K.
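A simplified way to reason about these numbers: if Googlebot spends a roughly fixed amount of time per day on a site, the pages crawled per day scale inversely with the download time. The daily budget below is a hypothetical figure back-calculated from the July data; note that the observed August figure (40K pages at 3 seconds) stays below what this naive model predicts, a reminder that crawl budget also depends on other signals:

```python
def pages_per_day(daily_crawl_seconds, seconds_per_page):
    """With a fixed daily time budget, crawled pages scale inversely
    with the per-page download time."""
    return daily_crawl_seconds / seconds_per_page

# Hypothetical budget back-calculated from the July figures:
# ~12 s per page * ~20K pages/day = 240K seconds of crawling per day
budget = 12 * 20_000
july = pages_per_day(budget, 12)   # 20,000 pages/day
august = pages_per_day(budget, 3)  # 80,000 pages/day under this naive model
```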
Good. But what are the benefits?
- More pages are crawled per day
- Faster indexing of the fresh content
- Users find fresh content more easily
- Better allocation of the crawl budget to scan deeper pages
- Google should better “score” the pages
- Which should benefit rankings, since the quality of the site is better
How to measure the loading time?
There are several tools to measure loading time; however, only Google Search Console and a log analysis of Googlebot will give you the exact time Google spends crawling a page. If you don’t have access to these tools/data, you can use the sites below to get a rough idea of your site’s speed for users and some insights on key improvements to be implemented.
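If you do have access to server logs, extracting Googlebot’s response times only takes a few lines of code. The sketch below assumes a hypothetical log format whose last field is the response time in milliseconds (e.g. via Apache’s %D or nginx’s $request_time directives, normalised to ms); adapt the regex to your own format:

```python
import re

# Matches: "METHOD /path ..." STATUS SIZE "referer" "user-agent" TIME_MS
LINE = re.compile(
    r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)" (?P<ms>\d+)$'
)

def googlebot_times(log_lines):
    """Return the response times (ms) of requests made by Googlebot."""
    times = []
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            times.append(int(m.group("ms")))
    return times

# Hypothetical sample lines
sample = [
    '66.249.66.1 - - [10/Aug/2016:10:00:00 +0000] "GET /page-1 HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)" 320',
    '10.0.0.5 - - [10/Aug/2016:10:00:02 +0000] "GET /page-2 HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows NT 10.0)" 150',
]
times = googlebot_times(sample)
avg_ms = sum(times) / len(times) if times else 0
```

Run over a full day of logs, the average and distribution of these times are the closest you can get to what Search Console reports as “time spent downloading a page”.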
Google Page Speed
Google Page Speed is provided by Google and is really useful even if it does not provide the exact loading time. However, it provides a score for your mobile and desktop versions as well as insights on the optimisations to be done and details on the resources.
Google Page Speed will analyse several areas like the code, the caching of the pages and the compression of the images, and will mention the tests that your page has passed successfully. You might also want to have a look at the Google Page Speed Guidelines to get details about the analysed data and the way you should fix the tests you haven’t passed.
A crawler like Screaming Frog
A crawler is a piece of software that simulates the way a bot visits and scans pages. There are several tools available on the market, but the most popular for SEO is probably Screaming Frog. It’s a paid tool, but the free version allows you to scan up to 500 pages, which can be enough for small businesses.
As you can see, it will give you the response time on your site with a breakdown per second and the number of URLs. You can also dig into the data of the tool to spot the pages that you should improve first. There are plenty of crawlers, but I have to admit that I mainly use this one, or Botify for bigger projects.
Developer Tools from your browser
Your browser (Chrome, Firefox…) also has a feature to get data about the loading time of a page.
Here you will have to right-click and select “Inspect” to open the console. Then you click on “Network” and open a page to access a breakdown of all the resources that have been loaded. It will also tell you the number of requests, the amount of data transferred and the loading time at the bottom of the console. Of course, there is much more information available, and you can refer to the awesome guide provided by Google to learn more about this tool.
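If you just need a quick number outside the browser, you can also time a request yourself. The sketch below uses only Python’s standard library; note that `urlopen` returns once the headers arrive, so the “time to first byte” here is only an approximation:

```python
import time
import urllib.request

def time_request(url, fetch=None):
    """Measure an approximate time to first byte and total download time.

    `fetch` defaults to urllib but can be swapped out (e.g. for tests).
    """
    fetch = fetch or urllib.request.urlopen
    start = time.perf_counter()
    with fetch(url) as resp:
        headers_done = time.perf_counter()  # rough proxy for time to first byte
        body = resp.read()
    end = time.perf_counter()
    return {
        "ttfb_s": headers_done - start,
        "total_s": end - start,
        "bytes": len(body),
    }

# Example (requires network access):
# stats = time_request("https://example.com/")
# print(f"TTFB ~{stats['ttfb_s']:.3f}s, total {stats['total_s']:.3f}s")
```

This measures a single resource, not a full page render with all its assets, so treat it as a complement to the Network panel rather than a replacement.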
Online tools like Web Page Test
Lastly, there are also plenty of online tools that you can use to test your site. Web Page Test is free and lets you select a test location, which can help you understand the differences between users in different cities.
Here I ran a test with a Hong Kong server; results might have been different with another location.
Web Page Test will also provide you with insights about the loaded resources, just as the Developer Tools do. The great thing is that it provides an easy-to-read overview of the key metrics like the loading time, the time needed after the first byte loads, when the page starts to render… and a comparison between the “First View”, which represents a user visiting your site for the first time (cookie-free), and the “Repeat View”, which represents a user who has already visited your page.
This site is pretty useful to run various tests from a user perspective and has several features to dig into the data.
You can also have a look at the guidelines to have details on the way you could use this tool.
It’s time to load faster
As you can see, page speed is a broad topic and technical SEO remains important. Google keeps pushing for a faster web by providing tools and projects like AMP (Accelerated Mobile Pages).
It’s definitely an asset if your pages load fast, since it will impact your performance in search results, your users and your online sales if you run an e-commerce site. In a mobile-first world where connections might be slower due to the network, avoiding latency should also improve user engagement (bounce rate, average time per visit…).
There might be other great resources that could have been listed here, so if you have any good read, feel free to share it in the comments.