In May 2020, Google announced the page experience update to their ranking algorithm.
The goal of this update:
Adding more organic ranking signals for user experience. This update's rollout began in June 2021 and was completed by September 2021.
As part of the page experience signals, the Core Web Vitals now affect SEO. These Core Web Vitals are three metrics representing the load time, visual stability and interactivity of a page.
Largest Contentful Paint (LCP) tracks when the largest above-the-fold content element appears;
Cumulative Layout Shift (CLS) shows how much unexpected layout shifts (ads, pop-ups, etc.) affect a page’s visual stability;
First Input Delay (FID) measures how long it takes for the browser to begin processing the first user interaction on a page.
It’s essential to optimize these new metrics for your website’s user experience and organic rankings.
Before diving into the specific techniques, it’s worth mentioning Google's announcements about the update.
Since the initial announcement in May 2020, Google changed and clarified a few things:
Google has confirmed that the page experience update is complete as of September 2021;
Initially, the page experience ranking signal applied only to mobile Search;
The Core Web Vitals are only a part of Google’s page experience signals. The other signals are mobile-friendliness, HTTPS and intrusive interstitials;
You don’t need to reach the “good” threshold for all Core Web Vitals metrics to get a ranking boost. This is a vital point (pun intended), so I recommend watching the whole Web Vitals Q&A for more details;
Google uses only field data (real user data) to determine whether a page passes the Core Web Vitals evaluation. As far as we know, lab data doesn't affect organic rankings;
Google is still testing and improving the new metrics. They've updated CLS to be less dependent on how long users stay on a page. They've also revealed plans for reworking FID and creating a better responsiveness metric.
Google continues to provide information about the Core Web Vitals, so be on the lookout for more details in the following months.
Google released a whole video about the relationship between the Core Web Vitals and SEO:
Here’s the most important quote from the video:
As expected, quality content is still king. At the same time, the Core Web Vitals undoubtedly increase the importance of overall user experience as a ranking factor.
That’s why, if you’re trying to rank in a space where information quality is largely identical, optimizing for the Core Web Vitals can make a big difference. Just remember that nothing can replace having valuable content on your site.
Recently, Google's John Mueller also had this to say:
Put simply, having good Core Web Vitals is about much more than SEO. It's about improving the user experience, which all website owners should try to do.
Recently, Google created new tests, reports and extensions to help analyze Core Web Vitals performance.
The most important of these are:
The updated field data assessment in PageSpeed Insights;
The new report in Google Search Console;
The Core Web Vitals Chrome extension.
Chrome’s DevTools and the Chrome User Experience Report (CrUX) can also help you analyze LCP, CLS and FID.
For now, let’s start with PageSpeed Insights.
Google’s PageSpeed Insights (PSI) provides a Core Web Vitals assessment under the overall optimization score for a page.
This assessment is part of the Field Data report. Field data are provided by the Chrome User Experience Report (CrUX).
This information is collected from real users and is based on what they experience on your website. When it comes to search rankings, Google will use these field results.
The “Diagnostics” section in PSI also provides useful information about elements that affect each of the three metrics:
I’ll cover each one in more detail later in this article.
In addition to real-user metrics, PSI also uses lab data to calculate the overall optimization score and give suggestions for improvements.
While useful, lab data is collected using predetermined device and network settings. Meanwhile, your site’s visitors might be using slower devices and networks. That’s why you shouldn’t use lab data as a proxy for your site’s actual performance.
Now, in some cases, PSI doesn’t provide a field summary.
This problem occurs when the CrUX hasn’t collected enough field data, which is common for small websites. Fortunately, there are other places where we can get our hands on field data.
Google Search Console (GSC) has two new Core Web Vitals reports - one for mobile and one for desktop.
Each report gives you information about the field data for groups of URLs and their performance.
These reports are great for finding common issues across different URLs. That way, you get information about your entire site instead of just one page.
For example, if you have lots of identical product pages where the largest element is an image, the LCP metric will be similar for all of them. In that case, GSC finds LCP problems across all of these product pages.
Also, after fixing any Core Web Vitals problems, you can alert Google by clicking on “Validate fix”.
In short, these new GSC reports are the best way to track Core Web Vitals performance for your entire site.
There are two ways to directly access the CrUX dataset:
The CrUX API - lets you query field data for a specific origin or URL programmatically;
BigQuery - requires a Google Cloud project and SQL skills.
Both require more time and effort than simply running a page through PSI or GSC. However, they also provide more ways to organize and visualize the data. For example, BigQuery lets you slice and join data with other datasets.
If you have the time and tech expertise, it’s worth experimenting with both methods. Check out this guide on Core Web Vitals tracking via GA4, BigQuery and Google Data Studio for a starter example.
For a quick Core Web Vitals check, you can use this Chrome extension.
The extension automatically gives you a short LCP, CLS and FID audit.
After a recent update, the extension has a new UI and provides a lot more useful information. For example, the audit compares the page's performance on your device versus its performance for other users.
Again, the real-user data (i.e., field data) for this extension also comes from the CrUX.
After you’ve measured your site’s Core Web Vitals, it’s time to optimize them.
But a quick disclaimer before we begin:
Each website is different and we can’t possibly cover every potential problem here.
Below you’ll find tried and tested techniques for improving web performance. However, your site can also be affected by factors that aren't discussed here. Always analyze your specific problems before implementing any optimizations.
Largest Contentful Paint (LCP) measures the time it takes for the largest above-the-fold content element to load.
That element can be an image (including a background-image in CSS), a video, or a block of text.
Everything below 2.5s is considered a good LCP score. If the largest above-the-fold element on a page loads faster than that for 75% of all recorded page loads, the page passes the LCP assessment.
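To make the 75th-percentile rule concrete, here’s a minimal sketch of the arithmetic. The sample values, the `percentile` helper and the nearest-rank method are illustrative assumptions, not Google’s actual implementation; the same check applies to the other thresholds (CLS below 0.1, FID below 100ms).

```javascript
// Sketch: does a set of field samples pass a Core Web Vitals threshold
// at the 75th percentile, the cutoff Google uses?
// "Good" thresholds: LCP < 2500 ms, CLS < 0.1, FID < 100 ms.

function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  // Nearest-rank method: index of the p-th percentile value.
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

function passesThreshold(samples, threshold) {
  return percentile(samples, 75) <= threshold;
}

// Hypothetical LCP samples in milliseconds from real page loads.
const lcpSamples = [1800, 2100, 2300, 2400, 3900];
console.log(passesThreshold(lcpSamples, 2500)); // 75th percentile is 2400 -> true
```

Note that one slow outlier (the 3900ms load above) doesn’t fail the page, which is exactly why the 75th percentile is used.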
To find a page’s largest element, right-click on the page, select “Inspect” and open the “Performance” panel in Chrome’s DevTools. Then click on the Reload button and wait for Chrome to analyze the page.
Once the report is ready, you’ll find a small LCP icon in the “Timings” section. When you hover over it, it will paint the largest element of the page blue.
You can also use a waterfall chart to see how many resources were loaded before LCP.
Here’s how that looks for the website above:
From here, you can find resources that cause problems and figure out how to improve their load time.
Here are a few ways to reduce your LCP:
Get a better hosting plan. Fast server response time (TTFB) is essential for site speed. If you’re on a slow, shared hosting server, consider upgrading to a dedicated plan;
Implement Critical CSS. Critical CSS means finding the CSS necessary to load above the fold content and inlining it in the head tag. This technique improves actual and perceived performance;
Optimize images. Often the biggest reason for slow websites, images must be compressed, resized and converted to the right format.
CLS measures the effect of unexpected layout shifts on a page.
Unexpected layout shifts occur when content on the page moves around without user input.
A CLS score below 0.1 means a page is visually stable. If that’s true for 75% of recorded page loads, the page passes the CLS assessment.
To compute your CLS score, Google answers two questions:
How much of the viewport did the shift affect?
How far did the elements move during the shift compared to the viewport?
The overall CLS score is the sum of all individual unexpected layout shift scores. Here's how an unexpected layout shift looks:
PageSpeed Insights can show you which elements contribute to CLS on a page:
Chrome's DevTools also help detect unexpected layout shifts.
Again, right-click on a page you want to analyze and select “Inspect.” Go to “More Tools” and select “Rendering”.
At the bottom, you’ll see a “Layout Shift Regions” option with a checkbox next to it. Select it.
Now every time a layout shift happens, the shifted area will be highlighted.
This CLS generator is also great for finding layout shifts. It computes your overall CLS score and shows shifting areas.
Here are a few optimizations that can reduce layout shifts significantly:
Avoid inserting ads and pop-ups above other content. The GIF above (from creativebloq.com) is a perfect example of why you shouldn’t do this. Inserting content at the top of a page causes everything below to shift, resulting in a bad CLS score;
Add width and height attributes to images and videos. These attributes help the browser allocate the correct amount of space for each element in advance. This reduces layout shifts significantly;
Reserve space for ads, iframes and dynamic content. Similar to images and videos, these elements can also cause layout shifts if they don’t have reserved space. Use containers with proper dimensions and the overflow: hidden property to ensure the content doesn’t overflow its container;
Optimize font delivery. Using link rel="preload" and font-display: optional in combination can prevent layout shifts and flashes of invisible text. Check out this article for more information on how to do that.
FID measures how long it takes for the browser to begin processing the first user interaction on a page.
This metric tracks delays only after discrete actions like clicks or taps. Scrolls and zooms don’t affect FID.
To be in the green zone, a page's FID should be less than 100ms for 75% of all recorded page loads.
Now, FID tracks the delay after only the first input. Why is that?
Well, first impressions are everything on the web. Users instantly leave and, in most cases, don’t return if a website frustrates them on their first visit.
That’s why keeping a low FID is crucial.
FID delays are usually caused by Long Tasks that keep the browser’s Main Thread busy. To find these tasks, open a page you want to analyze, right-click and select “Inspect”. From there, open the “Performance” panel and reload the page. Next, click on “Main” and open the “Bottom-Up” analysis.
You can find Long Tasks (all tasks longer than 50ms) in the “Main” section. These tasks are painted in gray with a red overlay.
The “Bottom-Up” analysis lets you group files by URL and find exactly what causes delays.
In the example above, the first script alone takes 164ms to execute, which is way too long.
As you dive into specific tasks, you might find that some longer functions make up most of the delay. In other cases, functions might be run quickly but adding too many of them together in a single task results in a Long Task.
WebPageTest’s processing breakdown is another great way to find problem areas.
Here’s what you can do to improve your FID:
Break up Long Tasks. As I said, this should be your primary concern. Long Tasks prevent the Main Thread from responding to user interactions in time. By breaking them up, you can significantly improve the performance of your website;
Minify and compress code files. Minification removes unnecessary parts from the code like whitespace and line breaks. Compression also modifies code files, making them smaller in size. Some hosting and CDN providers implement these techniques by default;
Delay or remove non-critical third-party scripts. Third-party scripts can sometimes prevent your own scripts from executing on time. Consider which scripts provide the most value to the user and prioritize them. In most cases, ad and pop-up scripts aren’t at the top of the list;
Use web workers. Web workers allow you to run scripts in the background without affecting the Main Thread. Moving non-UI operations to a background thread is generally a good practice;
Optimize CSS. While JS is the main villain for FID, CSS is also render-blocking by default. That’s why excessive CSS can also hurt the user experience. Besides implementing Critical CSS and minifying and compressing CSS files, it’s also worth reducing the unused CSS on your site.
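To illustrate the first and most important technique, here’s one common way to break up a Long Task: process work in small chunks and yield back to the event loop between chunks so pending user input can be handled. The `processInChunks` helper and the chunk size are my own illustrative choices, not a standard API:

```javascript
// Yielding via setTimeout(0) queues the next chunk as a new task,
// letting any pending input events run in between.
function yieldToMain() {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process `items` in chunks, yielding to the Main Thread after each
// chunk so no single task runs long enough to block user input.
async function processInChunks(items, processItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    await yieldToMain();
  }
  return results;
}

// Example: square 200 numbers in chunks of 50, i.e. four short tasks
// instead of one Long Task.
processInChunks([...Array(200).keys()], (n) => n * n).then((results) =>
  console.log(results.length) // 200
);
```

The total work is the same, but instead of one task that might exceed the 50ms Long Task limit, the browser gets several short tasks with gaps where it can respond to clicks and taps.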
The new Core Web Vitals Technology Report sheds some light on how popular content management systems (CMS) perform with regards to LCP, CLS, and FID.
As of August 2021, Shopify and Wix are the clear winners on mobile and desktop, with Squarespace and especially WordPress lagging behind.
According to the report:
Of course, sample size plays a role here, as the number of tested WordPress origins is 10 to 15 times higher than for the others. Still, it's good to have a general idea of how different CMS sites perform. It's also great to see that almost all of them are trending in the right direction.
The Core Web Vitals are a big step towards making the web better for more users. And as part of Google's ranking algorithm, it looks like these metrics are here to stay.
That’s why you should continuously monitor them, even if you don’t see specific issues right now. On that note, here’s a quick checklist of things to remember going forward:
Google determines if your site passes the Core Web Vitals audit based on the previous 28-day period. That’s why you should test your site at least once a month;
When testing, focus on field data, as it accurately reflects how real users experience your site;
Use PageSpeed Insights to understand how a specific page performs;
On the other hand, use Google Search Console to find common problems in groups of pages;
For more customization and a deeper understanding of your site's performance, try extracting data from the CrUX with BigQuery or the CrUX API.
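If you go the CrUX API route, here’s a sketch of what a request looks like. The endpoint and body shape follow the public CrUX API, but double-check them against Google’s documentation; `YOUR_API_KEY` and `buildCruxRequest` are placeholders of mine:

```javascript
// The CrUX API accepts a POST with an origin (or a single url) and a
// list of metrics, and returns field data including p75 values.
const CRUX_ENDPOINT =
  'https://chromeuserexperience.googleapis.com/v1/records:queryRecord';

function buildCruxRequest(origin, apiKey) {
  return {
    url: `${CRUX_ENDPOINT}?key=${apiKey}`,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      origin, // or pass { url: ... } instead for a single page
      metrics: [
        'largest_contentful_paint',
        'cumulative_layout_shift',
        'first_input_delay',
      ],
    }),
  };
}

const req = buildCruxRequest('https://example.com', 'YOUR_API_KEY');
console.log(JSON.parse(req.body).origin); // "https://example.com"
// Send it with fetch(req.url, req); the 75th-percentile values you
// care about live under each metric in the JSON response.
```

This is handy for the monthly checks mentioned above: a small script can query your key pages and log whether each p75 value is still under its threshold.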
Evgeni writes about site speed and makes sure everything we publish is awesome.