PageSpeed Insights (PSI) is the most popular speed testing tool out there.
It’s easy to use, has a clean UI, and gives you a ton of suggestions for improvement. On top of that, it’s a Google product, which makes it the go-to testing tool for SEOs, webmasters and marketers.
But there’s a problem:
Users often focus on the wrong things when looking at their PSI results.
In this article, you’ll learn why that is and how to avoid making the same mistake.
The overall optimization score that PSI provides makes things very simple.
Green is good, red and orange aren’t. It’s a powerful heuristic that saves mental energy.
Most users don’t want to deal with the intricacies of web performance, so it's useful to have an overall score.
But like other mental shortcuts, the score doesn’t answer important (and nuanced) questions like “How are real users experiencing my site?”
The score is computed with lab data, which doesn’t come from real users.
Instead, PSI uses a predetermined device and network settings to analyze a page. After that, it gives you an optimization score and suggestions for improvement based on a set of best practices.
At the same time, your website users may have older devices or slower network connections, both of which lead to a different experience.
That’s why it’s easy to misinterpret your result.
A green score looks good, but it doesn’t necessarily translate to a better user experience.
This is where it pays to read the fine print next to the Lab and Field Data sections.
As I said, lab data is collected on a predetermined device and network settings.
This type of data is easy to get, as it doesn’t require real users. It’s also great for debugging and finding problems with specific resources (images, fonts, etc.).
At the same time, lab data might not correlate with how people experience your site.
On the other hand, field data is collected from real users.
In PSI, field data comes from a massive dataset full of metrics for real page loads - the Chrome User Experience Report (CrUX).
Put simply, field data captures load times for users when they visit a page.
This allows you to find the connection between speed, user experience and business metrics like conversions.
That’s why Google also uses field data to determine whether your site passes the Core Web Vitals audit.
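To make this concrete: Google publishes fixed thresholds for each Core Web Vital, and a page passes a metric when the 75th percentile of its field data lands in the “good” range. Here’s a minimal Python sketch of that classification (the threshold values are Google’s published ones at the time of writing):

```python
# Google's published Core Web Vitals thresholds (at the time of writing).
# A page passes a metric when the 75th percentile of real-user
# measurements falls in the "good" range.
THRESHOLDS = {
    "LCP": (2500, 4000),   # Largest Contentful Paint, in ms
    "FID": (100, 300),     # First Input Delay, in ms
    "CLS": (0.1, 0.25),    # Cumulative Layout Shift, unitless
}

def rate(metric, p75_value):
    """Classify a 75th-percentile field value the way PSI does."""
    good, poor = THRESHOLDS[metric]
    if p75_value <= good:
        return "good"
    if p75_value <= poor:
        return "needs improvement"
    return "poor"

# Example: field data for a hypothetical page
print(rate("LCP", 2300))  # good
print(rate("CLS", 0.18))  # needs improvement
print(rate("FID", 450))   # poor
```

Note that the rating depends entirely on what real users measured, not on what a lab test predicts.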
There’s also another problem:
Unlike the optimization score, the field data (Core Web Vitals) assessment is based on data from the previous 28-day period.
As a result, after an optimization you have to wait a few weeks for the assessment to go from red to green. That’s why you should test your site regularly - at least once a month.
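To get a feel for that lag, here’s a small Python sketch of the rolling collection window. The exact boundary dates PSI uses are an assumption here; the point is that today’s assessment reflects roughly the past month of visits:

```python
from datetime import date, timedelta

def crux_window(assessment_date):
    """Approximate the 28-day collection window behind a field-data
    assessment (assumed to end the day before the assessment)."""
    end = assessment_date - timedelta(days=1)
    start = end - timedelta(days=27)  # 28 days inclusive
    return start, end

start, end = crux_window(date(2021, 3, 1))
print(start, end)  # 2021-02-01 2021-02-28
```

So an optimization you ship today only fully “ages into” the assessment about four weeks later.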
The issue here is that users typically expect quick and clear feedback. The optimization score (lab data) provides it, but the field data evaluation doesn’t.
Because of these differences, a page's field data and lab data can differ quite a lot. Philip Walton (engineer at Google) wrote an article on why that happens and how to interpret the discrepancies.
Towards the end, he shares a crucial tip:
Since performance optimization aims to improve the user experience, data gathered directly from users should be your priority.
In some cases, CrUX doesn’t have enough data to accurately represent a website’s performance.
There are also other ways to access field data directly from CrUX:
BigQuery - requires a Google Cloud project and SQL skills;
The CrUX API - requires writing code to send requests and parse the JSON responses.
Unfortunately, both methods are time-consuming and require technical expertise.
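For illustration, querying CrUX through its public API boils down to a single POST request. The sketch below only builds the request body - the endpoint and field names follow the public CrUX API, but treat the details as an approximation and check the official docs before relying on them:

```python
import json

# Sketch: building a CrUX API request body (no network call made here).
# The API accepts either an "origin" (site-wide data) or a "url"
# (page-level data), plus an optional form factor.
API_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(origin, form_factor="PHONE"):
    """Assemble the JSON body for a site-wide CrUX query."""
    return {
        "origin": origin,           # site-wide field data
        "formFactor": form_factor,  # PHONE, DESKTOP, or TABLET
    }

body = build_crux_query("https://example.com")
print(json.dumps(body))
```

The response contains histograms and 75th-percentile values for each metric - the same field data PSI displays.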
An alternative is to use the Core Web Vitals report in Google Search Console. It also contains data from CrUX, i.e., from real users.
While it only shows you information about the three Core Web Vitals metrics, this report is still a great place to find pages with common problems like a bad CLS or LCP.
Another option is to use advanced tools that don’t have Google’s field data but offer other ways to understand your website’s performance.
For example, WebPageTest and GTmetrix only use lab data. However, they let you simulate tests from different locations and on various devices and network settings.
Source - WebPageTest
At the same time, Google Analytics’ “Technology” and “Mobile” reports can show you the users' devices, browsers and network conditions.
By using data from Google Analytics to run personalized tests in WebPageTest or GTmetrix, you can get a better picture of your site’s performance in different scenarios.
DevTools also lets you emulate how slower networks and older devices load your site. You can run a performance audit on a page (right-click → Inspect → Performance → Reload) with a slowed-down CPU and network.
Again, bear in mind that WebPageTest, GTmetrix and DevTools can only give you an approximation, i.e., lab data. Nothing can fully replace field metrics.
Make sure to keep an eye on PSI as your website grows and field data becomes available.
As you can see, testing your site’s performance isn’t so straightforward.
There are numerous ways to test, each one with its pros and cons. If you want a deeper dive, Google has an entire talk on their speed tools:
They also have a detailed infographic on the topic.
On the other hand, if you don’t want to spend hours learning about these tools, just remember this:
You should focus on improving the data collected from real users (field data) when optimizing your site. You should also test your site regularly to make sure visitors are always getting a great experience.
Because we know how crucial field data is, we’re working on a new Telemetry report for NitroPack that will show real-user metrics (incl. Core Web Vitals) for your websites in real time. Once we release this feature, you’ll be able to see how real people are experiencing your website directly in your NitroPack Dashboard.
Our goal is to give you:
A clear view of how visitors experience your site;
Instant feedback after website changes so you can iterate accordingly;
A way to monitor your Core Web Vitals for Google’s upcoming algorithm update.
Our team is currently working hard to perfect this feature. Keep an eye on our blog in the following weeks for more details.