For online business owners who want to attract, engage, and convert users, failed Core Web Vitals are a critical warning.
An official benchmark for how enjoyable your website is to real-world users, Google’s Core Web Vitals are a set of three performance metrics: Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS).
Together, these metrics offer a standardized way to gauge a site’s performance based on real user interactions. Passing the assessment tells you that your site loads fast, reacts quickly, and stays visually stable while users interact with it.
So, it comes as a big surprise when Google’s own Analytics seems to be causing failed Core Web Vitals assessments.
Let’s investigate!
Google Analytics works by tracking user interactions on a website and providing insights into user behavior, traffic sources, and conversions. It is implemented on a website by adding a tracking code snippet provided by Google Analytics.
This JavaScript snippet, also known as the Global Site Tag (gtag.js), goes into your site’s <head> section on every webpage you want to track.
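A representative version of Google’s standard snippet looks like this (G-XXXXXXXXXX is a placeholder for your own measurement ID):

```html
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'G-XXXXXXXXXX');
</script>
```

Note that the external script is loaded with the async attribute, which is part of why the snippet itself is so lightweight at load time.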
This code collects data about user interactions, such as page views, clicks, and conversions, and sends it to Google's servers for analysis. Website owners can then access this data through the Google Analytics dashboard to gain insights into their website's performance and user engagement.
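For illustration, once the snippet above is in place, an individual interaction can be reported with a gtag() event call. The event name and parameter below are just examples:

```html
<script>
  // Example only: report a completed sign-up as an analytics event.
  // 'sign_up' and its 'method' parameter are illustrative values.
  gtag('event', 'sign_up', { method: 'email' });
</script>
```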
So far, so good.
Now, let’s go look at what happens under the hood.
Upon closer inspection with the cache disabled and no active performance optimizations, we see that gtag.js is loaded as a single HTTP request weighing a minuscule 111 kB, along with a Microsoft Clarity tag of 769 B.
As far as the initial page load goes, the Google Analytics tracking code behaves as expected and doesn’t contribute to excessive HTTP requests, unused JavaScript, or a blocked main thread.
Where does the misconception stem from, then?
On its own, adding Google Analytics tracking to your website does not put you at risk of failing the Core Web Vitals assessment (or Largest Contentful Paint specifically). This is simply because the snippet placed in the head of your web pages is extremely lightweight and doesn’t block any of the processes vital to rendering a page’s content.
So, why are site owners linking failed Core Web Vitals to Google Analytics?
Our experience shows that there’s still considerable misunderstanding when it comes to reading Google PageSpeed reports, mainly because of how the report has evolved over time.
Let’s clear up the confusion.
After the introduction of Core Web Vitals, the Google team worked hard to shift attention to what we call “field data”—now displayed at the very top of your report as the Core Web Vitals assessment.
It is generated from data about real users interacting with your website, drawn from the CrUX (Chrome User Experience Report) dataset.
Core Web Vitals assessment based on field data for amazon.com
Before Google’s Core Web Vitals became the standard for great user experience, we relied on the Performance Score (measured from 0 to 100)—now displayed after the CWV assessment.
Performance Score based on lab data for amazon.com
The reason it got deprioritized is that it didn’t accurately represent what happens when users land on your website. The Performance score is generated with “lab data” from Lighthouse—in other words, these are the results from a simulated environment.
As you can see from the screenshots above, Amazon is passing the Core Web Vitals with flying colors, but in a controlled environment Total Blocking Time (TBT), Speed Index (SI), and LCP issues are flagged for further improvement. That’s a great way to isolate specific issues and work on optimizing them.
However, at the end of the day, what matters most is how real users experience your website, and that’s where your main focus should go first.
In conclusion, if you’re failing Core Web Vitals, Google Analytics tracking is unlikely to be the reason. Instead, make sure you’re not reading lab results instead of field ones, and give your PSI report another scroll to explore the Opportunities and Diagnostics sections.
In reality, site owners rarely go as far as setting up analytics and calling it a day.
Google Tag Manager and Google AdSense are popular tools for online businesses that want to track specific events and run ads on their websites for extra revenue from incoming traffic.
While Google Analytics itself is not a source for performance issues, our engineers at NitroPack always conduct in-depth analyses to identify the real culprits.
Using our earlier example with kiteworks.com, we see that upon interaction with the home page, a chain of extra event tags (gtm.js) from Tag Manager is fired.
And that’s a lot of extra gtm.js tags, hence the excessive number of HTTP requests.
Since the Google Analytics code loads before everything else, when your website has lots of event tags, you can expect the GA snippet to call all the other gtm.js requests, resulting in loading delays and worsened results in metrics like Total Blocking Time (TBT), Speed Index (SI), and Largest Contentful Paint (LCP).
In your PSI report, such a request chain will be flagged by the “Reduce unused JavaScript” warning:
And if your Google Tag Manager looks anything remotely like this, it’s time to declutter and reorganize:
Your first step is to delay third-party scripts with the async or defer attributes and let them load in the background. These attributes essentially make scripts non-blocking and reduce the overall impact of third-party code.
While similar, these attributes have important differences: an async script is fetched in parallel with HTML parsing and executes as soon as it arrives, potentially interrupting parsing and in no guaranteed order, while a defer script is also fetched in parallel but executes only after the document is fully parsed, in the order it appears on the page.
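Here’s a quick sketch of both attributes in action (the script URLs below are placeholders):

```html
<!-- async: fetched in parallel, executed as soon as the file arrives -->
<script async src="https://example.com/analytics.js"></script>

<!-- defer: fetched in parallel, executed only after HTML parsing completes -->
<script defer src="https://example.com/ui-widget.js"></script>
```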
GTM tags load asynchronously by default, but when there are this many, it’s like having two very long queues of requests: even though they’re all passing through, they can only pass one at a time and will inevitably have to wait their turn.
By optimizing the number of Google Tag Manager events first and deferring them next, you can ensure the initial load doesn’t suffer from unnecessary delays.
When site owners dedicate space to ad units in various formats on a web page, Google AdSense provides a code snippet (HTML/JavaScript) for each ad unit, which is pasted into the HTML of the pages where the ads should appear.
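A typical ad unit snippet looks roughly like this (the publisher and slot IDs are placeholders):

```html
<script async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-XXXXXXXXXXXXXXXX"
        crossorigin="anonymous"></script>
<ins class="adsbygoogle"
     style="display:block"
     data-ad-client="ca-pub-XXXXXXXXXXXXXXXX"
     data-ad-slot="1234567890"
     data-ad-format="auto"></ins>
<script>
  // Requests an ad for the unit above once the AdSense library has loaded.
  (adsbygoogle = window.adsbygoogle || []).push({});
</script>
```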
When a user visits a webpage containing the AdSense ad code, the browser executes the JavaScript provided by AdSense, generating revenue for the site owner per impression.
Unfortunately, because of their render-blocking qualities, AdSense ads can impact site performance (and specifically Web Vitals like LCP and CLS).
With NitroPack, site owners can choose to “Optimize Ads,” which delays the ad JavaScript until a user interaction. But because AdSense earns based on impressions, this might translate into some ad revenue losses.
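To illustrate the general idea, here is a minimal sketch of the delay-until-interaction pattern—not NitroPack’s actual implementation, and the script URL is a placeholder:

```html
<script>
  // Minimal sketch: load a third-party script only after the first
  // user interaction instead of during the initial page load.
  var thirdPartyLoaded = false;

  function loadThirdPartyScript() {
    if (thirdPartyLoaded) return; // shared guard across all listeners
    thirdPartyLoaded = true;
    var s = document.createElement('script');
    s.src = 'https://example.com/third-party.js'; // placeholder URL
    s.async = true;
    document.head.appendChild(s);
  }

  // Any one of these interactions triggers the one-time load.
  ['click', 'scroll', 'keydown', 'touchstart'].forEach(function (type) {
    window.addEventListener(type, loadThirdPartyScript, { once: true, passive: true });
  });
</script>
```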
In this case, based on the behavior of your audiences, you should decide what’s more beneficial for your business:
1) optimal performance for better browsing experiences
or
2) generating as much ad revenue as possible but eventually losing traffic to unstable site behavior.
While some site owners consider self-hosting the Google Analytics script for optimal Core Web Vitals, there is little need to go through with it. Self-hosting adds complexity, cost, and potential limitations compared to relying on Google’s infrastructure. Instead, focus on optimizing your Google Tag Manager events to meet Core Web Vitals standards.
Lora has spent the last 8 years developing content strategies that drive better user experiences for SaaS companies in the CEE region. In collaboration with WordPress subject-matter experts and the 2024 Web Almanac, she helps site owners close the gap between web performance optimization and real-life business results.