Advanced Core Web Vitals: A Technical SEO Guide

Well, one recent study cited by Google in a blog post about Core Web Vitals found that mobile web users kept their attention on the screen for only 4-8 seconds at a time.

Read that again.

You have less than 8 seconds to deliver interactive content and get a user to complete a task.

Enter Core Web Vitals (CWV). These three metrics are designed to measure site performance in terms of real human experience. The open-source Chromium project announced the metrics in early May 2020, and they were swiftly adopted across Google products.

How do we qualify performance in user-centric measurements?

Is it loading?
Can I interact?
Is it visually stable?
Fundamentally, Core Web Vitals measure how long it takes to complete the script functions needed to paint the above-the-fold content. The arena for these Herculean tasks is a 360 x 640 viewport. It fits right in your pocket!

This war-drum for unaddressed tech debt is a blessing to a lot of product owners and tech SEO professionals whose performance work has been backlogged in favor of new features and shiny baubles.

Is the Page Experience update going to be Mobilegeddon 4.0?

Probably not.

But when your page passes CWV assessments, users are 24% less likely to abandon page loads. These efforts benefit every source and medium, and most importantly – real humans.

The Page Experience Update
For all the buzz, CWV will be just one set of elements in a larger ranking signal. Expected to roll out gradually from mid-June through August 2021, the Page Experience ranking signal will be composed of:

Core Web Vitals:
  Largest Contentful Paint.
  First Input Delay.
  Visual Stability.
Mobile-Friendly.
Safe Browsing.
HTTPS.
No Intrusive Interstitials.
Updated documentation clarifies that the rollout will be gradual and that “sites generally should not expect drastic changes.”

Important things to know about the update:

Page Experience is evaluated per URL.
Page Experience is based on a mobile browser.
AMP is no longer required for Top Stories carousels.
Passing CWV is not a requirement to appear in Top Stories carousels.
A New Page Experience Report in Search Console
Search Console now includes a Page Experience report. The fresh resource includes backdated data for the last 90 days.

In order for a URL to be “Good,” it must meet the following criteria:

The URL has Good status in the Core Web Vitals report.
The URL has no mobile usability issues according to the Mobile Usability report.
The site has no security issues.
The URL is served over HTTPS.
The site has no Ad Experience issues, or the site was not evaluated for Ad Experience.
The new report offers high-level widgets linking to reports for each of the five “Good” criteria.

Workflow for Diagnosing & Actioning CWV Improvements
First, an important caveat regarding Field vs Lab data.

Field Data is performance data collected from real page loads your users are experiencing in the wild. You may also hear Field Data referred to as Real User Monitoring.

Core Web Vitals assessments and the Page Experience Ranking Signal will use Field Data gathered by the Chrome User Experience Report (CrUX).

Which Users Are Part of the Chrome User Experience Report?
CrUX data is aggregated from users who meet three criteria:

The user opted-in to syncing their browsing history.
The user has not set up a Sync passphrase.
The user has usage statistic reporting enabled.
CrUX is your source of truth for Core Web Vitals assessments.

You can access CrUX data using Google Search Console, PageSpeed Insights (page-level), the public Google BigQuery project, or as an origin-level dashboard in Google Data Studio.

Why would you use anything else? Well, CWV Field Data is a restricted set of metrics with limited debugging capabilities and requirements for data availability.

Why Doesn’t My Page Have Data Available From CrUX?

When testing your page, you may see “The Chrome User Experience Report does not have sufficient real-world speed data for this page.”

This is because CrUX data is anonymized. There must be enough page loads to report without the reasonable possibility of the individual user being identified.

Core Web Vitals issues are best identified using field data and then diagnosed/QAed using lab data.

Lab Data allows you to debug performance with end-to-end, deep visibility into UX. It’s called “lab” because this emulated data is collected within a controlled environment with predefined device and network settings.

You can get lab data from PageSpeed Insights, web.dev/measure, Chrome DevTools’ Lighthouse panel, and Chromium-based crawlers such as a local Node.js Lighthouse instance or DeepCrawl.
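
If you want to script that last option, here is a minimal sketch of a local Node.js Lighthouse run. It assumes the lighthouse and chrome-launcher npm packages are installed, and the URL is a placeholder:

```js
// A sketch of a scripted local Lighthouse run (assumes `npm i lighthouse chrome-launcher`).
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const options = { onlyCategories: ['performance'], port: chrome.port };

  // Placeholder URL; swap in one of your example URLs.
  const { lhr } = await lighthouse('https://www.example.com/', options);

  console.log('Performance score:', lhr.categories.performance.score * 100);
  console.log('LCP:', lhr.audits['largest-contentful-paint'].displayValue);
  console.log('CLS:', lhr.audits['cumulative-layout-shift'].displayValue);
  console.log('TBT:', lhr.audits['total-blocking-time'].displayValue);

  await chrome.kill();
})();
```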

Let’s dive into a workflow process.

1. Identify Issues With CrUX Data Grouped by Behavior Patterns in Search Console.
Start with Search Console’s Core Web Vitals report to identify groups of pages that require attention. This data set uses CrUX data and does you the kindness of grouping together example URLs based on behavior patterns.

If you solve the root issue for one page, you’re likely to fix it for all pages sharing that CWV woe. Typically, these issues are shared by a template, CMS instance, or on-page element. GSC does the grouping for you.

Focus on Mobile data, as Google is moving to a Mobile-First Index and CWV is set to affect mobile SERPs. Prioritize your efforts based on the number of URLs impacted.

Core Web Vitals Google search console issue groupings.

Click into an issue to see example URLs that exhibit the same behavior patterns.

Save these example URLs for testing throughout the improvement process.

Core Web Vitals Google search console issue groupings with page examples

2. Use PageSpeed Insights to Marry Field Data With Lab Diagnostics.
Once you’ve identified pages that need work, use PageSpeed Insights (powered by Lighthouse and Chrome UX Report) to diagnose lab and field issues on a page.

Remember that lab tests are one-off emulated tests. One test is not a source of truth or a definitive answer. Test multiple example URLs.
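
If you have a long list of example URLs, the same field and lab data can be pulled in bulk. Here is a minimal sketch using the public PageSpeed Insights API (v5); the URLs are placeholders and it assumes Node 18+ for the built-in fetch:

```js
// Pull field (CrUX) and lab (Lighthouse) data for several example URLs via the
// PageSpeed Insights API v5. An API key is optional at low request volumes.
const API = 'https://pagespeedonline.googleapis.com/pagespeedonline/v5/runPagespeed';

const urls = [
  'https://www.example.com/category/widgets/',
  'https://www.example.com/category/gadgets/',
];

(async () => {
  for (const url of urls) {
    const res = await fetch(`${API}?url=${encodeURIComponent(url)}&strategy=mobile&category=performance`);
    const data = await res.json();

    // Field data: what the CWV assessment is judged on (only present with enough CrUX samples).
    const field = data.loadingExperience && data.loadingExperience.metrics;
    if (field && field.LARGEST_CONTENTFUL_PAINT_MS) {
      console.log(url, 'field p75 LCP (ms):', field.LARGEST_CONTENTFUL_PAINT_MS.percentile);
    }

    // Lab data: the one-off emulated Lighthouse run bundled into the same response.
    console.log(url, 'lab LCP:', data.lighthouseResult.audits['largest-contentful-paint'].displayValue);
  }
})();
```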

PageSpeed Insights can only be used to test publicly available and indexable URLs.

If you’re working on noindex or authenticated pages, CrUX data is available via API or BigQuery. Lab tests should use Lighthouse.
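
For those cases, below is a minimal sketch of a page-level query against the CrUX API. The URL and API key are placeholders, and it assumes Node 18+ for the built-in fetch:

```js
// Query page-level field data from the Chrome UX Report (CrUX) API.
const API_KEY = 'YOUR_API_KEY'; // placeholder

(async () => {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        url: 'https://www.example.com/some-template/', // placeholder page
        formFactor: 'PHONE',
        metrics: ['largest_contentful_paint', 'first_input_delay', 'cumulative_layout_shift'],
      }),
    }
  );
  const { record } = await res.json();

  // p75 is the value the Core Web Vitals assessment is judged against.
  for (const [name, data] of Object.entries(record.metrics)) {
    console.log(name, 'p75:', data.percentiles.p75);
  }
})();
```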

3. Create a Ticket. Do the Development Work.
I encourage you as SEO professionals to be part of the ticket refinement and QA processes.

Development teams typically work in sprints. Each sprint includes a defined set of tickets. Having well-written tickets allows your development team to better size the effort and get the ticket into a sprint.

In your tickets, include:

User Story
Follow a simple format:

As a < type of user/site/etc >, I want < action > in order to < goal >.

E.g.: As a performant site, I want to include inline CSS for node X on page template Y in order to achieve Largest Contentful Paint for this page template in under 2.5 seconds.

Acceptance Criteria
Define when the goal has been achieved. What does “done” mean? E.g.:

Inline any critical-path CSS used for above-the-fold content by including it directly in the <head>.
Critical CSS (read as: that related to node X) appears above JS and CSS resource links in the <head>.

Testing URLs/Strategy
Include the grouped example URLs you copied down from Search Console. Provide a set of steps for QA engineers to follow.
Think about which tool is used, what metric/marker to look for, and the behavior indicating a pass or fail.
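
As one hypothetical example of such a QA step for the critical CSS ticket above, a small Node script could fetch the staging page and check the acceptance criteria directly. The staging URL and the pass/fail logic are illustrative only:

```js
// Check that an inline <style> block appears in the <head> before any
// render-blocking stylesheet links or external scripts.
const https = require('https');

const url = 'https://staging.example.com/page-template-y/'; // placeholder

https.get(url, (res) => {
  let html = '';
  res.on('data', (chunk) => (html += chunk));
  res.on('end', () => {
    const head = html.split(/<\/head>/i)[0];
    const inlineStyle = head.search(/<style[\s>]/i);
    const blocking = head.search(/<link[^>]+rel=["']stylesheet["']|<script[^>]+\ssrc=/i);
    const pass = inlineStyle !== -1 && (blocking === -1 || inlineStyle < blocking);
    console.log(pass
      ? 'PASS: inline critical CSS precedes render-blocking resources'
      : 'FAIL: no inline critical CSS ahead of render-blocking resources');
  });
});
```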

Links to Developer Documentation
Use first-party documentation when available. Please no fluffy blogs. Please? E.g.:

Inline Critical CSS, Web.dev
Extract Critical CSS, Web.dev
4. QA Changes in Staging Environments Using Lighthouse.
Before code is pushed to production, it’s often put in a staging environment for testing. Use Lighthouse (via Chrome DevTools or a local Node.js instance) to measure Core Web Vitals.

If you’re new to testing with Lighthouse, you can learn about ways to test and testing methodology in A Technical SEO Guide to Lighthouse Performance Metrics.

Keep in mind that lower environments typically have fewer resources and will be less performant than production.
Rely on the acceptance criteria to home in on whether the development work completed met the task given.
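
If your team runs automated checks, the acceptance criteria can also be encoded as assertions. Here is a minimal sketch assuming Lighthouse CI (the @lhci/cli package) and a placeholder staging URL:

```js
// lighthouserc.js: encode CWV acceptance criteria as Lighthouse CI assertions.
// Run with: npx lhci autorun
module.exports = {
  ci: {
    collect: {
      url: ['https://staging.example.com/page-template-y/'], // placeholder
      numberOfRuns: 3, // median of several runs smooths out lab variance
    },
    assert: {
      assertions: {
        'largest-contentful-paint': ['error', { maxNumericValue: 2500 }],
        'cumulative-layout-shift': ['error', { maxNumericValue: 0.1 }],
        'total-blocking-time': ['warn', { maxNumericValue: 300 }], // lab proxy for FID
      },
    },
  },
};
```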

Largest Contentful Paint
Represents: Perceived loading experience.

Measurement: The point in the page load timeline when the page’s largest image or text block is visible within the viewport.

Key Behaviors: Pages using the same page templates typically share the same LCP node.

Goal: 75% of page loads achieve LCP in < 2.5 seconds.

Available as: Lab and Field Data.

What Can Be LCP?
The LCP metric measures when the largest text or image element in the viewport is visible.

Possible elements that can be a page’s LCP node include:

<img> elements.
<image> elements inside an <svg> tag.
Poster images of <video> elements.
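
To confirm which node the browser actually treats as the LCP element on a given load, you can paste a short snippet into the DevTools console. This is a minimal sketch using the standard PerformanceObserver API; the buffered flag replays entries from earlier in the page load:

```js
// Log LCP candidates as the browser reports them; the last entry before user
// input is the final LCP element.
new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    console.log('LCP candidate at', Math.round(entry.startTime), 'ms:', entry.element);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });
```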
