Speed tooling evolutions: highlights from Chrome Developer Summit 2019

New performance metrics, updates to PageSpeed Insights and Chrome User Experience Report (CrUX), and more.

Elizabeth Sweeny

At Chrome Developer Summit, Paul Irish and I announced updates to Lighthouse—Lighthouse CI, new performance score formula, and more. Along with big Lighthouse news, we presented exciting performance tooling developments including new performance metrics, updates to PageSpeed Insights and Chrome User Experience Report (CrUX), and insights from the Web Almanac's analysis of the web ecosystem.

New performance metrics

Measuring the nuances of a user's experience is the key to quantifying its impact on your bottom line and to tracking improvements and regressions. Over time, new metrics have evolved to capture those nuances and fill in the gaps in measuring user experience. The newest additions to the metrics story are two field metrics, Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS), which are being incubated in the W3C Web Performance Working Group, and a new lab metric, Total Blocking Time (TBT).

Largest Contentful Paint (LCP)

Largest Contentful Paint (LCP) reports the time when the largest content element becomes visible in the viewport.

Before Largest Contentful Paint, First Meaningful Paint (FMP) and Speed Index (SI) served to capture the loading experience after the initial paint, but these metrics are complex and often do not identify when the main content of the page has loaded. Research has shown that simply looking at when the largest element on the page is rendered better represents when the main content of a page is loaded.

The new Largest Contentful Paint metric will soon be available in Lighthouse reports; in the meantime, you can measure LCP in JavaScript, as sketched below.
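As a rough illustration, the snippet below observes largest-contentful-paint entries with PerformanceObserver. Treat it as a minimal sketch: the entry type is only available in Chromium-based browsers at the time of writing, so feature-detect before relying on it.

```js
// A minimal sketch of measuring LCP in the field with PerformanceObserver.
const lcpObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Each entry is a new largest-paint candidate; the last entry
    // reported before user input is the page's LCP.
    console.log('LCP candidate:', entry.startTime, entry.element);
  }
});
// buffered: true also delivers entries that occurred before registration.
lcpObserver.observe({type: 'largest-contentful-paint', buffered: true});
```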

Total Blocking Time (TBT)

The Total Blocking Time (TBT) metric measures the total amount of time between First Contentful Paint (FCP) and Time to Interactive (TTI) during which the main thread was blocked for long enough to prevent input responsiveness.

A task is considered long if it runs on the main thread for more than 50 milliseconds. Any millisecond over that is counted towards that task's blocking time.

A diagram representing a 150 millisecond task which has 100 milliseconds of blocking time.

The Total Blocking Time for a page is the sum of the blocking times of all long tasks that occurred between FCP and TTI.

A diagram representing five tasks with 60 milliseconds of total blocking time out of 270 milliseconds of main thread time.

While Time to Interactive does a good job of identifying when the main thread calms down later in load, Total Blocking Time aims to quantify how strained the main thread is throughout load. This way, TTI and TBT complement each other and provide balance.
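TBT is a lab metric that Lighthouse computes over the FCP-to-TTI window, but the underlying arithmetic is easy to see with the Long Tasks API. The sketch below is illustrative only: it assumes browser support for longtask entries and, unlike Lighthouse, does not restrict itself to the FCP-to-TTI window.

```js
// Illustrative sketch: accumulate blocking time from Long Tasks API entries.
// Lighthouse computes TBT over the FCP-to-TTI window in the lab; this simply
// sums the time beyond the 50 ms budget for every long task it observes.
let totalBlockingTime = 0;
const longTaskObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // A 150 ms task contributes 100 ms of blocking time.
    totalBlockingTime += entry.duration - 50;
  }
});
// Register early in page load; long tasks that finish before the observer
// is registered are not reported.
longTaskObserver.observe({type: 'longtask'});
```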

Cumulative Layout Shift (CLS)

Cumulative Layout Shift (CLS) measures visual stability of a page and quantifies how often users experience unexpected layout shifts. Unexpected movement of content can be very frustrating and this new metric helps you address that problem by measuring how often it's occurring for your users.

A screencast illustrating how layout instability can negatively affect users.

Check out the detailed guide to Cumulative Layout Shift to learn how it's calculated and how to measure it.
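If you want to try this in the field today, a minimal sketch (assuming support for the Layout Instability API's layout-shift entries, currently Chromium-only) looks like this:

```js
// A minimal sketch of measuring CLS in the field with PerformanceObserver.
let cumulativeLayoutShift = 0;
const layoutShiftObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // Shifts that happen shortly after user input are expected,
    // so they are excluded from the cumulative score.
    if (!entry.hadRecentInput) {
      cumulativeLayoutShift += entry.value;
    }
  }
});
layoutShiftObserver.observe({type: 'layout-shift', buffered: true});
```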

The new Lighthouse performance score formula will soon de-emphasize First Meaningful Paint (FMP) and First CPU Idle (FCI) and include the three new metrics (LCP, TBT, and CLS), as they better capture when a page feels usable.

In Lighthouse v6, First Contentful Paint, Speed Index, and Largest Contentful Paint are the main load performance metrics; Time to Interactive, First Input Delay, Max Potential First Input Delay, and Total Blocking Time are the main interactivity metrics; and Cumulative Layout Shift is the main predictability metric.

Check out Lighthouse performance scoring and the new web.dev metrics collection to learn more.

Field data (CrUX) thresholds adjusted in PageSpeed Insights

Over the past year we have been analyzing web performance in the field via Chrome User Experience Report (CrUX) data. With insights from that data, we reassessed the thresholds that we use to label a website's field performance "slow", "moderate", or "fast".

Two bar charts showing the distribution of slow, fast, and moderate speed for FCP and FID.

In order to produce an overall assessment for a site, PageSpeed Insights (PSI) takes a specific percentile of the site's field data distribution as the representative value; the previous thresholds were the 90th percentile for First Contentful Paint and the 95th percentile for First Input Delay (FID).

For example, if a site has an FCP distribution of 50% fast, 30% moderate, 20% slow, the 90th percentile FCP is in the slow section, making the overall field score for the site slow.

This has been adjusted to have a better overall distribution across websites and the new breakdown is:

Metric   Overall Percentile   Fast (ms)   Moderate (ms)   Slow (ms)
FCP      75th                 0-1000      1000-3000       3000+
FID      95th                 0-100       100-300         300+

For example, now if a site has an FCP distribution of 50% fast, 30% moderate, 20% slow, the 75th percentile FCP is in the moderate section, making the overall field score for the site moderate.
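To make the labeling concrete, here's a hypothetical sketch of that percentile-then-threshold logic. The function names, the in-memory percentile calculation, and the hard-coded thresholds (taken from the table above) are illustrative, not PSI's actual implementation.

```js
// Hypothetical sketch of the percentile-then-threshold labeling described
// above; function names and structure are illustrative, not PSI's code.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length));
  return sorted[index];
}

function labelFcp(fcpSamplesMs) {
  const p75 = percentile(fcpSamplesMs, 75); // new 75th percentile threshold
  if (p75 < 1000) return 'fast';
  if (p75 < 3000) return 'moderate';
  return 'slow';
}
```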

Canonical URL redirects in PageSpeed Insights

To enable you to measure the user's experience as accurately as possible, the PageSpeed Insights team has added a reanalyze prompt to PSI. For sites that are redirected to a new URL, you're prompted to rerun the report on the landing URL for a more complete picture of your actual performance.

PSI user interface showing the URL redirect and the 'Reanalyze' button

CrUX in the new Search Console Speed report

Search Console rolled out their new Speed report a week before Chrome Dev Summit. It uses data from the Chrome User Experience Report to help site owners discover potential user experience problems. The Speed report automatically groups similar URLs into "Fast", "Moderate", and "Slow" buckets, and helps prioritize performance improvements for specific issues.

Search Console Speed report.

Web Almanac

Dion Almaer presenting Web Almanac at CDS 2019.

In the opening keynote we announced the launch of the Web Almanac, an annual project that pairs statistics and trends about the state of the web with the expertise of the web community. 85 contributors, made up of Chrome developers and members of the web community, volunteered to work on the project, which analyzes 20 core aspects of the web, addressing how sites are built, delivered, and experienced. Start exploring the Web Almanac to learn more about the state of performance, JavaScript, and third-party code on the web.

Learn more

For more details about performance tooling updates from Chrome Developer Summit, watch the Speed tooling evolutions talk: