Page is blocked from indexing

Search engines can only show pages in their search results if those pages don't explicitly block indexing. Don't block indexing for content that you want to show up in search results. Lighthouse flags pages that search engines can't index:

Lighthouse audit showing that search engines can't index your page.

What causes this audit to fail

Lighthouse only checks for headers and tags that affect all crawlers; it doesn't flag headers or tags that block specific search engine bots. For example, the tag below prevents all search engine crawlers from accessing your page:

<meta name="robots" content="noindex"/>

The HTTP response header below does the same:

X-Robots-Tag: noindex
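As a rough illustration of how such a header value could be interpreted, here is a simplified sketch in Python. It handles the two common shapes from the X-Robots-Tag specification, a plain directive list and a value scoped to one crawler; it is not Lighthouse's actual check, and parameterized directives such as `max-snippet: 20` are out of scope:

```python
def blocks_indexing(header_value, bot=None):
    """Rough check: does this X-Robots-Tag value block indexing?

    Handles two common shapes:
      "noindex, nofollow"   -> applies to all crawlers
      "googlebot: noindex"  -> applies only to the named crawler
    `bot` is an optional user-agent token; directives scoped to a
    different crawler are ignored. Parameterized directives (e.g.
    "max-snippet: 20") are not handled by this sketch.
    """
    value = header_value.strip()
    target = None
    # A value may be prefixed with a crawler name, e.g. "googlebot: noindex".
    head, sep, rest = value.partition(":")
    if sep and "," not in head:
        target, value = head.strip().lower(), rest
    if target is not None and target != (bot or "").lower():
        return False  # directive is scoped to a different crawler
    directives = {d.strip().lower() for d in value.split(",")}
    # "none" is shorthand for "noindex, nofollow", so it also blocks indexing.
    return bool(directives & {"noindex", "none"})
```

For example, `blocks_indexing("googlebot: noindex")` returns `False` when no bot is given, but `True` with `bot="googlebot"`, mirroring the distinction the audit draws between general and bot-specific directives.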

You might also have meta tags that block specific crawlers, such as:

<meta name="AdsBot-Google" content="noindex"/>

Lighthouse doesn't check for bot-specific directives like this, but they can still make your page harder to discover.

You can inspect your page's response headers via the Headers tab of Chrome DevTools:

The Headers tab in Chrome DevTools.

Each SEO audit is weighted equally in the Lighthouse SEO score, except for the manual Structured Data audit. Learn more in the Lighthouse Scoring Guide.

How to ensure search engines can crawl your page

If you want search engines to crawl your page, remove the HTTP headers or meta tags that are preventing crawlers from doing so.
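Before you can remove blocking meta tags, you need to find them. The sketch below uses Python's standard-library `html.parser` to list every named `<meta>` tag whose `content` carries a `noindex` or `none` directive, including bot-specific ones such as `AdsBot-Google`; the sample HTML is a made-up illustration:

```python
from html.parser import HTMLParser

class RobotsMetaScanner(HTMLParser):
    """Collect <meta name="..." content="..."> tags whose content
    contains a directive that blocks indexing, so you know what to remove."""

    BLOCKING = {"noindex", "none"}

    def __init__(self):
        super().__init__()
        self.blocking_tags = []  # (name, content) pairs that block indexing

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = attrs.get("content") or ""
        directives = {d.strip().lower() for d in content.split(",")}
        if name and directives & self.BLOCKING:
            self.blocking_tags.append((name, content))

# Hypothetical page snippet for illustration.
html = ('<head><meta name="robots" content="noindex"/>'
        '<meta name="description" content="A page"/></head>')
scanner = RobotsMetaScanner()
scanner.feed(html)
print(scanner.blocking_tags)  # [('robots', 'noindex')]
```

Note that this only covers meta tags; blocking HTTP headers like `X-Robots-Tag` must be checked separately, for example in the DevTools Headers tab described above.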

See the Robots meta tag and X-Robots-Tag HTTP header specifications for details on configuring meta tags and HTTP headers to control exactly how search engines crawl your page.

Learn more in Remove code that blocks search engine indexing.

More information

Search engines can't index your site audit source
