GOOGLE MAKES 4 CHANGES TO THE INDEX COVERAGE REPORT

To better inform website owners about indexing issues, the Google Search Console index coverage report will receive a total of 4 updates.

The Index Coverage report was first introduced in 2018 with the launch of the revamped Google Search Console, so it is still relatively new compared to other reports Google offers.
Since its launch, website owners have been sharing feedback with Google about improvements they would like to see.
The changes to the Index Coverage report rolling out today are therefore based on feedback from the webmaster community.

“Based on the feedback we have received from the community, we are rolling out significant improvements to this report so that you are better informed about issues that may prevent Google from crawling and indexing your site.”


Changes to the Search Console index coverage report

The list of changes to the index coverage report in the Search Console includes the following:

  • The removal of the generic “crawl anomaly” problem type – all crawl errors are now assigned to a more specific problem with a clearer resolution.
  • Pages that were submitted but blocked by robots.txt and indexed anyway are now reported as “indexed but blocked” (warning) instead of “submitted but blocked” (error).
  • The addition of a new problem: “Indexed without content” (warning).
  • Soft 404 reporting is now more accurate.

This means there is no more guesswork when it comes to crawl errors, as the generic “crawl anomaly” problem has been replaced with specific problems and clearer resolutions.

Website owners will now clearly see when a page that Google has indexed is blocked by robots.txt, because the report shows “indexed but blocked” instead of “submitted but blocked”. Whether a URL has merely been submitted or has actually been indexed are two different things, and the report now reflects that distinction.
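If you want to verify on your own side whether a URL is blocked for Googlebot, a quick local test against your robots.txt can help. The following is only a minimal sketch using Python’s built-in robotparser; the domain and path are placeholders, and it only checks crawl rules, not whether the URL is indexed.

from urllib import robotparser

# Placeholder values: replace with your own site and the affected URL.
ROBOTS_URL = "https://example.com/robots.txt"
PAGE_URL = "https://example.com/blocked-page/"

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# can_fetch() reports whether the given user agent may crawl the URL.
if parser.can_fetch("Googlebot", PAGE_URL):
    print("Googlebot is allowed to crawl this URL.")
else:
    print("Googlebot is blocked by robots.txt. The page can still end up "
          "indexed if other sites link to it, which is exactly the case "
          "the new 'indexed but blocked' warning covers.")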

Soft 404 reporting should also be more accurate; a soft 404 is a page that shows visitors an error or “not found” message but returns an HTTP 200 status instead of a 404. Finally, there is a new problem called “indexed without content”. Let’s take a closer look at this problem in case it appears in one of your reports.

This is what the Search Console help page tells us about indexed without content:

“This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page may be cloaked to Google or the page may be in a format that Google cannot index. This is not a case of blocking by robots.txt.”


So if you encounter the “indexed without content” problem, the URL is in Google’s index, but Google’s crawler cannot see any content on it.
This could mean that you have inadvertently published a blank page, or that an error on your site is preventing Google from rendering the content.
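Before digging deeper, a first rough check is to fetch the page yourself and see whether the raw HTML contains any visible text at all. The sketch below assumes the third-party requests library and a placeholder URL; it does not execute JavaScript, so a page that only renders its content client-side will also look empty here even though Google can often render it.

import re
import requests  # third-party: pip install requests

URL = "https://example.com/affected-page/"  # placeholder: use the URL from the report

response = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)

# Strip scripts, styles and tags to approximate the page's visible text.
visible_text = re.sub(
    r"<script.*?</script>|<style.*?</style>|<[^>]+>",
    " ",
    response.text,
    flags=re.DOTALL | re.IGNORECASE,
)

print("HTTP status:", response.status_code)
if visible_text.strip():
    print("The page returns visible text, so the cause is likely elsewhere.")
else:
    print("The page returns no visible text. It may have been published empty, "
          "or its content may only appear after JavaScript rendering.")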

To fix an “indexed without content” error, our recommendation to all website owners is to run the affected page through Google’s URL Inspection tool.
The URL Inspection tool renders the page the way Google sees it, which can help you understand why the content is not visible to Google’s crawler.
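If you would rather check pages programmatically than through the Search Console interface, Google also offers a URL Inspection API. The snippet below is only a rough sketch: it assumes the google-api-python-client and google-auth packages, a service account file named service-account.json that has been granted access to the Search Console property, and placeholder site and page URLs.

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder values: adjust to your own property and page.
SERVICE_ACCOUNT_FILE = "service-account.json"
SITE_URL = "https://example.com/"
PAGE_URL = "https://example.com/affected-page/"

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE,
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

# Ask the URL Inspection API how Google currently sees this page.
result = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL}
).execute()

index_status = result["inspectionResult"]["indexStatusResult"]
print("Coverage state:", index_status.get("coverageState"))
print("Robots.txt state:", index_status.get("robotsTxtState"))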

These changes are now reflected in the Index Coverage report, so site owners may notice new issue types or changes in the number of reported problems.
