Index Coverage Issue

Google Search Engine Simplified

Been getting messages from the latest Google Search Console update showing Index Coverage errors and warnings? You are not the only website owner facing this problem (particularly if you use the popular Yoast SEO plugin and its sitemap functionality). So let’s get to work and find out how to fix index coverage issues.

What Is an Index Coverage Issue?

Basically (and usually), this is Google telling you:

Hey, you told us to index your website by submitting an XML sitemap, which we use to analyze your website’s URL structure. But when we access a URL listed in that XML sitemap, we have trouble seeing its contents = INDEX COVERAGE ISSUE

How to Fix Index Coverage Issues

Easy, just make sure that the XML sitemap you submit does NOT contain URLs which either:

  1. Send a noindex directive (usually through meta tags), which looks like this: <meta name="robots" content="noindex">
  2. Are blocked by your robots.txt file while still being listed in the XML sitemap you’ve submitted to Google Search Console
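To illustrate the first case, here is a minimal Python sketch (standard library only) that scans a page’s HTML for a robots noindex meta tag; the sample HTML strings are hypothetical:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scans <meta name="robots"> tags for a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

print(has_noindex('<html><head><meta name="robots" content="noindex"></head></html>'))  # True
print(has_noindex('<html><head><title>OK</title></head></html>'))                       # False
```

If this prints True for a URL that is in your sitemap, that URL is exactly what triggers the index coverage report.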

Fixing Index Coverage Issues Due to URLs Blocked by robots.txt

All you need to do is modify your robots.txt file and simply delete the line that is blocking Googlebot from seeing what is on that page.

For example: let’s imagine that the Google Search Console Index Coverage error was due to Googlebot not being able to access the /samplepage URL on your site.

And my robots.txt file had directives like the ones below:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/$
Disallow: /wp-content/cache*
Disallow: */trackback/$
Disallow: /samplepage
Disallow: /comments/feed*
Disallow: /wp-login.php?*
Allow: /*.js*
Allow: /*.css*
Allow: /wp-admin/admin-ajax.php
Allow: /wp-admin/admin-ajax.php?action=*
Disallow: /2016/
Disallow: /*?wordfence_logHuman=*
Allow: /wp-content/themes/rankya/*.css

All I would need to do is delete the line Disallow: /samplepage and re-save the robots.txt file on my server (if you use Yoast, you can update your robots.txt file through WordPress Dashboard > Yoast SEO Plugin > Tools > File editor).

And the index coverage issue due to the robots.txt blockage would go away, as in be fixed. You can follow the same procedure for other URLs with index issues caused by the robots.txt file.
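As a quick sanity check before and after editing the file, Python’s standard-library urllib.robotparser can tell you whether a given URL is blocked by your robots.txt rules; the robots.txt content and example.com domain below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content with the problematic Disallow line
robots_txt = """\
User-agent: *
Disallow: /samplepage
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/samplepage"))  # False (blocked)
print(parser.can_fetch("Googlebot", "https://example.com/about"))       # True
```

Once you delete the Disallow: /samplepage line and re-run the check, can_fetch should return True for that URL.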

Fixing Index Coverage Issues Due to noindex tags

This applies where Googlebot is able to access the URL but sees <meta name="robots" content="noindex">. In this scenario, all you need to do is one of two things:

  1. Remove the noindex meta tag
  2. Remove the URL from the XML sitemap

How to Remove the noindex meta tag When Using Yoast SEO Plugin?

This will depend on the way your website is outputting the noindex meta tags. If you are using a popular content management system like WordPress with an SEO plugin like Yoast, then the way to remove the noindex is through its settings called Search Appearance. When you are there, anything you set to Yes is shown in the XML sitemap automatically. That means, when you set anything to No, it will remove that entire part of WordPress (posts/pages) from being shown in the sitemap.

Warning: do not use these settings to remove important sections of your WordPress site (posts, pages, custom post types which you want to rank in Google, products for WooCommerce), all of which MUST BE indexed by Google, or else you will not get any website traffic through Google.

Instead, you can go to an individual post/page/custom post type/category/tag, press the gear icon, and select the Yoast SEO Plugin noindex option for that individual WordPress blog post.

This will fix index coverage issues caused by the noindex meta tag, because the Yoast SEO Plugin will REMOVE that URL from the XML sitemap.

At the end of the day, fixing a Google Search Console index coverage issue isn’t that hard at all. Basically, what you need to do is triple-check the URLs in the XML sitemap you submitted to Google Search Console and make sure those URLs are neither serving noindex directives NOR blocked by the robots.txt file.
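That triple-check can be partly automated. The sketch below (standard library only; the sitemap and robots.txt content are hypothetical) parses a sitemap’s <loc> entries and flags any URL blocked by robots.txt:

```python
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap, as submitted to Google Search Console
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/samplepage</loc></url>
</urlset>"""

# Hypothetical robots.txt rules for the same site
robots = RobotFileParser()
robots.parse(["User-agent: *", "Disallow: /samplepage"])

root = ET.fromstring(sitemap_xml)
for loc in root.iter(SITEMAP_NS + "loc"):
    url = loc.text.strip()
    status = "OK" if robots.can_fetch("Googlebot", url) else "BLOCKED by robots.txt"
    print(url, "->", status)
```

Any URL flagged as BLOCKED here is a candidate for the “Submitted URL blocked by robots.txt” error; either unblock it or remove it from the sitemap. (Checking for noindex would additionally require fetching each page’s HTML.)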

Video About Google Search Console > Index Coverage > Error > Submitted URL has Crawl Issue

Keep in mind that crawl issues usually occur for pages that are no longer on your site. And if there are crawl errors for pages that never existed on your site, then it is more than likely that some external website has a backlink to your website which Googlebot is following (that external site could also be a web scraper or spammer causing the 404 errors shown in Search Console).

For such cases, you don’t need to worry or do anything about it, as Google Search will update its database and remove such 404 issues.

How to Fix 404 Errors in Google Search Console

The above video mainly focuses on creating a 301 Moved Permanently redirection. Keep in mind that you should only use a 301 redirect IF the URL you are redirecting to has similar content.

Fixing Search Console Index Coverage 500 Internal Server Error for WordPress Themes

The video above shows a typical WordPress theme URL and Googlebot complaining about a 5xx server error response code. The best way to remedy this error is to serve a 404 Not Found page, because there is nothing at that URL that Google should index for a WordPress setup.

By RankYa

RankYa is a digital marketer, website optimizer, content creator, and fully qualified web developer helping businesses of all sizes achieve greater results online. Based in Melbourne, Australia, RankYa serves valued clients worldwide by providing personalized services.

We love sharing our proven experience through how to videos and complete courses related to business website marketing, conversion optimization, Google (Search Console, Ads, Analytics, YouTube), SEO, HTML5, Structured Data and WordPress. Thank you for visiting our blog.
