Been getting messages from the latest Google Search Console update showing Index Coverage errors and warnings? You are not the only website owner facing this problem (particularly if you are using the popular Yoast SEO Plugin and its sitemap functionality). So let’s get to work and find out how to fix index coverage issues.
What is an Index Coverage Issue?
Basically (and usually) this is Google telling you
Hey, you told us to index your website by submitting an XML sitemap, through which we analyze your website’s URL structure. But when we access a URL listed in that sitemap, we are having issues seeing the contents of that URL = INDEX COVERAGE ISSUE
How to Fix Indexation Coverage Issues
Easy: just make sure that the XML sitemap you submit does NOT contain URLs which are either
Sending a noindex directive (usually through a meta tag) which looks like this:
<meta name="robots" content="noindex">
Or blocked by the robots.txt file while still being listed in the XML sitemap you’ve submitted to Google Search Console
Fixing Index Coverage Issues Due to URLs Blocked by robots.txt
All you need to do is modify your robots.txt file and simply delete the line which is blocking Googlebot from seeing what is on that page.
For example: let’s imagine that the Google Search Console Index Coverage error was due to Googlebot not being able to access
https://www.rankya.com/samplepage
And my robots.txt file had directives like the ones below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/$
Disallow: /wp-content/cache*
Disallow: */trackback/$
Disallow: /samplepage
Disallow: /comments/feed*
Disallow: /wp-login.php?*
Allow: /*.js*
Allow: /*.css*
Allow: /wp-admin/admin-ajax.php
Allow: /wp-admin/admin-ajax.php?action=*
Disallow: /2016/
Disallow: /*?wordfence_logHuman=*
Allow: /wp-content/themes/rankya/*.css
All I would need to do is just delete the line Disallow: /samplepage and re-save the robots.txt file on my server (if using Yoast, you can update your robots.txt file through WordPress Dashboard > Yoast SEO Plugin > Tools > File editor).
And the index coverage issue due to the robots.txt blockage would go away, as in be fixed. You could follow the same procedure for other URLs with index issues caused by the robots.txt file.
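After that edit, the relevant portion of the robots.txt file would simply no longer contain the blocking line (excerpt only; every other directive stays exactly as it was):

```
# Excerpt after the fix: the "Disallow: /samplepage" line has been
# deleted, all other directives are unchanged
User-agent: *
Disallow: /cgi-bin/
Allow: /wp-admin/admin-ajax.php
```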
Fixing Index Coverage Issues Due to noindex tags
What about a case where Googlebot is able to access the URL but is seeing
<meta name="robots" content="noindex">
In this scenario, all you need to do is one of two things
- Remove the noindex meta tag
- Remove the URL from the XML sitemap
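If you go with the first option, the page’s head section should end up with an indexable robots meta tag (or no robots meta tag at all, which Google also treats as indexable), for example:

```html
<meta name="robots" content="index, follow">
```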
How to Remove the noindex meta tag When Using Yoast SEO Plugin?
This will depend on the way your website is outputting the noindex meta tags. If you are using a popular Content Management System like WordPress with an SEO plugin like Yoast, then the way to remove the noindex is through the plugin’s settings called Search Appearance
yourdomainname.com/wp-admin/admin.php?page=wpseo_titles#top#post-types (replace yourdomainname.com with your own domain and copy-paste into the address bar of your browser)
When you are there, anything you set to Yes is shown in the XML sitemap automatically. That means when you set anything to No, that entire part of WordPress (posts/pages) will be removed from the sitemap.
Warning: do not use these settings to remove important sections of your WordPress site (posts, pages, and custom post types which you want to rank in Google, products for WooCommerce), all of which MUST BE indexed by Google, or else you will not get any website traffic through Google.
Instead, you can go to an individual post/page/custom post type/category/tag, press the gear icon, and select the Yoast SEO Plugin noindex option for that individual WordPress blog post.
This will fix index coverage issues due to the noindex meta tag because the Yoast SEO Plugin will REMOVE that URL from the XML sitemap.
At the end of the day, fixing Google Search Console index coverage issues isn’t that hard at all. Basically, what you need to do is triple-check the URLs in the XML sitemap you submitted to Google Search Console and make sure that those URLs are neither serving noindex directives NOR blocked by the robots.txt file.
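That triple-check can be sketched in a few lines of Python using only the standard library. The URL and robots.txt contents below are placeholders, and the noindex check is a crude substring match on the page HTML rather than a full HTML parse:

```python
# Sketch: check whether a URL is blocked by robots.txt or carries a
# noindex meta tag. Inputs are placeholder strings, not live fetches.
import urllib.robotparser


def blocked_by_robots(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if robots.txt disallows `agent` from fetching `url`."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)


def has_noindex(html: str) -> bool:
    """Crude check for a robots meta noindex directive in page HTML."""
    lowered = html.lower()
    return '<meta name="robots"' in lowered and "noindex" in lowered


# Placeholder robots.txt mirroring the example earlier in this post
robots = "User-agent: *\nDisallow: /samplepage\n"
print(blocked_by_robots(robots, "https://www.example.com/samplepage"))  # True
print(has_noindex('<meta name="robots" content="noindex">'))  # True
```

Either function returning True for a URL that is in your submitted XML sitemap points at the cause of the index coverage issue for that URL.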
Video About Google Search Console > Index Coverage > Error > Submitted URL has Crawl Issue
Keep in mind that crawl issues usually occur due to pages that are no longer on your site. And if there are crawl errors for pages that never existed on your site, then it is more than likely that some external website has a backlink to your website which Googlebot is following (that external site could also be a web scraper or spammer causing those 404 errors shown in Search Console).
For such cases, you don’t need to worry or do anything about it, as Google Search will update its database and remove such 404 issues.
How to Fix 404 Errors in Google Search Console
The above video mainly focuses on creating 301 Moved Permanently redirects. Keep in mind that you should only use a 301 redirect IF the page you are redirecting to has similar content.
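As a sketch, on an Apache server such a 301 redirect can be created in the site’s .htaccess file (the paths below are placeholders; again, only redirect to a page with similar content):

```apache
# Hypothetical example: permanently redirect a removed page
# to a similar replacement page
Redirect 301 /old-removed-page/ https://www.example.com/similar-page/
```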
Fixing Search Console Index Coverage 500 Internal Server Error for WordPress Themes
The video above shows a typical WordPress theme URL and Googlebot complaining about server error 5xx response codes. The best way to remedy this error is to serve a 404 Not Found response, because there is nothing at that URL that Google should index for a WordPress setup.
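As a sketch, an Apache .htaccess rule can force a 404 Not Found response for such a theme URL (the path below is a placeholder for the actual theme file Googlebot is requesting):

```apache
# Hypothetical example: answer this theme URL with 404 Not Found
# instead of a 5xx server error
Redirect 404 /wp-content/themes/example-theme/unwanted-file.php
```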
thank you sir
very helpful article
Hello Rakesh, I am glad to hear that this post related to Google Search Console Index Coverage Issues has been informational for you. Thank you for spreading the word about RankYa SEO
Do you have any tutorials for Blogger?
For Blogger, check the robots.txt settings in general settings. Although I do not have tutorials for Blogger, understanding the best practices for Google’s new Search Console index coverage issues will help you resolve any issues in the index coverage reports.
Thank you .
Thank you for stopping by to comment Hussein, please do check the latest insights I’ve created for Google Search Console index coverage reports