How to Fix Blocked by robots.txt Errors

Search engines find information through crawling: they request (fetch) a URL and then analyze what they find there. robots.txt rules should only be used to control the search engine crawling process, not the indexing process. This means most Search Console 'Blocked by robots.txt' errors arise from incorrectly written rules… Continue reading How to Fix Blocked by robots.txt Errors
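As a minimal illustration of a crawl-blocking rule (the `/private/` path is hypothetical), a robots.txt file might contain:

```text
# Blocks crawling of /private/ for all crawlers.
# Note: this does NOT remove already-indexed URLs from search results;
# it only stops crawlers from fetching pages under that path.
User-agent: *
Disallow: /private/
```

If a URL you want indexed matches a `Disallow` rule like this, Google cannot fetch it, and Search Console reports it as blocked by robots.txt.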

How to Fix Excluded by noindex Tag

There are various methods by which a URL can send a noindex directive telling search engines like Google not to index certain parts of a website. This is how indexing of URLs is blocked. What is the Excluded by 'noindex' tag status in the Page indexing report? A URL is excluded from Google's indexing process because Googlebot saw a noindex directive. Submitted URL… Continue reading How to Fix Excluded by noindex Tag
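One common way the directive is sent is a robots meta tag in the page's HTML; another is the `X-Robots-Tag` HTTP response header. A sketch of both:

```html
<!-- Placed in the <head> of a page that should not be indexed -->
<meta name="robots" content="noindex">

<!-- Equivalent as an HTTP response header (sent by the server, not in HTML):
     X-Robots-Tag: noindex -->
```

If either is present when Googlebot fetches the URL, the page is excluded from indexing and reported under this status.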

Page Indexing

In the new Google Search Console Page indexing report, Excluded and Error issues are now grouped under the status Not indexed. Google Search Console shows you the various issues a website may experience that can affect indexing of certain URLs. And because RankYa has been maintaining a full course and how-to videos related to Google Search Console,… Continue reading Page Indexing