Crawl errors happen when a search engine tries but fails to reach a page on your website. First, a quick look at crawling itself: crawling is the process by which a search engine uses a bot to visit the pages of your website. When a search engine bot discovers a link to your site, it begins fetching all of your publicly accessible pages.
- Page not found (404). This is one of the most common errors that webmasters and visitors encounter. It means the crawler attempted to fetch a page that has since been deleted or moved elsewhere on your domain; a temporary URL can also cause it. To track down these pages, the URL Parameters tool in Google Search Console (formerly Webmaster Tools) can help.
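For pages that have moved rather than disappeared, a permanent redirect keeps both visitors and crawlers from hitting a 404. A minimal sketch, assuming an Apache server (the paths are illustrative placeholders, not real URLs):

```apacheconf
# .htaccess — send a moved page to its new location with a 301 (permanent) redirect
# Replace the example paths with your own old and new URLs
Redirect 301 /old-page.html /new-page.html
```

A 301 tells crawlers the move is permanent, so search engines update their index to the new URL instead of repeatedly reporting the old one as missing.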
- Robot access restricted. This indicates that the AdSense crawler attempted to crawl a page at the domain or subdomain level but was blocked by your robots.txt file. To grant the Google AdSense crawler access, add the following two lines to your robots.txt file: `User-agent: Mediapartners-Google` followed by `Allow: /`.
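For reference, a minimal robots.txt that grants the AdSense crawler full access while still restricting other bots might look like this (the `Disallow` rule for other crawlers is purely illustrative):

```
# robots.txt
# Allow Google's AdSense crawler (Mediapartners-Google) to fetch every page
User-agent: Mediapartners-Google
Allow: /

# Rules for all other crawlers (illustrative example)
User-agent: *
Disallow: /private/
```

Because the more specific `User-agent: Mediapartners-Google` group takes precedence for that bot, the AdSense crawler is unaffected by the restrictions placed on other crawlers.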
- Content behind a login. Many websites require login credentials to access premium content. This error usually means that a crawler login hasn't been set up for that premium content, so the crawler cannot get past the login page.
- Ad crawler errors for a site you don't manage. This warning indicates that your ad code is being used on another website without your authorization. Impressions and clicks will still be recorded, but because serving your ads without permission violates AdSense policy, they won't generate earnings for you. You'll find the relevant option under Settings > Accounts and Authorization; expect changes to take effect within about 48 hours.
Read More: How do I fix the Google “No Crawl” issue?