
Mon, Jul 23, 2018 9:31 PM


Robots.txt fetch

One of my sites shows no errors during crawls in Google's webmaster tools. Another site receives this message:

"Google couldn't crawl your site because we were unable to access your site's robots.txt file."

"Your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file"

According to the More Info page:
"If your site shows a 100% error rate in any of the three categories, it likely means that your site is either down or misconfigured in some way."

This site showed green (good) for DNS and server connectivity, but a 100% error rate for robots.txt. What is robots.txt, and is there anything I can do to resolve the problem, or is it something that has to be handled on your end?
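For reference (added context, not part of the original thread): robots.txt is a plain-text file served from the root of a site (e.g. https://example.com/robots.txt) that tells crawlers which paths they may fetch. A minimal file that allows all crawling looks like:

```
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked; replacing it with "Disallow: /" would block the whole site, which is the opposite of what is usually wanted here.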

According to Google:
Fixing robots.txt file errors
  • You don't always need a robots.txt file.
    You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file—not even an empty one. If you don't have a robots.txt file, your server will return a 404 when Googlebot requests it, and we will continue to crawl your site. No problem.
  • Make sure your robots.txt file can be accessed by Google.
    It's possible that your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file. Check that your hosting provider is not blocking Googlebot. If you have a firewall, make sure that its configuration is not blocking Google.
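The guidance above boils down to checking which HTTP status your server returns for /robots.txt: 200 and 404 are both fine, while a 5xx is what triggers this Search Console error. A quick self-check sketch (the domain below is a placeholder; substitute your own site):

```python
# Fetch /robots.txt and interpret the HTTP status the way the
# Search Console robots.txt report does.
import urllib.request
import urllib.error

def interpret(status):
    """Map an HTTP status for /robots.txt to a crawl outcome."""
    if status is None:
        return "unreachable: server connection failed"
    if status == 200:
        return "ok: robots.txt served and readable"
    if status == 404:
        return "ok: no robots.txt, so Google crawls everything"
    if 500 <= status < 600:
        return "error: 5xx, Google pauses crawling until it can fetch the file"
    return f"check manually: unexpected status {status}"

def check_robots(base_url):
    """Return the HTTP status for base_url/robots.txt, or None if unreachable."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code          # 4xx/5xx responses land here
    except urllib.error.URLError:
        return None            # DNS failure, refused connection, timeout

if __name__ == "__main__":
    # Placeholder domain (assumption) -- replace with the affected site.
    print(interpret(check_robots("https://example.com")))
```

If this prints a 5xx interpretation, the fix is on the hosting side (server misconfiguration, or a firewall/host rule blocking Googlebot), which matches the quoted guidance.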

