Googlebot cannot access your site: robots.txt fetch errors

Over the past 24 hours, Googlebot encountered 1 error while attempting to access your robots.txt. To ensure that no pages listed in that file were crawled, Google postponed the crawl. Your site's overall robots.txt error rate is 100.0%.

The warning above usually arrives in an email from Google at times when Googlebot cannot access your website.

Why does this happen? In this case it was because Googlebot's IP addresses had been blocked by the hosting server's firewall, so Googlebot could not crawl the website at all.
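To confirm this on your own server, you can fetch robots.txt with Googlebot's user agent and search the firewall for Googlebot's address. This is a minimal sketch assuming CSF is installed and you have shell access; example.com and 66.249.66.1 are placeholders (check your access logs for the real IPs Googlebot used):

    # Fetch robots.txt with Googlebot's user agent (the request still comes
    # from your own IP, so a clean response here does not rule out a block)
    curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://example.com/robots.txt

    # Search the iptables rules managed by CSF for a Googlebot address
    csf -g 66.249.66.1

If csf -g reports a DENY match for a Googlebot IP, the firewall block is the culprit.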

How to Fix it?

Report the problem, along with your blog URL, to your hosting provider so they can lift the block, or, if you manage the server yourself, do this:

Sign in to CSF (ConfigServer Security & Firewall) and run "Flush all Blocks", which automatically removes every blocked IP from the firewall. After the first flush the block may eventually appear again, though not always. [source]
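If you prefer the shell to the CSF web UI, roughly equivalent commands exist; this is a sketch assuming root access, with 66.249.66.1 again standing in for a blocked Googlebot IP:

    # Remove one IP from the permanent deny list (/etc/csf/csf.deny)
    csf -dr 66.249.66.1

    # Remove one IP from the temporary block list
    csf -tr 66.249.66.1

    # Flush every temporary block at once (the shell counterpart of the
    # web UI flush; permanent denies must be removed individually)
    csf -tf

    # Restart the firewall so the changes take effect
    csf -r

Removing only the Googlebot IPs is gentler than flushing everything, since other blocks may be protecting the server from real abuse.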

 