Fetching the page failed because it's denied by robots.txt


Hi there,

I get this error message even though I don't use a robots.txt file at all.
Where should I look for the problem?

CMS — OpenCart


Can you share a URL and/or a screenshot?


Of course, for example: http://chesom.ru/cheese/letivaz (product page)

However, right now I do use a robots.txt file — http://chesom.ru/robots.txt

On the main page/URL of the site I use a redirect… Could that be the cause of the error?


Try putting the Twitterbot group first in your robots.txt?

User-agent: Twitterbot
Allow: /
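If it helps, you can sanity-check a robots.txt locally with Python's standard `urllib.robotparser` before deploying it. A minimal sketch — the robots.txt content below is an assumption for illustration, not the site's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content with the Twitterbot group first.
robots_txt = """\
User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Twitterbot may fetch the product page.
print(parser.can_fetch("Twitterbot", "http://chesom.ru/cheese/letivaz"))  # → True
```

This only tests the file's rules in isolation; it won't catch server-side blocks, which is why the validator can still fail with a correct robots.txt.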


Unfortunately, no result…
Could it be because of how the CMS constructs the path (URL)?


I'm somewhat out of ideas. I wonder if the robots.txt error is actually a misleading message?

Have you checked out our troubleshooting page?



I'm thinking it might be the .htaccess file, or my CMS (OpenCart) blocking web crawlers…
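For reference, a server-side crawler block in .htaccess would typically look something like this — a hypothetical example for illustration, not taken from the actual site:

```apache
# Hypothetical mod_rewrite rules that would return 403 Forbidden to Twitterbot.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Twitterbot [NC]
RewriteRule .* - [F,L]
```

If a rule like this (or an equivalent OpenCart extension setting) exists, the validator's fetch would be rejected even though robots.txt itself allows the page.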

Interestingly, if I check the bare domain URL — http://chesom.ru — validation passes :slight_smile: