Fetching the page failed because it's denied by robots.txt


#1

Hi there,

I get this error message even though I don't use a robots.txt file.
Where should I look for the problem?

CMS — OpenCart


#2

Can you share a URL and/or a screenshot?


#3

Of course. For example: http://chesom.ru/cheese/letivaz (a product page)

But right now I do use a robots.txt — http://chesom.ru/robots.txt

On the main URL of the site I use a redirect… Could that be the cause of the error?


#4

Try putting the Twitterbot rule first?

User-agent: Twitterbot
Allow: /
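One way to sanity-check the rules locally is with Python's standard `urllib.robotparser`. This is just a sketch — the robots.txt content below is illustrative (a Twitterbot section followed by a general `Disallow` on the product directory), not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: a Twitterbot section placed before the general one.
robots_txt = """\
User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /cheese/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Twitterbot matches its own group and is allowed everywhere...
print(parser.can_fetch("Twitterbot", "http://chesom.ru/cheese/letivaz"))   # True
# ...while other crawlers fall through to the * group and are blocked.
print(parser.can_fetch("SomeOtherBot", "http://chesom.ru/cheese/letivaz")) # False
```

If `can_fetch("Twitterbot", …)` comes back `False` for your real robots.txt content, the file itself is the problem; if it comes back `True`, the block is happening elsewhere (server config, CMS).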


#5

Unfortunately, no result…
Could it be caused by the way the CMS constructs the URL path?


#6

I'm somewhat out of ideas. I wonder if the robots.txt error is a misleading message?

Have you checked out our troubleshooting page?

https://dev.twitter.com/cards/troubleshooting


#7

Hm…
I'm thinking about the .htaccess file, or about my CMS (OpenCart) blocking web crawlers…

Interestingly, if I check just the domain — http://chesom.ru — validation passes :slight_smile:
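If the server blocks crawlers by user agent, the typical place to look is a mod_rewrite rule in .htaccess. This is a hypothetical fragment of the kind to search for — not your actual config — but if something like it exists, Twitterbot would get a 403 on product pages while other requests succeed:

```
# Hypothetical .htaccess pattern that blocks requests whose user agent
# looks like a bot; Twitterbot would match (bot) and be denied with 403 [F].
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]
RewriteRule .* - [F,L]
```

A quick test for this is fetching the product page with Twitterbot's user-agent string and comparing the response code to a normal browser request.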


#8