ERROR: Fetching the page failed because it's denied by robots.txt


#1

Hi everybody,

I am trying to get my cards validated to go live next week, but I keep getting the same error from the validator. The first time I tried, my robots.txt wasn't right. I fixed that by giving Twitterbot full access and tried again 48 hours later. Nothing changed! I tried another domain with exactly the same robots.txt and it worked instantly!

Is there something I'm missing, or is it being cached for longer than 48 hours?

This is my robots.txt:

User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /async/
Disallow: /browser/
Disallow: /includes/
Disallow: /javascript/
Disallow: /styles/

Thanks in advance!


#2

Try changing to:

User-agent: Twitterbot
Allow: *

followed by the rest of your rules for the other user agents (the User-agent: * block).
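
Putting that together with the rules you already have, the whole file would look something like this (written with Allow: /, the path form most parsers expect; Allow: * is intended to mean the same thing):

User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /async/
Disallow: /browser/
Disallow: /includes/
Disallow: /javascript/
Disallow: /styles/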


#3

Thanks for the reply!

I just tried it with the new robots.txt and it still does not work. Do I have to wait another 24 hours to test this, or is there another way to test?
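
Is there some way to check the rules locally in the meantime? I was thinking of something along these lines with Python's built-in urllib.robotparser (example.com and the page path are just placeholders for my real URLs), though I don't know how closely it matches what the validator actually does:

import urllib.robotparser

# Placeholder URL; substitute the real domain.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# Should print True if the rules allow Twitterbot to fetch the card page.
print(rp.can_fetch("Twitterbot", "https://example.com/some-card-page"))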

Thanks!


#4

Fixed it! I wasn't paying attention: there was another robots.txt in the absolute root of the website, and that one was blocking the cards!
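
As far as I understand it, crawlers only ever request /robots.txt from the root of the host, so that root file is the one that counts. A quick way to see exactly what they get (example.com again as a placeholder):

import urllib.request

# Crawlers request robots.txt from the site root, so this is the
# file the validator actually reads. example.com is a placeholder.
with urllib.request.urlopen("https://example.com/robots.txt") as resp:
    print(resp.read().decode("utf-8"))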

Thanks anyway, ePirat!