ERROR: Fetching the page failed because it's denied by robots.txt

Tags: chrome, robots, cards

#1

Hey, guys.

I’m trying to get my Twitter cards working, but I’m not having any success.

My robots.txt:
User-agent: Twitterbot
Allow: *

User-agent: *
Disallow:

The URL I’m using to test: http://jessiebat.com/blog/23/

I’ve tried everything suggested in similar topics, but nothing seems to work. Can anybody help me?


#4

Hi @jessie_bat - when I look at http://jessiebat.com/robots.txt I see only:

User-agent: *
Disallow:

I’m not seeing the Twitterbot reference allowing access, unfortunately. It looks like that is the reason the validator cannot see your cards.
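One more thing worth checking once you redeploy: `Allow: *` is not the conventional robots.txt form; the standard way to allow every path is `Allow: /` (an empty `Disallow:` also permits everything). You can verify the rules you intend to publish before deploying them with Python's standard-library `urllib.robotparser` — a minimal sketch, assuming the rules from the first post with `Allow: /` swapped in for `Allow: *`:

```python
from urllib.robotparser import RobotFileParser

# The rules you intend to deploy. Note "Allow: /" rather than "Allow: *" --
# "/" is the conventional path prefix meaning "all paths".
rules = """\
User-agent: Twitterbot
Allow: /

User-agent: *
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check whether Twitterbot may fetch the page you are validating
print(parser.can_fetch("Twitterbot", "http://jessiebat.com/blog/23/"))
```

If this prints `False` for a rule set, a robots.txt-respecting crawler such as Twitterbot would be blocked from that URL.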


#5

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.