I am trying to implement the summary_large_image card type. I've added all of the tags according to Twitter's documentation. My robots.txt file contains the following:
User-agent: Twitterbot
Disallow:
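As I understand it, an empty Disallow under the Twitterbot user agent allows everything, and a quick check with Python's urllib.robotparser should confirm that these rules permit Twitterbot to fetch any page (example.com below is just a placeholder for my actual domain and page):

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL -- substitute the real domain and page here.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse robots.txt

# With "User-agent: Twitterbot" followed by an empty "Disallow:",
# this should print True for any page on the site.
print(rp.can_fetch("Twitterbot", "https://example.com/some-article"))
```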
However, it doesn't seem to matter what's in the file; I always get the same error from the validator:
ERROR: Fetching the page failed because it’s denied by robots.txt.
There are times when I'll get a successful response and the card will display, but if I try to validate again (with no changes made to the page or to robots.txt) I'll get the 'denied by robots.txt' error again.
I tweeted the URL right after receiving a successful response from the validator, and the card showed up as expected. Now, when I tweet the same URL, the card does not display at all.
The fact that I got a successful result, changed nothing, and then got the error above again lets me know that it's probably not a problem with my page, but a problem with the validator. I've even waited 24 hours (recommended by Twitter staff on another topic, which, by the way, is absolutely ridiculous), and there is still no change, so I don't believe it's a caching issue either.
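For what it's worth, a quick script like the one below (again with example.com standing in for my domain) can at least confirm whether the server returns robots.txt consistently when it's requested with Twitterbot's user agent, which would rule out flakiness on my end:

```python
import urllib.request

# Placeholder URL -- substitute the real domain here.
URL = "https://example.com/robots.txt"
HEADERS = {"User-Agent": "Twitterbot/1.0"}

# Request robots.txt several times; a changing status code or body would
# point at the server, while a stable response points at the validator.
for attempt in range(5):
    req = urllib.request.Request(URL, headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        print(f"attempt {attempt + 1}: status={resp.status}, bytes={len(body)}")
```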
Does anyone have any other suggestions? I've completed four other social network integrations in the time it's taking me to finish this one, so hopefully someone can imagine my frustration, feel sorry for me, and lend me a hand.
Thanks in advance!