Twitter cards once worked and now they say "Unable to render Card preview"


My site has been whitelisted and I’ve had no trouble with Twitter Cards in the past, but today I’ve been having problems.

When I attempt to validate the card it says: “Unable to render Card preview” and the log says ERROR: Fetching the page failed because it's denied by robots.txt.

My site is:

I have no access to robots.txt, but I’ve checked it and it looks OK.

Older pages that were tweeted with cards are now also no longer validating.

The page I am trying to validate now is:

Many thanks for any help you can offer.


Hey Andy,

Could this be down to an issue with my SSL cert?

I’ve checked online and there may be a problem with that.

Unfortunately I’m not very clued in on these types of things so any assistance would be greatly appreciated.



I’m pretty sure it isn’t related to a missing SSL certificate. I think the robots.txt error might be a misleading message too, since the file looks like it is set up correctly.

I’m a bit confused, because when I look through the page there are 7 instances of twitter:card (some of which specify summary, some summary_large_image), and the images referenced don’t seem to be accessible.
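One way to spot duplicated twitter:card tags is to run a saved copy of the page through a small stdlib parser. This is just a sketch; the HTML below is an illustrative stand-in for the real page, not its actual markup:

```python
from html.parser import HTMLParser

class TwitterCardCounter(HTMLParser):
    """Collects the content value of every twitter:card meta tag."""
    def __init__(self):
        super().__init__()
        self.cards = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") == "twitter:card":
                self.cards.append(d.get("content"))

# Stand-in HTML with duplicated tags, mimicking the problem described above
html = """
<head>
  <meta name="twitter:card" content="summary">
  <meta name="twitter:card" content="summary_large_image">
</head>
"""

parser = TwitterCardCounter()
parser.feed(html)
print(parser.cards)  # more than one entry means the tag is duplicated
```

If the list printed for the real page has more than one entry, the crawler has to guess which card to use, which can produce inconsistent results.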

I’m surprised that you’re saying nothing has changed and that this worked previously, because from what I’m seeing there isn’t any code on the page that is likely to work.


How odd!

I am using the Cargo Collective publishing platform; perhaps the issue lies there.

I would embed the necessary meta tags into each post that I would write. It had been working ok before:
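For reference, a per-post card block typically looks something like the following. The values here are placeholders for illustration, not the tags actually on the site:

```html
<!-- Illustrative Twitter Card meta tags; all values are placeholders -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@example">
<meta name="twitter:title" content="Post title">
<meta name="twitter:description" content="Short description of the post.">
<meta name="twitter:image" content="https://example.com/post-image.jpg">
```

Each post should contain exactly one such block; duplicating it per article can confuse the validator.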

But now it simply displays like this:

The only thing that I can think that has changed is that I’ve continued to add new cards for each article I write. Could it be an issue that I’ve overloaded the site?


I notice that sometimes cards will show up and sometimes they will not.

If you search for the domain through Twitter you can see this:

I think my robots.txt file is fine but it does have a crawl-delay of 2.

Do you think that might be affecting it?


That’s definitely weird! Unfortunately I’m not familiar with whether the cards crawler (Twitterbot) takes account of the crawl-delay directive.

I did just try your site in the validator with URLs that seem to render cards, and others that do not, and in both cases the validator reports that robots.txt is denying access, even though it looks good to me.

You could try experimenting by moving Twitterbot to a separate section of the file that doesn’t specify a crawl-delay, or just try removing the directive. Either way, I can’t explain why some cards render and others do not.
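As a sketch of that experiment, the robots.txt could be restructured along these lines. The rules shown are illustrative, not the site’s actual file:

```
# Give Twitterbot its own section with no crawl-delay
User-agent: Twitterbot
Disallow:

# All other crawlers keep the existing delay
User-agent: *
Crawl-delay: 2
Disallow:
```

An empty Disallow line permits everything, so Twitterbot gets full access without inheriting the crawl-delay from the catch-all section.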