How long does Twitter cache the robots.txt?


I modified the robots.txt on my site to allow Twitterbot to retrieve the images needed for Twitter cards, but it still errors out with a DENIED_BY_ROBOTSTXT error, probably because Twitter is caching the robots.txt file. How long does Twitter cache it for? And is there a way of forcing it to refetch it?


Looks like this didn’t get answered yet, and I’m having the same problem. Did it eventually work for you, Yoast?


Yeah it did. Probably caching for 24 hours or so.


Cool. I set my robots.txt file like so:

User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /go/

Hopefully that will work.
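For anyone else waiting out the cache: you can at least sanity-check rules like the ones above locally before Twitter refetches them. Here's a rough sketch using Python's built-in robots.txt parser; the rules mirror the file above, and the example URLs are hypothetical (this checks the rules themselves, not Twitter's crawler behavior):

```python
# Sketch: check which paths each crawler may fetch under a given
# robots.txt, using Python's stdlib parser. An empty "Disallow:" in
# the Twitterbot group means Twitterbot is allowed everywhere, while
# all other crawlers fall through to the wildcard group.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /go/
"""

parser = RobotFileParser()
parser.modified()  # mark the rules as fetched so can_fetch() doesn't treat them as missing
parser.parse(rules.splitlines())

# Twitterbot matches its own group, which allows everything.
print(parser.can_fetch("Twitterbot", "https://example.com/go/page"))   # True
# Other crawlers hit the wildcard group's Disallow: /go/ rule.
print(parser.can_fetch("Googlebot", "https://example.com/go/page"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
```

If the checks pass locally but the card validator still reports DENIED_BY_ROBOTSTXT, the stale cached copy on Twitter's side is the likely culprit.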


Currently the cache is about 24 hours, indeed.


It would be great if using the Twitter Card Preview Tool refreshed, or otherwise skipped, any caching. This is exactly what Facebook’s debugger does.


I agree, disabling the cache for the preview tool would be great.


Yeah, the heavy caching on robots.txt is pretty annoying :-/


Agreed. This is incredibly frustrating. Even if there were a simple “Clear robots.txt Cache” button that we could only use a few times every 24 hours, that would be sufficient. Anyone dealing with this issue and trying to debug it (especially less technical users who don’t understand what a ‘cache’ is) would likely give up out of frustration after 24 hours.


Yes, I’m having the same problem. Please help me.


Hi @twitter, I am having exactly the same problem. I double-checked everything in my robots.txt and found that nothing was blocking the page. It would be great if, as @regisgaughen said, we could skip the caching altogether.