How long does Twitter cache the robots.txt?


#1

I modified the robots.txt on lockergnome.com to allow Twitterbot to retrieve the images needed for Twitter cards, but it still errors out with a DENIED_BY_ROBOTSTXT error, probably because Twitter is caching the robots.txt file. How long does Twitter cache it? And is there a way to force a refetch?


#2

Looks like this didn’t get answered yet, and I’m having the same problem. Did it eventually work for you, Yoast?


#3

Yeah, it did. It was probably cached for 24 hours or so.


#4

Cool. I set my robots.txt file like so:

User-agent: Twitterbot
Disallow:

User-agent: *
Disallow: /go/

Hopefully that will work.
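
For anyone hitting this later: before waiting out the cache, you can at least sanity-check locally that the file parses the way you expect. Here's a minimal sketch using Python's standard-library robotparser; example.com is a placeholder for your own domain:

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (example.com is a placeholder)
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# can_fetch() applies the same group-matching rules a crawler should:
# with the rules above, the Twitterbot group allows everything,
# even though the * group blocks /go/
print(parser.can_fetch("Twitterbot", "https://example.com/go/page"))    # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/go/page"))  # False

This won't clear Twitter's cache, of course, but it rules out a syntax problem before you spend 24 hours waiting.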


#5

Currently the cache is about 24 hours, indeed.


#6

It would be great if using the Twitter Card Preview Tool (https://dev.twitter.com/docs/cards/preview) refreshed or otherwise skipped any caching. This is exactly what Facebook's debugger does (https://developers.facebook.com/tools/debug).


#7

I agree; disabling the cache for the preview tool would be great.


#8

Yeah, the heavy caching on robots.txt is pretty annoying :-/


#9

Agreed. This is incredibly frustrating. Even a simple “Clear robots.txt Cache” button that we could use only a few times every 24 hours would be sufficient. Anyone dealing with this issue and trying to debug it (especially less technical users who don’t understand what a ‘cache’ is) is likely to give up in frustration after 24 hours.


#10

Yes, I’m having the same problem. Please help me.


#11

Hi @twitter, I’m having exactly the same problem. I double-checked everything in my robots.txt and found that nothing was blocking the page. It would be great if, as @regisgaughen said, we could skip the caching altogether.