How to clear the Twitter card cache with the API?


I am looking for a way to clear the Twitter card cache using an API.

I am aware of Twitter's Card Validator tool, which helps clear the card cache.

I am looking for a way to integrate this process into my application. What I am looking for is something similar to what Facebook offers.

I did try a simple request like the one below, which is what happens in Twitter's tool, but I get a 403 (unauthorized) response. Is there a way to authorize the request below?

                    "url" : "",
                    "platform" : "Swift-12",
                    "authenticity_token" : "tkWubiFOndkChH58oJmophrLlVoQqbQmY3QZFTayFK6uq"


Any help is appreciated.


There is no API for clearing the card crawler cache, and the card validator is not intended to be used in this manner. Simply re-submitting the same URL to the validator will not refresh the cache, so this would not work.


Hi. This is an API capability that I also need. I’m writing a program that’ll help people update their Twitter feeds, and sometimes the cached content doesn’t match what’s being shared. How can we programmatically clear Twitter’s cache for a URL?


There is no way to do this.


Thanks for the reply. Does Twitter automatically flush the cache every few days?


Yes, there’s a re-crawl roughly every seven days - this is covered on the Troubleshooting page.


This has been an issue with Twitter for years. Are there any plans to implement functionality that would allow users to programmatically invalidate or update the cache, or force a re-scrape of the meta tags?


There are some suggestions for how to do this in the troubleshooting docs, but there are no additional features planned that I am aware of at this time.


Thanks for your reply!

I can’t find any information in the link you provided on how to programmatically update the cache at Twitter - can you tell me more precisely where to find this info?

(Creating a new URL is not a solution, since the full original URL is still cached, and users tweeting that URL will still see old data.)


It’s not a programmatic solution, unfortunately, but the guidance on the troubleshooting page suggests adding junk parameters (a query string or otherwise) to the end of the URL to cause the crawler to revisit the page. This is not intended as a way to regularly update the images, just to assist you while troubleshooting - the crawler delay and cache are not designed to be dynamically updated.
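That workaround can be sketched in a few lines of Python. The parameter name `cb` is an arbitrary choice for illustration; any junk parameter works, since the crawler treats the modified URL as new. Note that the original URL's cache entry is untouched - only the modified URL gets freshly crawled:

```python
# Sketch of the troubleshooting workaround: append a junk query
# parameter so the crawler sees a "new" URL and fetches fresh metadata.
import time
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def cache_bust(url: str) -> str:
    """Return the URL with an extra throwaway query parameter appended."""
    parts = urlparse(url)
    params = parse_qsl(parts.query)
    params.append(("cb", str(int(time.time()))))  # arbitrary junk value
    return urlunparse(parts._replace(query=urlencode(params)))
```

This only helps while debugging your card markup; anyone tweeting the original, unmodified URL will still see the stale cached card until the regular re-crawl happens.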