We periodically experience high traffic to our website from user agents identifying themselves as ‘Twitterbot/1.0’. These requests are numerous enough that they effectively act as a DDoS, occasionally slowing webserver response time to tens of seconds (even with multiple levels of caching). Sometimes the request groups contain many repetitions of the same URL, sometimes many different URLs.
The IP addresses involved are:
199.16.156.124
199.16.156.125
199.16.156.126
199.59.148.209
199.59.148.210
199.59.148.211
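For reference, this is how I pulled the per-IP counts out of the access log; a quick sketch assuming the common "combined" log format (client IP in the first field, user agent in the last quoted field), shown here against a tiny fabricated sample log rather than the real one:

```shell
# Fabricated sample access log for illustration; point at the real log in practice.
cat > /tmp/sample_access.log <<'EOF'
199.16.156.124 - - [01/Jan/2024:00:00:01 +0000] "GET /a HTTP/1.1" 200 123 "-" "Twitterbot/1.0"
199.16.156.124 - - [01/Jan/2024:00:00:02 +0000] "GET /a HTTP/1.1" 200 123 "-" "Twitterbot/1.0"
199.59.148.209 - - [01/Jan/2024:00:00:03 +0000] "GET /b HTTP/1.1" 200 123 "-" "Twitterbot/1.0"
10.0.0.1 - - [01/Jan/2024:00:00:04 +0000] "GET /c HTTP/1.1" 200 123 "-" "Mozilla/5.0"
EOF

# Count Twitterbot requests per source IP, busiest first.
grep 'Twitterbot' /tmp/sample_access.log | awk '{print $1}' | sort | uniq -c | sort -rn
```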
As I understand it, Twitterbot fetches the page contents in order to compose Twitter Cards, and we would like to allow that, but this is a bit much. Is there a way to get the Twitter servers to slow down their requests?
Note: the robots.txt file contains the line
Crawl-delay: 10
but this appears to be ignored.
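(That line currently sits in a catch-all group. Some crawlers only honor directives inside a group matching their own user agent, so a Twitterbot-specific group might be worth trying, though as far as I can tell Twitterbot is not documented to support Crawl-delay at all:

```
User-agent: Twitterbot
Crawl-delay: 10
```

I have not yet confirmed whether this makes any difference.)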
As an illustration, I will try to post a log of two minutes of Twitterbot requests as a reply to this message.
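In the meantime, the stopgap we are considering is a per-IP rate limit at the origin, keyed on the Twitterbot user agent so other clients are unaffected. A minimal sketch assuming nginx (the zone name, rate, and burst values are illustrative, not tuned):

```nginx
# http context: key requests by client IP only when the UA matches Twitterbot.
# An empty key is not counted by limit_req, so all other clients pass through.
map $http_user_agent $twitterbot_limit_key {
    default          "";
    "~*twitterbot"   $binary_remote_addr;
}

limit_req_zone $twitterbot_limit_key zone=twitterbot:10m rate=6r/m;

server {
    location / {
        # Allow a short burst, reject the excess with 429 Too Many Requests.
        limit_req zone=twitterbot burst=10 nodelay;
        limit_req_status 429;
        # existing proxy/static configuration goes here
    }
}
```

Serving 429 to the crawler is less useful than having Twitter actually slow down, though, which is why I am asking here.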