Rate limit on follower list



I’m writing an app that does some housekeeping on your list of followers, and to do this I need to load your entire follower list. To be thorough with my testing, I grabbed the follower list of a famous person who has over 3m followers and boom! I ran right into the 350-requests-per-hour limit. This makes sense, of course: 3m / 5k = 600 requests, which is way over the limit.

Even if I do my housekeeping on each 5k user block, the processing is going to be done in the blink of an eye and I’ll run into the rate limit again. Forcing the user to wait an hour to complete this work is not practical or desirable.
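For what it’s worth, the arithmetic does work out if the requests are spread evenly across the window: 3600 seconds / 350 requests is roughly a 10.3-second pause per request, so 600 pages would take about 103 minutes. A minimal pacing sketch (the `fetch_page` callable here is a hypothetical stand-in for whatever call actually hits the followers endpoint):

```python
import time

RATE_LIMIT = 350       # requests allowed per window
WINDOW_SECONDS = 3600  # one hour

# Minimum delay that keeps a steady stream of requests under the cap.
DELAY = WINDOW_SECONDS / RATE_LIMIT  # ~10.3 seconds between requests


def fetch_all_pages(fetch_page, total_pages, pause=time.sleep):
    """Fetch pages one at a time, pausing between requests so the
    hourly budget is never exhausted. `fetch_page` is a placeholder
    for the real API call; `pause` is injectable for testing."""
    results = []
    for page in range(total_pages):
        results.append(fetch_page(page))
        if page < total_pages - 1:
            pause(DELAY)
    return results
```

The trade-off is exactly the one you describe: the work completes without tripping the limit, but a 3m-follower account still takes the better part of two hours to walk.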

I thought about caching the list (which I do anyway), but at some point I have to fetch the latest list from Twitter to keep it up to date, and then I run into the same problem. There doesn’t seem to be a way of asking Twitter for just the ‘new followers since this date’, so I can’t gradually build my list over time (there also appears to be no ‘this person has unfollowed you’ API call, so I could never prune the list). Although the API currently returns the follower list ordered with most recent first, the docs say not to rely on this, as it may change at some point in the future — and even then it wouldn’t solve the problem of removing people who no longer follow you.
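Since there’s no unfollow notification in the API, the usual workaround is to keep your cached snapshot of follower IDs and diff it against each fresh full fetch. That doesn’t avoid the expensive re-fetch (which is the real problem here), but it does make pruning possible once you have two snapshots. A sketch:

```python
def diff_followers(cached_ids, latest_ids):
    """Compare a cached set of follower IDs against a freshly
    fetched one. Returns (new_followers, unfollowers) as sets.
    The IDs themselves can come from any paging strategy."""
    cached = set(cached_ids)
    latest = set(latest_ids)
    new_followers = latest - cached   # present now, absent before
    unfollowers = cached - latest     # present before, gone now
    return new_followers, unfollowers
```

With this, the cached list can be pruned and extended in one pass after each full refresh.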

Does this mean, then, that for users over a certain number of followers, they’ll be unable to use this app? Or am I missing something?

I’m new to the Twitter API, so any help would be greatly appreciated!



Apologies for bumping this, but I still haven’t had any brainwaves on how to handle people who have millions of followers.

Or should the approach I take be: Twitter accounts with that many followers are a tiny percentage of users, so it’s just tough luck for them?

Any thoughts would be greatly appreciated!



Hi @FSPDev,

I think you’ve thought through the issue sufficiently here – you’ll just have to spend more time processing results for users with large social graphs. The API is optimized for more typical scenarios in this case. API v1.1’s rate limiting is a little different in this regard, but “time” is still a critical element to doing any kind of deep analysis on large graphs.

While we discourage over-reliance, the ordering of the follow graph methods will remain the same for the foreseeable future.


Thank you - I really appreciate the reply.

One other question about the ‘hour’ used in the rate limit: is it clock-based? That is, does the 350-request limit apply between the hours of 3 and 4, with the request counter resetting to zero at 4, then again at 5, and so on?

Or is it some sort of rolling hour that’s tracked?
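The two schemes behave quite differently near the window boundary. This is a generic illustration of the difference, not a claim about how Twitter actually implements its limiter — a fixed window resets the counter at the top of each clock window, while a rolling window counts requests in the trailing hour:

```python
from collections import deque


class FixedWindowLimiter:
    """Counter resets at the start of each clock-aligned window."""

    def __init__(self, limit, window):
        self.limit, self.window = limit, window
        self.bucket, self.count = None, 0

    def allow(self, now):
        bucket = now // self.window  # which clock window `now` falls in
        if bucket != self.bucket:
            self.bucket, self.count = bucket, 0  # new window: reset
        if self.count < self.limit:
            self.count += 1
            return True
        return False


class RollingWindowLimiter:
    """Counts requests made in the trailing `window` seconds."""

    def __init__(self, limit, window):
        self.limit, self.window = limit, window
        self.times = deque()

    def allow(self, now):
        # Drop timestamps that have aged out of the trailing window.
        while self.times and now - self.times[0] >= self.window:
            self.times.popleft()
        if len(self.times) < self.limit:
            self.times.append(now)
            return True
        return False
```

Under a fixed window you could burst the full budget just before the boundary and again just after it; under a rolling window the budget only frees up as old requests age past the trailing hour.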


I don’t understand any of this at all