access limitation


Dear all,

Sorry if this has already been asked more than a thousand times, but unfortunately we have to react very quickly here, so I'd appreciate a compact answer…
We have to build an online application containing a continuously refreshing Twitter stream that shows tweets from various users as well as tweets with different hashtags. Currently I've combined two search requests (a group of users + hashtags), pushing their results into an array and rendering it, randomized, into a container afterwards. The stream pulls its dataset from "" and should refresh every three to five seconds…

Now to the issue:
We might end up with a huge number of online visitors during our peak time.
Let's say we have roughly 20,000 clients, each refreshing every five seconds with two requests, for approximately two hours. Per client that is (3600 / 5) × 2 requests × 2 hours = 2,880 requests, which makes 57,600,000 requests in total. To make matters worse, roughly 3,000 of those users might share the same IP range…
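For reference, the arithmetic above can be checked with a quick back-of-the-envelope sketch (the figures are just the ones assumed in this post):

```python
# Back-of-the-envelope check of the request volume described above.
clients = 20_000          # concurrent visitors at peak
refresh_seconds = 5       # one refresh every five seconds
requests_per_refresh = 2  # two search requests per refresh
hours = 2                 # length of the peak window

refreshes_per_client = (3600 // refresh_seconds) * hours            # 1440
requests_per_client = refreshes_per_client * requests_per_refresh   # 2880
total_requests = clients * requests_per_client                      # 57,600,000

print(requests_per_client, total_requests)
```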

Will we face any issues with that?

Are there any alternatives that can be used from a pure AJAX application?

Looking forward to any helpful answer.
Thanks a lot in advance!

Best regards


A pure AJAX application will be difficult to leverage for an integration like this – the best you can hope for is that each end user can keep making API requests client-side and still has enough rate limit available to them. Robust integrations use server-side APIs, likely a combination of the Streaming and REST APIs.



So you suggest it would make more sense to write a PHP script that pulls data from the API
and let the clients access the result? Do you think it would be feasible to run the
script as a cron job, have it write an XML file, and let the clients fetch that
XML file instead?
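That cached-file approach is a common pattern. As a rough sketch (in Python rather than PHP, but the idea is identical; the fetch functions are stubbed out and all names here are hypothetical): a scheduled job merges the two result sets, shuffles them, and writes one XML file that every client polls instead of hitting the API directly.

```python
import random
import xml.etree.ElementTree as ET

def merge_and_shuffle(user_tweets, hashtag_tweets, seed=None):
    """Combine both search results and randomize the order,
    as described in the original post."""
    combined = list(user_tweets) + list(hashtag_tweets)
    random.Random(seed).shuffle(combined)
    return combined

def tweets_to_xml(tweets):
    """Serialize the merged list to the XML file the clients will fetch."""
    root = ET.Element("tweets")
    for t in tweets:
        item = ET.SubElement(root, "tweet")
        item.set("user", t["user"])
        item.text = t["text"]
    return ET.tostring(root, encoding="unicode")

def refresh_cache(path, fetch_users, fetch_hashtags):
    # In the cron job, fetch_users() / fetch_hashtags() would call the
    # Twitter search API server-side; here they are injected stubs.
    merged = merge_and_shuffle(fetch_users(), fetch_hashtags())
    with open(path, "w", encoding="utf-8") as f:
        f.write(tweets_to_xml(merged))

# Cron (or any scheduler) would call refresh_cache() every few seconds;
# the clients only ever GET the static XML file.
```

With 20,000 clients all reading one static file, the Twitter API only sees the cron job's requests, so the load no longer scales with visitor count at all.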

Thanks again