Hey Twitter devs,
I am wondering what sort of options are available for those of us integrating Twitter REST data into custom analytics engines. The new rate limits of REST v1.1 seem to be well tuned for a user-centric Twitter app, and the new rates and limits are completely understandable from that point of view. If, however, an application needs to inspect a larger slice of the Twitter network, based on the interconnectedness of its users, it quickly runs into problems with rate limiting.
I understand that it is good practice to optimize your API calls with sound code and logic, and that this should be done along with caching. Even so, building a decent-sized network in a realistic amount of time (required for proper analytics) demands more than the 15 calls / 15 min available on many endpoints in REST v1.1. What does Twitter have in mind for the less user-centric applications that require the detailed data only the REST API provides?
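For context, here is a minimal sketch of the kind of call optimization I mean: a wrapper that caches responses and honors the `x-rate-limit-remaining` / `x-rate-limit-reset` headers Twitter v1.1 returns on rate-limited endpoints. The `fetch` callable and the URLs are placeholders; auth and error handling are omitted.

```python
import time


class RateLimitedClient:
    """Sketch of a rate-limit-aware wrapper for REST v1.1 calls.

    `fetch` is any callable taking a URL and returning (payload, headers);
    in practice it would be an OAuth-signed HTTP GET. `clock` and `sleep`
    are injectable so the behavior can be tested without real waiting.
    """

    def __init__(self, fetch, clock=time.time, sleep=time.sleep):
        self.fetch = fetch
        self.clock = clock
        self.sleep = sleep
        self.cache = {}  # url -> payload; a repeat URL costs no API call

    def get(self, url):
        # Caching conserves the scarcest resource: calls in the window.
        if url in self.cache:
            return self.cache[url]
        payload, headers = self.fetch(url)
        remaining = int(headers.get("x-rate-limit-remaining", 1))
        reset = int(headers.get("x-rate-limit-reset", 0))
        if remaining == 0:
            # Window exhausted: pause until the limit resets.
            self.sleep(max(0, reset - self.clock()))
        self.cache[url] = payload
        return payload
```

Even with this kind of discipline, 15 calls per window puts a hard ceiling on how fast a follower graph can be assembled.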
In several discussions, it has been hinted that applications requiring higher-volume REST limits should consult some of the Certified Twitter Products for access. The Certified Products Program, however, seems to consist largely of services built on the Streaming API rather than the REST API. Who, then, handles heftier REST API request volumes?
I am currently developing an analytics engine and would like to know whether I should be pursuing alternative means of discovering my network data. I am also curious how Twitter views analytics products newly in development.
Thanks in advance for any help or direction.