Best way to track the number of tweets about 400-600 different URLs?


#1

Hi, I'm working on an application that should track, for 400-600 URLs, how many times each of them appeared in a tweet in the last 2 days.
I know that I can make at most 180 search calls per 15 minutes, so with search calls alone I can only refresh the counts for all of my tracked URLs about every 50 minutes (600 URLs ÷ 180 calls per window ≈ 3.3 windows of 15 minutes).

I would like to bring the delay down to around 10-15 minutes. I could run the queries from different servers with different accounts, but maybe there is a better solution or some built-in API feature in the streaming API?
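
For concreteness, this is roughly what the plain polling loop looks like under that limit (just a sketch; `count_recent_tweets` stands in for a single search API call and would be wired to a real client):

```python
import time

CALLS_PER_WINDOW = 180     # search API rate limit per access token
WINDOW_SECONDS = 15 * 60   # length of one rate-limit window

def poll_all(urls, count_recent_tweets):
    """Refresh the tweet count for every URL with plain search calls.

    count_recent_tweets(url) stands for one search request (hypothetical,
    plug in your own client). With 600 URLs and 180 calls per 15-minute
    window, a full pass takes roughly 50 minutes, which is the delay
    described above.
    """
    counts = {}
    for start in range(0, len(urls), CALLS_PER_WINDOW):
        batch = urls[start:start + CALLS_PER_WINDOW]
        for url in batch:
            counts[url] = count_recent_tweets(url)
        if start + CALLS_PER_WINDOW < len(urls):
            time.sleep(WINDOW_SECONDS)  # wait out the rest of the window
    return counts
```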


#2

Well, I guess the best way is to develop an intelligent ranking system that monitors URLs with high sharing activity on Twitter more frequently than the ones with low activity, reducing the number of unnecessary search calls. I will see how my prototype behaves over the next few days.
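
Roughly what I have in mind, as a sketch (the interval bounds and the 50-tweet cutoff are arbitrary placeholders, and rate-limit bookkeeping is left out):

```python
import heapq
import time

MIN_INTERVAL = 10 * 60   # "hot" URLs: re-check every ~10 minutes
MAX_INTERVAL = 60 * 60   # "cold" URLs: re-check at most every hour

def next_interval(new_tweets):
    """More new tweets since the last poll => shorter wait until the next one."""
    if new_tweets <= 0:
        return MAX_INTERVAL
    # linear interpolation between the two bounds, capped at 50 new tweets
    return MAX_INTERVAL - (MAX_INTERVAL - MIN_INTERVAL) * min(new_tweets, 50) / 50

def run(urls, count_recent_tweets):
    """count_recent_tweets(url) is assumed to be one search call.

    A heap keeps the URL that is due next on top, so low-activity URLs
    naturally consume fewer calls than high-activity ones.
    """
    last_counts = {u: 0 for u in urls}
    queue = [(time.time(), u) for u in urls]   # (next due time, url)
    heapq.heapify(queue)
    while queue:
        due, url = heapq.heappop(queue)
        time.sleep(max(0.0, due - time.time()))
        total = count_recent_tweets(url)
        new = total - last_counts[url]
        last_counts[url] = total
        heapq.heappush(queue, (time.time() + next_interval(new), url))
```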

I tried tracking the “hottest” links live with the streaming API, but using the streaming API instead of searches doesn’t seem to be easy, because the URLs are often too long to be accepted as track terms. I could track domains, but I guess that would be useless because it would miss shortened URLs. I also tried shortening my URLs with Bitly before tracking them, but Twitter doesn’t seem to recognize those and doesn’t resolve them to the real URL, so I would miss all tweets that use other shortened aliases.
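
To make the matching side concrete, here is a minimal sketch of checking a streamed status against the tracked list via the `entities.urls[].expanded_url` field (the watched set and the normalization are placeholders; note that if the author pasted a Bitly alias rather than the full link, `expanded_url` typically only resolves the t.co wrapper one hop, so that case would still need a separate resolution step):

```python
# Minimal client-side matcher for statuses coming off the streaming API,
# assuming the tracked domains are used as track terms and each status is
# already parsed into a dict.  The watched set below is a placeholder.

WATCHED = {
    "http://example.com/articles/1",
    "http://example.com/articles/2",
}

def normalize(url):
    # naive normalization; a real version might strip query strings, fragments, etc.
    return url.rstrip("/").lower()

WATCHED_NORM = {normalize(u) for u in WATCHED}

def matched_urls(status):
    """Return the tracked URLs this status links to, if any."""
    hits = []
    for ent in status.get("entities", {}).get("urls", []):
        expanded = ent.get("expanded_url") or ent.get("url")
        if expanded and normalize(expanded) in WATCHED_NORM:
            hits.append(expanded)
    return hits
```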

Does anyone have any good ideas on how I could solve this in a simpler fashion?