Historical PowerTrack - running estimates (job quotes) - maximum per day?


#1

Hi guys,

I’m not entirely sure how Gnip counts the maximum number of estimates per day for Historical PowerTrack.

The reference says, under “Rate Limits”:

  • A maximum of 60 Jobs can be created per (UTC) day.
  • A maximum of 30 Jobs can be created per hour.
  • A maximum of 2 Jobs can be estimating concurrently.
  • A maximum of 2 Jobs can be running concurrently.

http://support.gnip.com/apis/historical_api2.0/api_reference.html

Now, this does not seem to be true. If I hit 60 estimates, I am blocked, and an estimate “slot” only frees up once 24 hours have passed since each earlier estimate was created. In other words, the limit seems to behave like a rolling 24-hour window rather than a per-(UTC)-day counter.

Example: suppose you run

  • 1 query at 12pm
  • 5 queries at 1pm
  • 20 queries at 2pm
  • 4 queries at 3pm
  • 30 queries at 4pm

Then you cannot run another query until 12pm the next day, when 1 slot frees up (so you can only run 1 query until 1pm, when 5 more free up).
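
For what it’s worth, here is a minimal Python sketch of the rolling-window behavior I seem to be observing. This is just my guess at the logic, not anything from the documentation:

```python
from collections import deque
from datetime import datetime, timedelta

MAX_JOBS = 60                 # the documented per-day limit
WINDOW = timedelta(hours=24)  # assumed rolling window (my guess)

class RollingWindowLimiter:
    """Allow at most MAX_JOBS estimates in any rolling 24-hour window."""

    def __init__(self):
        self.created = deque()  # creation times of recent estimates

    def try_create(self, now: datetime) -> bool:
        # Estimates older than 24h age out, freeing one "slot" each.
        while self.created and now - self.created[0] >= WINDOW:
            self.created.popleft()
        if len(self.created) >= MAX_JOBS:
            return False  # blocked until the oldest estimate ages out
        self.created.append(now)
        return True
```

Under that model, after the 60th query at 4pm you stay blocked until 12pm the next day (when the single 12pm query ages out), and only 1 query is possible until 1pm, exactly as in the example above.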

Could anybody clarify what’s going on? Is the reference not being followed in practice, is it just not spelled out completely, or am I missing something when reading through it? :slight_smile:

P.S. I am specifically counting only estimates as Jobs here, not accepted Jobs.

Best regards!


#2

Historical PowerTrack estimates should not be used in this way; it’s not the intended use case for the product, and it is against the terms of service. I would strongly discourage this usage going forward.


#3

Alright, will do. Thank you


#4

Could you please let me know what the alternative is, then? How can I get an estimate and develop a query string before harvesting?


#5

Hello @je_nicolo,

It depends on which of our products you have at your disposal. I suggest you try one of our search APIs, but do keep in mind that certain operators work with PowerTrack products and not with Search. You can find a list of our operators by product here:
https://developer.twitter.com/en/docs/tweets/rules-and-filtering/overview/operators-by-product


#6

Hi,

Thanks LeBraat! Could you please also help by answering the following questions? :slight_smile:

  • Do I have default access to the Search API as a subscriber to the Historical PowerTrack API?
  • How do I gain access to the Search API?
  • Can I estimate the historical volume of Tweets using the Search API?

Best regards


#7

While you do have free access to the Standard Search API, you will have to apply for access to the Premium Search APIs and talk to an account manager about the Enterprise Search APIs.

You can find a breakdown of the different products, and details on the Standard API here:
https://developer.twitter.com/en/docs/tweets/search/overview

Apply for Premium access here:
https://developer.twitter.com/en/premium-apis

And Enterprise here:
https://developer.twitter.com/en/enterprise

Here are some details from our documentation on pulling estimates using our Enterprise level APIs:

Full-Archive Search provides a ‘counts’ endpoint that is used to generate a minutely, hourly, or daily time series of matching Tweets. For use cases that benefit from knowing about data volumes, in addition to the actual data, the Full-Archive Search ‘counts’ endpoint is the tool of choice. Note that the ‘counts’ endpoint is a measure of pre-compliant matched Tweets. Pre-compliant means the Tweet totals do not take into account deleted and protected Tweets. Data requests will not include deleted or private Tweets.
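
As a rough illustration, a Full-Archive Search counts request looks something like the sketch below. The account name, label, and credentials are placeholders; check the Enterprise Search docs for the exact request format:

```python
import requests

# Placeholders -- substitute your enterprise account name, stream label,
# and console credentials.
ACCOUNT = "my-account"
LABEL = "prod"
URL = (f"https://gnip-api.twitter.com/search/fullarchive"
       f"/accounts/{ACCOUNT}/{LABEL}/counts.json")

payload = {
    "query": "snow",             # example rule
    "fromDate": "201801010000",  # YYYYMMDDhhmm, UTC
    "toDate": "201802010000",
    "bucket": "day",             # "minute", "hour", or "day"
}

resp = requests.post(URL, auth=("user@example.com", "password"), json=payload)
resp.raise_for_status()
for result in resp.json()["results"]:
    print(result["timePeriod"], result["count"])
```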

The Historical PowerTrack API provides an order-of-magnitude estimate for the number of Tweets a Job will match. These estimates are based on a sampling of the time period to be covered, and should be treated as a directionally accurate guide to the amount of data a historical Job will return. A Historical PowerTrack estimate will help answer whether a Job will match 100,000 or 1,000,000 Tweets. The goal is to provide reasonable expectations around the amount of data a request will return, and the Historical PowerTrack API should not be used as a counting tool.
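
For reference, the normal Historical PowerTrack flow is to create a Job, wait for its quote, and then accept or reject it. A rough sketch follows; the endpoint and field names reflect the HPT 2.0 reference linked above, but treat the exact payload as an assumption and verify it there:

```python
import requests

ACCOUNT = "my-account"                    # placeholder account name
AUTH = ("user@example.com", "password")   # placeholder console credentials
JOBS_URL = ("https://gnip-api.twitter.com/historical/powertrack"
            f"/accounts/{ACCOUNT}/publishers/twitter/jobs.json")

job = {
    "title": "snow-jan-2018",
    "fromDate": "201801010000",  # YYYYMMDDhhmm, UTC
    "toDate": "201802010000",
    "dataFormat": "activity-streams",
    "rules": [{"value": "snow", "tag": "example"}],
}

# Creating the Job kicks off the estimate; poll the returned jobURL until
# the quote appears, then accept or reject the Job.
created = requests.post(JOBS_URL, auth=AUTH, json=job)
created.raise_for_status()
job_url = created.json()["jobURL"]

status = requests.get(job_url, auth=AUTH).json()
quote = status.get("quote")
if quote:
    # Order-of-magnitude guidance only, not an exact Tweet count.
    print(quote["estimatedActivityCount"], quote["estimatedFileSizeMb"])
```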

We do currently have a counts endpoint for the 30-Day Premium API, but have yet to release a Full-Archive Premium version.
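
If you do gain Premium access, the 30-Day counts request is similar in shape but uses bearer-token auth and your dev environment label in the path. A minimal sketch (the environment label and token are placeholders):

```python
import requests

ENV = "dev"  # placeholder: your premium dev environment label
URL = f"https://api.twitter.com/1.1/tweets/search/30day/{ENV}/counts.json"

payload = {"query": "snow", "bucket": "day"}
headers = {"Authorization": "Bearer <YOUR_BEARER_TOKEN>"}

resp = requests.post(URL, json=payload, headers=headers)
resp.raise_for_status()
print(resp.json()["totalCount"])
```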


#8

Thank you!
The application pages are useful!