Discussing API v1.1


#1

Hi Developers,

I’m sure you’ve read @jasoncosta’s [node:10635, title=“blog post”] announcing the new version of the API. This thread is to collect a few loose ends, upcoming improvements, and known errata.

  • Look for the blue pills that will indicate what version of content you’re looking at. [version:1.1]
  • Check out the [node:10639] for a high-level summary on what’s changed.
  • At this time the entire 1.1 API requires a user context when authenticating with OAuth 1.0a: an oauth_token representing an access token must be present in each request (see the sketch after this list). In the near future we’ll offer an additional, application-only (“userless”) form of authentication for many methods.
  • Write operations are handled the same way as they’ve always been in API version 1.0: accounts have specific allowances for tweeting, sending direct messages, and so on, and those are still intrinsic to the account and handled similarly. In the near future the same per-method rate limiter will also handle write operations, allowing for more dynamic and generous limiting on content creation and modification.
  • While the majority of 1.1’s documentation is present, you’ll find us continuing to update much of the documentation throughout the next few weeks. Report any documentation errors you find for API v1.1 [alias:/issues, title=“here”] or in this thread.
  • If you’ve been experiencing rate limiting on widgets at your corporate office or other shared network, you’ll definitely want to check out our [node:10248, title=“new embedded timelines”].
  • API v1.1 has a concept that hasn’t been well-documented yet: “user entities” – these are primarily to identify and resolve t.co links in user objects. For most developers experienced with entities, the meaning of these fields should be clear. Expect explicit documentation on this soon.
  • Some methods may still support a “per_page” parameter when “count” is the parameter we intend. These are few and far between and will be rectified in the coming weeks.
  • Occasionally you may see a documentation page flagged as possibly containing deprecated information. Unless it’s a 1.0 resource method, treat that as a warning and use your own judgement about whether the documentation conflicts with the version 1.1 world.
  • Let us know if you see any issues with error response conditions – whether they occur unexpectedly, the response format is not JSON, or the status line and payload are unclear. We’ll be improving the [node:130] documentation to include more information on the better-structured errors you’ll encounter in 1.1.
  • You may find it difficult to properly tweet with /1.1/statuses/update_with_media at the moment. (UPDATE: this has been resolved; use https://api.twitter.com/1.1/statuses/update_with_media.json for this operation.)
  • The Search API in 1.1 makes a few references to its deprecated page parameter. It also may occasionally be missing user objects. Fixes forthcoming. (UPDATE: 1.1 Search now has proper retweeted_status objects, user objects, and improved search_metadata – see [node:513].)
  • We’re still updating our [alias:/docs/faq, hash=“rest-api-v11”, title=“API v1.1 FAQ”]. Please keep the questions coming!
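
To illustrate the user-context requirement and the structured errors mentioned above, here’s a minimal sketch in Python. It relies on the third-party requests and requests_oauthlib libraries (not official sample code), and the four credential strings are placeholders for your own app keys and access token.

```python
# Minimal sketch: a user-context OAuth 1.0a request against a 1.1 endpoint.
# Assumes the third-party `requests` + `requests_oauthlib` packages; the four
# credential strings below are placeholders, not real keys.
import requests
from requests_oauthlib import OAuth1

auth = OAuth1(
    "CONSUMER_KEY", "CONSUMER_SECRET",       # your application's keys
    "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",   # the user-context access token 1.1 requires
)

resp = requests.get(
    "https://api.twitter.com/1.1/statuses/home_timeline.json",
    auth=auth,
    params={"count": 20},
)

if resp.status_code == 200:
    for tweet in resp.json():
        print(tweet["id_str"], tweet["text"])
else:
    # 1.1's structured errors arrive as JSON: {"errors": [{"code": ..., "message": ...}]}
    for err in resp.json().get("errors", []):
        print(err["code"], err["message"])
```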

The entire platform relations team is excited to work with you on API v1.1, the first major upgrade the API has seen in some time.

Thanks!
Taylor Singletary
@episod


#2

The rate limits for followers/ids and friends/ids really are too low for processing larger Twitter accounts. With this rate limit, one can retrieve a maximum of 300,000 ids per hour, or 75,000 every 15 minutes.

Any chance of this being bumped up to the same rate limit as users/lookup?

It also raises the question: for what duration is a cursor valid? If 15 batches of 5,000 ids are retrieved in Minute-#1 of 15-Minute-Period-#1, will the next_cursor still be valid in Minute-#1 of 15-Minute-Period-#2?
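
For context, the paging pattern I have in mind looks roughly like this (a Python sketch using the third-party requests library; the OAuth1 `auth` object and error handling are left out for brevity):

```python
# Rough sketch of cursored paging over followers/ids: each call returns up to
# 5,000 ids plus a next_cursor, so 15 calls per window yields at most 75,000 ids
# before having to wait for the next 15-minute window.
import requests

def follower_ids(screen_name, auth, max_calls=15):
    ids, cursor = [], -1
    for _ in range(max_calls):
        resp = requests.get(
            "https://api.twitter.com/1.1/followers/ids.json",
            auth=auth,
            params={"screen_name": screen_name, "cursor": cursor, "count": 5000},
        )
        data = resp.json()
        ids.extend(data["ids"])
        cursor = data["next_cursor"]
        if cursor == 0:        # 0 means no further pages
            break
    return ids, cursor         # a non-zero cursor means more ids remain to fetch
```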


#3

Is it 60 calls per endpoint per hour per user per app?

The reason I’m asking is that if I’m a user and have TweetDeck on my phone, the Twitter app on my iPad, and something else open on my desktop, all querying my timeline endpoint, do they all share the 60/hr limit? That’s only 20/hr/app if I’ve got 3 apps, right?

Also, with the Streaming API, I seem to be able to connect to the 1.1 API using only username & password (like with the 1.0 API). Is this a bug, or will the Streaming API continue to not require OAuth?


#4

In “GET users/lookup”, you state:

“You are strongly encouraged to use a POST for larger (up to 100 screen names) requests.”

If that’s the case, why even support and document GET? Why not just support and document POST? There’s another API call (GET users/show) if you just want to GET a single user. My guess is that most applications will be using this to acquire large lists of user objects, 100 at a time.


#5

The “recently updated” page contains 7 pages of results for September 4th and 5th, with no indication of what the deltas are:

https://dev.twitter.com/docs/recent

It would be really handy to know how each API endpoint changed. Is it just the rate limit? Or did the payload get some new information? Are there new parameters for the GET & POST? Are there any new endpoints? What endpoints have gone away?

Or are the changes limited to what’s been stated in the overview?

https://dev.twitter.com/docs/api/1.1/overview

Finally, I don’t see any mention of what the new base URL will be: is it “https://api.twitter.com/1.1/”? And is this endpoint live so we can start testing against it?


#6

How can we see how many user tokens we have out? Also, is there a way to proactively expire them from the app side? If we have 120k tokens out there but only 25% active users, then unless the non-active users remove our token on the Twitter.com side, we would be out of luck.


#7

At this time there’s no easy way to determine what your current token count is on an application. We’ll provide an easier way to make this determination in the future.

There’s currently no way to revoke access on behalf of your users.


#8

Consider that every resource in API v1.1 has changed in some way, and read the documentation for the resources you make use of. In some cases the response body may have changed slightly; in other cases there’s a per-endpoint rate limit defined where there wasn’t one before; in some cases the resource URL itself has changed; in some cases previously available methods no longer exist. Many changes in the API were across the board for all methods that support entities or retweets (most objects support these). API v1.1 is a mostly new API and should be evaluated with new eyes.

API v1.1 is live. The methods documented in https://dev.twitter.com/docs/api/1.1 are the methods that exist in API v1.1 and can all be found at https://api.twitter.com/1.1/* – except for upload.twitter.com/1.1/statuses/update_with_media.


#9

This also jumped out at me - it seems like pulling down social graph information has been severely limited for large accounts.

We’ve gone from 1,750,000 followers+following per hour to 75,000 followers + 75,000 following per 15 mins.

Is this correct?


#10

GETs are still easier to perform for many users. There are many use cases for all of our APIs, and while most bulk uses of users/lookup will want to use POST, there are lighter uses where the simplicity of a GET may be preferred.
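
For anyone who wants the POST form, a minimal sketch (using the third-party requests and requests_oauthlib libraries rather than official sample code, with placeholder credentials) looks something like this:

```python
# Sketch of the POST form of users/lookup: up to 100 comma-separated screen names
# go in the request body instead of the query string.
import requests
from requests_oauthlib import OAuth1

# Placeholder credentials; substitute your app's keys and a user's access token.
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

screen_names = ["twitterapi", "twitter"]   # up to 100 per request
resp = requests.post(
    "https://api.twitter.com/1.1/users/lookup.json",
    auth=auth,
    data={"screen_name": ",".join(screen_names)},
)
users = resp.json()                        # a list of user objects on success
```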


#11

This would be really helpful 🙂


#12

Hi, I have a question about the changes.

I currently have a subscription to the RSS feed http://twitter.com/statuses/user_timeline/bitoclass.rss, which I use to keep a local archive of all my tweets (I never think it sensible to put any data, photos or anything else only in the cloud without a local backup).

So, does your API change only affect developers, or will my end-user RSS subscription cease to work on 5 March 2013 as well?

Thanks!


#13

“API v1.1 doesn’t limit by IP address?”

All of the rate limits are in 15 minute windows. It’s best to think of them within those groups of time instead of as hourly or minutely limits.

Unlike in API v1, all rate limiting is per user, per app – IP address plays no part in rate limiting. Rate limits are completely partitioned between applications.

The Streaming API still allows basic auth. It won’t forever. Use OAuth for better future proofing.
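
If you want to track your remaining allowance for a given endpoint programmatically, a small sketch along these lines (Python with the third-party requests library; the OAuth1 `auth` object is passed in) reads the per-window headers on 1.1 responses and waits for the window to reset when the allowance runs out:

```python
# Sketch: honor the 15-minute windows by reading the x-rate-limit-* response
# headers; sleep until the window resets when the remaining allowance hits zero
# or the API answers 429 (Too Many Requests).
import time
import requests

def get_with_backoff(url, auth, **params):
    resp = requests.get(url, auth=auth, params=params)
    remaining = int(resp.headers.get("x-rate-limit-remaining", "1"))
    if resp.status_code == 429 or remaining == 0:
        reset_at = int(resp.headers.get("x-rate-limit-reset", time.time() + 900))
        time.sleep(max(reset_at - time.time(), 0) + 1)   # wait out the window
    return resp
```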


#14

URLs of that style are pretty outdated now. RSS as an output format is considered part of the API, and that URL is what we consider a “version 0” URL. It will also cease functioning on March 5, 2013.

RSS and Atom formats are relatively unused, and the OAuth requirement for all endpoints makes utilizing them difficult (or nearly impossible) for end users. Tweets are better represented by the complete metadata in the JSON responses available to you.


#15

On the whole these changes look pretty good. Thanks for the much friendlier wording in the blog posts!

I’m a bit worried about the limitations in loading social graph data. It looks like large accounts will have to load significantly more slowly than before. The rate limits on users/lookup also mean caching will be more important than ever for loading smaller accounts. Often users/lookup was the bottleneck in loading small accounts.

On the whole though, there are no show-stoppers. The rate limiting on search is pretty exciting: making it per-account removes a lot of the uncertainty around per-IP limits.

I do have one question at this point. In the Developer Rules of the Road it states “If your application will eventually need more than 1 million user tokens, or you expect your embedded Tweets and embedded timelines to exceed 10 million daily impressions, you will need to talk to us directly about your access to the Twitter API as you may be subject to additional terms.” How do we do this - who should we contact? Is there any information available about the additional terms?

Thanks Taylor.


#16

Thank you for your quick reply, much appreciated!

Pasting the above RSS URL into a ‘subscribe to feed’ box in an RSS reader couldn’t be further from “nearly impossible” for me, and I’m an end-user. I’m not really sure what the tech jargon about “endpoints” etc means but I’m very disappointed you’re turning off this functionality.

So just to be certain I’m understanding this, the consequence is that there will from March be no way for me to subscribe to my own tweets in an RSS reader in order to keep an offline copy of them? This is literally (really) the worst news I’ve ever heard about Twitter.


#17

Hello,

To construct a user’s timeline up to the maximum of 3,200 available Tweets using “statuses/user_timeline”, 16 calls would be needed (200 Tweets returned per call at most). To my understanding, this is now limited to 15 calls per 15 minutes. Does this make it impossible to pull down all available Tweets when first encountering a user without hitting the rate limit? What is the guidance here?
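
For context, the paging I have in mind looks roughly like this (a Python sketch using the third-party requests library; the OAuth1 `auth` object is assumed to be built already):

```python
# Sketch of walking a user timeline with max_id: up to 16 calls of 200 Tweets
# reach the ~3,200-Tweet ceiling, so with a 15-calls-per-window budget the last
# page has to wait for the next 15-minute window (or be fetched lazily).
import requests

def fetch_timeline(screen_name, auth, calls=15):
    tweets, max_id = [], None
    for _ in range(calls):
        params = {"screen_name": screen_name, "count": 200, "include_rts": 1}
        if max_id is not None:
            params["max_id"] = max_id
        page = requests.get(
            "https://api.twitter.com/1.1/statuses/user_timeline.json",
            auth=auth, params=params,
        ).json()
        if not page:
            break
        tweets.extend(page)
        max_id = page[-1]["id"] - 1    # step just below the oldest Tweet seen
    return tweets
```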

Thanks.


#18

Thanks Taylor, looking at it in detail now.

Is the endpoint for OAuth requests still “https://api.twitter.com/oauth/”, or should this be something like “https://api.twitter.com/1.1/oauth”?


#19

Does this mean the end of the IP address-based whitelisting that was previously grandfathered? I’m referring to the 20K/hour rate that older apps might enjoy.


#20

Thanks, Craig. I know it’s a lot to take in, and overall it’s pretty similar to 1.0 and the pre-1.0 days, but there are enough subtle differences that I think it’s worthwhile to dive in completely. Happy to help you work through any specific nuances you have questions about.

The process of obtaining authorization and negotiating OAuth tokens is independent of REST API versioning, and you needn’t (and shouldn’t) add a version element to the path when accessing it. Stick with https://api.twitter.com/oauth/* for all of those operations, whether you ultimately make 1.0 or 1.1 API calls. The access tokens they yield are the same regardless.
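
For completeness, here’s a condensed sketch of the PIN-based flow against those unversioned endpoints, using the third-party requests_oauthlib library (the consumer keys are placeholders):

```python
# Sketch of the 3-legged OAuth flow: note that the oauth/* endpoints carry no
# API version in their paths, and the resulting token works for 1.0 and 1.1 calls.
from requests_oauthlib import OAuth1Session

oauth = OAuth1Session("CONSUMER_KEY", client_secret="CONSUMER_SECRET", callback_uri="oob")
oauth.fetch_request_token("https://api.twitter.com/oauth/request_token")
print("Authorize at:", oauth.authorization_url("https://api.twitter.com/oauth/authorize"))

verifier = input("PIN: ")
tokens = oauth.fetch_access_token("https://api.twitter.com/oauth/access_token",
                                  verifier=verifier)
print(tokens["oauth_token"], tokens["oauth_token_secret"])
```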