Hello,
I would like to pass a tweet's place ID to the geo/id/:place_id API in R, but I am continually getting error 215.
I have followed Twitter's documentation to no avail and wanted to see what I was missing.
```r
require(httr)
require(jsonlite)
require(dplyr)

bearer_token <- Sys.getenv("WORKING TOKEN HERE")
headers <- c(Authorization = sprintf("Bearer %s", bearer_token))

response <- httr::GET(
  url = "https://api.twitter.com/1.1/geo/id/df51dec6f4ee2b2c.json",
  httr::add_headers(.headers = headers)
)
obj <- httr::content(response, as = "text")
print(obj)
```
I am continually getting this error.
```
[1] "{\"errors\":[{\"message\":\"Bad Authentication data\",\"code\":215}]}\n"
```
What have I missed?
The authentication is wrong for this endpoint (GET geo/id/:place_id | Docs | Twitter Developer Platform): you need to use the API key and secret (also called consumer keys) together with an access token and secret, not a bearer token.
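In httr this means signing the request with OAuth 1.0a user context instead of sending a bearer header. A minimal sketch, assuming you have consumer keys from the developer portal (the key/secret values below are placeholders):

```r
library(httr)

# Register your app's consumer keys (placeholders, not real credentials)
app <- oauth_app("twitter",
                 key = "CONSUMER_KEY_HERE",
                 secret = "CONSUMER_SECRET_HERE")

# Interactive OAuth 1.0a flow: opens a browser once and caches the token
twitter_token <- oauth1.0_token(oauth_endpoints("twitter"), app)

# The signed request now carries user context, which this endpoint requires
response <- GET("https://api.twitter.com/1.1/geo/id/df51dec6f4ee2b2c.json",
                config(token = twitter_token))
content(response, as = "parsed")$full_name
```

This requires an interactive session for the first run; after that httr reuses the cached token.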
Separately, I'm wondering why you need to do this. Are you trying to fill in "missing" data after retrieving tweets from the search endpoint? If you kept the original responses, the place details from expansions are inside the includes portion of the response, so you generally do not need to request them all again, unless you were missing the right fields in the first place.
Thanks Igor,
I have run a package (academictwitteR) that will get me the following fields.
id,text,author_id,geo.place_id,lang,created_at
Now that I have the geo.place_id, I want to dynamically get the actual location from that JSON.
For example,
"full_name": "Willunga, South Australia",
I want to put this field against the other fields to map rough user location.
Does this make sense?
There is no need to do that if you have the original responses from the API. I think it's an issue with the library (still extraction of $places from jsons · Issue #175 · cjbarrie/academictwitteR · GitHub), so it should be possible to extract it from the search results without making more API calls.
Thanks for highlighting that.
I am still looking for an automated way to get the location; if this library won't assist, would you know another way?
I have over 2M tweets to analyse, with more on the way, each with a geo.place_id.
I'm still not clear on something with that R script: are you keeping the original JSON responses somewhere? If so, the data is probably already there and you can extract it without more API calls. If geo wasn't specified in expansions, the data would be missing and you would have to re-download either the tweets by ID or the places with that other API.
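If the raw v2 responses were kept, the join can be done locally. A minimal sketch, using an inline JSON string to stand in for a saved response (the shape mirrors the v2 search payload, with place details under includes$places):

```r
library(jsonlite)
library(dplyr)

# Stand-in for a saved search response; in practice you'd read your file,
# e.g. resp <- fromJSON("raw_response.json")
resp <- fromJSON('{
  "data": [{"id": "1", "text": "hi",
            "geo": {"place_id": "0073b76548e5984f"}}],
  "includes": {"places": [{"id": "0073b76548e5984f",
                           "full_name": "Willunga, South Australia"}]}
}')

tweets <- resp$data              # id, text, geo$place_id, ...
places <- resp$includes$places  # id, full_name, ...

# Attach the place name to each tweet by matching place IDs
tweets_with_place <- tweets %>%
  mutate(place_id = geo$place_id) %>%
  left_join(places %>% select(id, full_name),
            by = c("place_id" = "id"))
```

No extra API calls are needed as long as the original responses requested the geo.place_id expansion.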
Hi Igor,
I'll post my solution for those who might come across this thread.
```r
require(httr)
require(jsonlite)
require(dplyr)

bearer_token <- "BEARER_TOKEN_HERE"
headers <- c(Authorization = sprintf("Bearer %s", bearer_token))

geoLoc <- "0073b76548e5984f"
url_handle <- sprintf("https://api.twitter.com/1.1/geo/id/%s.json", geoLoc)

response <- httr::GET(url = url_handle,
                      httr::add_headers(.headers = headers))
obj <- httr::content(response, as = "text")
print(obj)

recent_search_body <- content(
  response,
  as = "parsed",
  type = "application/json",
  simplifyDataFrame = TRUE
)
recent_search_body$full_name
```
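For a corpus of 2M tweets it may help to deduplicate the place IDs first and look each one up only once, throttling between calls. A minimal sketch building on the code above; the helper name, the `tweets` data frame, and the one-second pause are assumptions (check the endpoint's documented rate limit before running at scale):

```r
library(httr)

# Look up one place ID and return its full_name (sketch, not hardened)
lookup_place <- function(place_id, headers) {
  url <- sprintf("https://api.twitter.com/1.1/geo/id/%s.json", place_id)
  resp <- httr::GET(url, httr::add_headers(.headers = headers))
  body <- httr::content(resp, as = "parsed", type = "application/json")
  body$full_name
}

# 2M tweets usually share far fewer distinct places, so resolve each
# distinct ID once and then join the names back onto the tweet table
place_ids <- unique(tweets$geo.place_id)
place_names <- vapply(place_ids, function(id) {
  Sys.sleep(1)  # crude throttle; tune to the actual rate limit
  lookup_place(id, headers)
}, character(1))
```

The resulting named vector can then be matched back onto the full tweet table without any further API calls.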