I am trying to create a tailored audience based on a list of hashed emails.
It seems to work with small lists (they show as processing and no errors come back), but with larger files (anything over roughly 60 MB; one at 90 MB fails consistently) I tend to get the error below.
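In case it's relevant, one workaround I've been considering is pre-splitting the hashed-email file into smaller part files myself before handing each one to the SDK. This is just a sketch of that idea: `split_file` is my own helper, and `lines_per_chunk` is a guess on my part, not a documented limit.

```python
def split_file(path, lines_per_chunk=500000):
    # Split a one-hash-per-line file into smaller part files so each
    # upload stays well under the size that seems to trigger the error.
    # lines_per_chunk is a guess, not a documented API limit.
    out_paths = []
    buf = []

    def flush():
        out_path = '{0}.part{1}'.format(path, len(out_paths))
        with open(out_path, 'w') as out:
            out.writelines(buf)
        out_paths.append(out_path)

    with open(path) as f:
        for line in f:
            buf.append(line)
            if len(buf) >= lines_per_chunk:
                flush()
                buf = []
        if buf:
            flush()
    return out_paths
```

Each returned path could then be passed to TailoredAudience.create in its own call, with a pause in between.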
As I understand it, twitter_ads chunks the file I provide and then uploads the chunks one at a time. Is it doing this too quickly? I am already using time.sleep() to space out the requests that create the lists, but is there something I need to do within an individual request to slow down its chunk uploads?
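For what it's worth, if the 429 is unavoidable, my fallback plan is to wrap the create call in an exponential-backoff retry like the sketch below. `call_with_backoff` and its defaults are my own invention, not part of the twitter_ads SDK; the only SDK-side assumption is that the raised exception is twitter_ads.error.RateLimit, as in the traceback.

```python
import time

def call_with_backoff(fn, retryable_exc, max_retries=5, base_delay=60,
                      sleep=time.sleep):
    # Call fn(), retrying with exponential backoff whenever retryable_exc
    # is raised (e.g. twitter_ads.error.RateLimit for HTTP 429).
    # This helper and its defaults are hypothetical, not part of the SDK.
    for attempt in range(max_retries):
        try:
            return fn()
        except retryable_exc:
            sleep(base_delay * (2 ** attempt))  # wait 60s, 120s, 240s, ...
    return fn()  # final attempt; let the exception propagate if it fails
```

Usage would be something like `call_with_backoff(lambda: TailoredAudience.create(account, path, name, TA_LIST_TYPES.EMAIL), RateLimit)`, but I don't know if retrying is the right fix or if I should be slowing the chunk uploads themselves.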
Any advice appreciated, thanks
Traceback (most recent call last):
  File "G:\Decision_Science\Scripts\twitterTailoredAudience.py", line 58, in <module>
    audience = TailoredAudience.create(account, name+'twithash.csv', name, TA_LIST_TYPES.EMAIL)
  File "C:\Python27\lib\site-packages\twitter_ads\audience.py", line 31, in create
    getattr(audience, '__update_audience__')(upload.perform(), list_type, TA_OPERATIONS.ADD)
  File "C:\Python27\lib\site-packages\twitter_ads\http.py", line 260, in perform
    self.__upload_chunk(location, chunk_size, bytes, bytes_start, bytes_read)
  File "C:\Python27\lib\site-packages\twitter_ads\http.py", line 315, in __upload_chunk
    domain=self._DEFAULT_DOMAIN, headers=headers, body=bytes).perform()
  File "C:\Python27\lib\site-packages\twitter_ads\http.py", line 72, in perform
    raise Error.from_response(response)
twitter_ads.error.RateLimit: <RateLimit object at 0x3d75878 code=429 details=None>