• Currently, I use the Asynchronous Analytics endpoint (GET stats/jobs/accounts/:account_id) to get report data.
  • Every day, I fetch the past 5 days of data to pick up new data. But a month later, when I re-ran the report for the previous month, the report data had changed a lot (e.g. impressions, clicks, spend…).
  • Maybe I need to fetch a longer date range every day. So I want to ask: what is the latest date after which report data will never be updated, so I can set a limit on the date range I need to fetch?
  • I read the best-practices document, but I still don't understand the two sentences below. What do they apply to?

Do not pull data for any entities older than 7 days.

Do not repeatedly query for data that is older than 30 days. This data will not change and should be stored locally.

Thank you very much.

Hi @dudoan1234

The guidance around “do not pull data for any entities older than 7 days” might be better phrased as “campaigns that stopped serving more than 7 days ago”. The idea is that most spend and impression data has ‘settled’ down by then and will not change.

Generally I do not expect data to change randomly EXCEPT for data related to conversions. Conversion data can and will change according to the ‘lookback window’ set on either web_event_tag or app_event_tag tags. You can retrieve the conversion data as a separate metric_group, without requesting the other data. In very rare cases tweets might still get impressions, for example if sales members are pulling them up, but those should be earned impressions rather than actual serving.
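To illustrate the separate-metric_group idea, here is a sketch of building the query parameters for an async stats job that requests only conversion metrics. Parameter names (`entity`, `entity_ids`, `start_time`, `end_time`, `granularity`, `placement`, `metric_groups`) follow the public Ads API analytics docs, but verify them against the current reference before relying on this:

```python
from datetime import date, timedelta

def build_async_stats_params(entity_ids, days_back=7,
                             metric_groups=("WEB_CONVERSION",)):
    """Sketch: query params for an async analytics job
    (stats/jobs/accounts/:account_id) that fetches only
    conversion metrics, which can keep changing within the
    attribution lookback window."""
    end = date.today()
    start = end - timedelta(days=days_back)
    return {
        "entity": "CAMPAIGN",
        "entity_ids": ",".join(entity_ids),
        "start_time": start.isoformat(),
        "end_time": end.isoformat(),
        "granularity": "DAY",
        "placement": "ALL_ON_TWITTER",
        # Only the conversion metric group; settled metrics
        # (impressions, clicks, spend) are fetched separately
        # and far less often.
        "metric_groups": ",".join(metric_groups),
    }
```

This lets you poll conversion data daily over the lookback window while leaving the already-settled metrics alone.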

I think the algorithm for fetching data should be based on a guess at whether the campaign could POSSIBLY have changed stats, judged from the campaign's status: if the end_date was more than a week ago (and you have already fetched the data), that is the case where I would not re-fetch. If you are seeing data change in those scenarios, please give specific examples so we can debug why it's changing.
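The heuristic above can be sketched as a small decision function. The function name and arguments are my own, for illustration only:

```python
from datetime import date, timedelta

def should_refetch(campaign_end_date, already_fetched, settle_days=7):
    """Heuristic from the reply: skip re-fetching stats for a
    campaign that stopped serving more than `settle_days` ago
    and whose data we already have locally.

    campaign_end_date: date the campaign stopped serving, or
    None if it is still serving (or has no end date)."""
    if campaign_end_date is None:
        # Still serving: stats can always change, so fetch.
        return True
    settled = campaign_end_date < date.today() - timedelta(days=settle_days)
    # Re-fetch unless the campaign is settled AND already stored.
    return not (settled and already_fetched)
```

Conversion metrics would still need their own schedule over the lookback window; this only gates the settled metrics.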

Thanks,

John
