Problem description
I'm trying to apply a filter based on a date-range variable when querying a REST API.
Example output:
"data": [
{
"assignment_count":,"business_hour_interruptions":,"created_at": "","description": "","engaged_seconds":,"engaged_user_count":,"escalation_count":,"id": "","incident_number":,"major":,"off_hour_interruptions":,"priority_id": "","priority_name": "","resolved_at": "","seconds_to_engage": null,"seconds_to_first_ack":,"seconds_to_mobilize": null,"seconds_to_resolve":,"service_id": "","service_name": "","sleep_hour_interruptions":,"snoozed_seconds":,"team_id": "","team_name": "","urgency": "","user_defined_effort_seconds": null
}
],"ending_before": null,"filters": {
"created_at_end": "","created_at_start": ""
},"first": "","last": "","limit":,"more":,"order": "","order_by": "","starting_after":,"time_zone": ""}
What I've tried so far:
- I tried generating the dates by iterating over the range:

```
from datetime import date, timedelta

start_date = date(2020, 1, 1)   # date(2020,01,01) is a syntax error in Python 3
end_date = date.today()

def daterange(start_date, end_date):
    # Yield each date from start_date (inclusive) to end_date (exclusive)
    for n in range(int((end_date - start_date).days)):
        yield start_date + timedelta(n)

for single_date in daterange(start_date, end_date):
    data = single_date.strftime("%Y-%m-%d %H:%M:%S")   # %S, not %s, for seconds
```
- A standalone query:
```
import json
from datetime import date, datetime

import pandas as pd
import requests

def raw_incidents():
    limit = 1000
    API_KEY = ''
    # month=01 / day=01 are invalid literals in Python 3; %S (not %s) is the seconds directive
    start_date = datetime(year=2020, month=1, day=1).strftime("%Y-%m-%d %H:%M:%S")
    today = date.today().strftime("%Y-%m-%d %H:%M:%S")
    url = f'https://api.pagerduty.com/analytics/raw/incidents?limit={limit}'
    headers = {
        'X-EARLY-ACCESS': 'analytics-v2',
        'Accept': 'application/vnd.pagerduty+json;version=2',
        'Authorization': 'Token token={token}'.format(token=API_KEY),
        'Content-type': 'application/json',
    }
    payload = {
        "filters": {
            "created_at_start": start_date,
            "created_at_end": today
        }
    }
    r = requests.post(url, headers=headers, data=json.dumps(payload))
    data_json = r.json()
    incidents = pd.json_normalize(data_json['data'])
    incidents.to_csv('incidents_raw.csv', index=None, header=True)

if __name__ == '__main__':
    raw_incidents()
```
This is where I'm stuck: I want to run the query above once per date window, with a different filter each time (a dynamic created_at_start and created_at_end taken from the generated dates), and append the results together. How can I iterate the query over the date ranges and accumulate the output? Roughly what I have in mind is sketched below.
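A minimal sketch of the loop I have in mind, reusing the request from raw_incidents() but with the filter dates passed in as parameters. This is untested: the fetch_window helper name, the one-day window size, and the use of pd.concat to append the per-window results are my own assumptions, and the API key still has to be filled in.

```
import json
from datetime import date, timedelta

import pandas as pd
import requests

API_KEY = ''  # same token as in raw_incidents(), still to be filled in
URL = 'https://api.pagerduty.com/analytics/raw/incidents?limit=1000'
HEADERS = {
    'X-EARLY-ACCESS': 'analytics-v2',
    'Accept': 'application/vnd.pagerduty+json;version=2',
    'Authorization': f'Token token={API_KEY}',
    'Content-type': 'application/json',
}

def fetch_window(created_at_start, created_at_end):
    # One request with a dynamic created_at_start / created_at_end filter
    payload = {
        "filters": {
            "created_at_start": created_at_start,
            "created_at_end": created_at_end,
        }
    }
    r = requests.post(URL, headers=HEADERS, data=json.dumps(payload))
    r.raise_for_status()
    return pd.json_normalize(r.json()['data'])

def daterange(start_date, end_date):
    for n in range(int((end_date - start_date).days)):
        yield start_date + timedelta(n)

frames = []
for day in daterange(date(2020, 1, 1), date.today()):
    # Each window covers one day: [day 00:00:00, next day 00:00:00)
    window_start = day.strftime("%Y-%m-%d %H:%M:%S")
    window_end = (day + timedelta(days=1)).strftime("%Y-%m-%d %H:%M:%S")
    frames.append(fetch_window(window_start, window_end))

# Append all per-window results into a single DataFrame / CSV
incidents = pd.concat(frames, ignore_index=True)
incidents.to_csv('incidents_raw.csv', index=None, header=True)
```

If a single window ever returns more rows than the limit, the cursor fields from the example response (more / last / starting_after) would presumably have to be combined with this loop as well, along the lines of the pagination sketch earlier in the post.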
Thanks in advance.