Fetching multiple URLs with aiohttp in Python

Problem description


In a previous question, a user suggested the following approach for fetching multiple URLs (API calls) with aiohttp:

```python
import asyncio
import aiohttp

url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
            'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()['data']

async def fetch_all(session, urls, loop):
    results = await asyncio.gather(*[loop.create_task(fetch(session, url)) for url in urls],
                                   return_exceptions=True)
    return results

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    with aiohttp.ClientSession(loop=loop) as session:
        htmls = loop.run_until_complete(fetch_all(session, urls, loop))
    print(htmls)
```


However, this only returns AttributeError instances:

[AttributeError('__aexit__',), AttributeError('__aexit__',)]


(Referring to `return_exceptions=True`, which I enabled, otherwise it would just break.) I really hope there is somebody here who can help with this; it is still kind of hard to find resources for asyncio etc. The returned data is in JSON format. In the end I would like to put all the JSON dicts in a list.
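As a diagnostic hint: an `AttributeError` naming `__aexit__` (or `__aenter__`) usually means an asynchronous context manager is being entered with a plain `with` statement instead of `async with`. A minimal, self-contained sketch of the difference; the `Resource` class below is a made-up stand-in for `aiohttp.ClientSession`, not real aiohttp code:

```python
import asyncio

class Resource:
    """A minimal async context manager, standing in for aiohttp.ClientSession."""
    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        return False

def use_plain_with():
    # A plain `with` looks up __enter__/__exit__, which this class does not
    # define, so entering it fails (AttributeError on older Pythons,
    # TypeError on newer ones).
    try:
        with Resource():
            pass
    except (AttributeError, TypeError) as e:
        return type(e).__name__

async def use_async_with():
    # `async with` looks up __aenter__/__aexit__ and works as intended.
    async with Resource() as r:
        return r is not None

print(use_plain_with())
print(asyncio.run(use_async_with()))  # -> True
```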

Answer

The original code has two problems: `aiohttp.ClientSession` is an asynchronous context manager, so it must be entered with `async with` rather than a plain `with` (hence the missing `__aexit__`), and `await response.json()['data']` tries to index the coroutine before awaiting it. Working example:

```python
import asyncio
import aiohttp
import ssl

url_list = ['https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530396000&before=1530436000',
            'https://api.pushshift.io/reddit/search/comment/?q=Nestle&size=30&after=1530436000&before=1530476000']

async def fetch(session, url):
    async with session.get(url, ssl=ssl.SSLContext()) as response:
        return await response.json()

async def fetch_all(urls, loop):
    # The session is created and entered with `async with` here,
    # so every fetch shares one properly managed ClientSession.
    async with aiohttp.ClientSession(loop=loop) as session:
        results = await asyncio.gather(*[fetch(session, url) for url in urls],
                                       return_exceptions=True)
        return results

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    urls = url_list
    htmls = loop.run_until_complete(fetch_all(urls, loop))
    print(htmls)
```
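A side note on the `return_exceptions=True` flag used above: with it, `asyncio.gather` places any raised exception into the result list at the failing task's position instead of propagating it, which is why the broken version printed a list of `AttributeError` objects rather than crashing with a traceback. A small self-contained sketch, where `ok` and `boom` are made-up coroutines for illustration:

```python
import asyncio

async def ok(x):
    # A coroutine that succeeds.
    await asyncio.sleep(0)
    return x * 2

async def boom():
    # A coroutine that fails.
    raise ValueError("bad url")

async def main():
    # With return_exceptions=True, the exception from boom() becomes an
    # element of the results list; the other tasks still complete.
    return await asyncio.gather(ok(1), boom(), ok(3), return_exceptions=True)

results = asyncio.run(main())
print(results)  # -> [2, ValueError('bad url'), 6]
```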
