I am trying to request a bunch of URLs concurrently; the URLs are built from a list. Currently I am looping over the list and (I think) adding the requests to the queue as I go. It is definitely 10x faster than requests.get, but I am not sure I am doing it correctly, so there may be room to optimize. I profiled it and noticed that it still spends about 90% of its time locked after the concurrent requests finish, i.e. start -> 10+ concurrent requests -> lock for ~5 seconds -> done.
Additionally, this code produces an "Unclosed client session" message at the end. Any idea why? I am pretty sure I am using the context manager properly.
I have searched and have not found this exact question.
import signal
import sys
import asyncio
import aiohttp
import json
import requests

lists = ['eth', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo',
         'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo']

loop = asyncio.get_event_loop()
client = aiohttp.ClientSession(loop=loop)

async def fetch(client, url):
    async with client.get(url) as resp:
        assert resp.status == 200
        return await resp.text()

async def main(loop=loop, url=None):
    async with aiohttp.ClientSession(loop=loop) as client:
        html = await fetch(client, url)
        print(html)

def signal_handler(signal, frame):
    loop.stop()
    client.close()
    sys.exit(0)

signal.signal(signal.SIGINT, signal_handler)

tasks = []
for item in lists:
    url = "{url}/{endpoint}/{coin_name}".format(
        url='https://coincap.io',
        endpoint='page',
        coin_name=item.upper()
    )
    print(url)
    tasks.append(
        asyncio.ensure_future(main(url=url))
    )

loop.run_until_complete(asyncio.gather(*tasks))
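For comparison, here is a minimal sketch of what I think the single-shared-session version should look like, assuming one ClientSession shared across all requests is the intended pattern (fetch_all is a helper name I made up). Is this the right approach?

import asyncio
import aiohttp

COINS = ['eth', 'btc', 'xmr', 'req', 'xlm', 'etc', 'omg', 'neo']

async def fetch(client, url):
    # Reuse the session's connection pool for every request.
    async with client.get(url) as resp:
        resp.raise_for_status()
        return await resp.text()

async def fetch_all(urls):
    # One session for the whole batch; the context manager closes it,
    # which should avoid the "Unclosed client session" warning.
    async with aiohttp.ClientSession() as client:
        return await asyncio.gather(*(fetch(client, url) for url in urls))

urls = ['https://coincap.io/page/{}'.format(c.upper()) for c in COINS]
loop = asyncio.get_event_loop()
pages = loop.run_until_complete(fetch_all(urls))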