Since Python 3.5 introduced `async with`, the syntax recommended in the docs for aiohttp has changed. To fetch a single url they now suggest:
```python
import aiohttp
import asyncio

async def fetch(session, url):
    with aiohttp.Timeout(10):
        async with session.get(url) as response:
            return await response.text()

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    with aiohttp.ClientSession(loop=loop) as session:
        html = loop.run_until_complete(
            fetch(session, 'http://python.org'))
    print(html)
```

How can I modify this to fetch a collection of urls instead of just one url?
In the old asyncio examples you would set up a list of tasks, such as:
```python
tasks = [
    fetch(session, 'http://cnn.com'),
    fetch(session, 'http://google.com'),
    fetch(session, 'http://twitter.com')
]
```

I tried to combine a list like this with the approach above, but failed.
```python
results = loop.run_until_complete(tasks)
```

I get a runtime error. `async with` is such a new feature, with so little literature, that it would be super convenient for people learning to use it if the aiohttp docs showed an example of grabbing more than one url. The library looks terrific; it just needs a bit of hand-holding to get started. Thank you!