
I have read a lot of the aiohttp-related blogs and examples. I think I understand them, but when I try to integrate that "knowledge" into the structure of my own application, I run into problems. Below is a minimal (not) working example representing this structure.

I assume I have a fundamental misunderstanding of what the structure of such a program should look like.

The main problem is the RuntimeError: This event loop is already running. I roughly understand where it comes from, but I do not know how to work around it.

Secondly, there are two warnings:

sys:1: RuntimeWarning: coroutine 'wait' was never awaited
sys:1: RuntimeWarning: coroutine 'FetchAsync._fetch' was never awaited

This is the MWE:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import asyncio
import aiohttp


class FetchAsync:
    def __init__(self):
        pass

    def _get_loop(self):
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
        finally:
            loop.set_debug(True)
        return loop

    async def _receive_via_aiohttp(self, session, url, headers):
        async with session.get(url, headers=headers) as response:
            content = await response.read()
            return response, content

    async def _fetch(self, url, session):
        headers = {'User-Agent': 'MyAgent'}

        # use aiohttp to get feed/xml content and response object
        response, content = await self._receive_via_aiohttp(session, url, headers)

        # do a lot more stuff...

    def run(self):
        loop = self._get_loop()
        asyncio.run(self._run_async())
        loop.close()

    async def _run_async(self):
        async with aiohttp.ClientSession() as session:
            # in reality there are many more URLs
            urls = ['https://cnn.com', 'https://fsfe.org']

            # create the "jobs" (futures)
            futures = [self._fetch(url, session) for url in urls]

            # run the "jobs" asynchronously
            self._get_loop().run_until_complete(asyncio.wait(futures))


if __name__ == '__main__':
    obj = FetchAsync()
    obj.run()

And this is the full error output:

Traceback (most recent call last):
  File "/home/user/share/work/aiotest/./fetchfeeds.py", line 62, in <module>
    obj.run()
  File "/home/user/share/work/aiotest/./fetchfeeds.py", line 43, in run
    asyncio.run(self._run_async())
  File "/usr/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.9/asyncio/base_events.py", line 642, in run_until_complete
    return future.result()
  File "/home/user/share/work/aiotest/./fetchfeeds.py", line 58, in _run_async
    self._get_loop().run_until_complete(asyncio.wait(futures))
  File "/usr/lib/python3.9/asyncio/base_events.py", line 618, in run_until_complete
    self._check_running()
  File "/usr/lib/python3.9/asyncio/base_events.py", line 578, in _check_running
    raise RuntimeError('This event loop is already running')
RuntimeError: This event loop is already running
sys:1: RuntimeWarning: coroutine 'wait' was never awaited
sys:1: RuntimeWarning: coroutine 'FetchAsync._fetch' was never awaited

1 Answer

If you had checked Learning asyncio: "coroutine was never awaited" warning error (last comment), you would have seen that:

Do not call loop.run_until_complete inside an async function. The purpose of that method is to run an async function from a sync context.
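To illustrate the difference, here is a minimal sketch (the demo() coroutine is just a placeholder name): inside async code you simply await other coroutines; run_until_complete() or asyncio.run() belongs at the top level of your synchronous code.

import asyncio

async def demo():
    # async context: just await other coroutines, never run_until_complete()
    await asyncio.sleep(0.1)
    return "done"

def main():
    # sync context: hand the top-level coroutine to the event loop exactly once
    print(asyncio.run(demo()))  # Python 3.7+
    # pre-3.7 equivalent:
    # loop = asyncio.get_event_loop()
    # print(loop.run_until_complete(demo()))

if __name__ == '__main__':
    main()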

I've updated your code. The important modification is in your _run_async(), where I replaced the run_until_complete() call with asyncio.gather().

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import asyncio
import aiohttp


class FetchAsync:
    def __init__(self):
        pass

    def _get_loop(self):
        try:
            loop = asyncio.get_event_loop()
        except RuntimeError:
            loop = asyncio.new_event_loop()
            asyncio.set_event_loop(loop)
        finally:
            loop.set_debug(True)
        return loop

    async def _receive_via_aiohttp(self, session, url, headers):
        async with session.get(url, headers=headers) as response:
            content = await response.read()
            return response, content

    async def _fetch(self, url, session):
        headers = {'User-Agent': 'MyAgent'}

        # use aiohttp to get feed/xml content and response object
        response, content = await self._receive_via_aiohttp(session, url, headers)
        return response, content
        # do a lot more stuff...

    def run(self):
        loop = self._get_loop()
        loop.run_until_complete(self._run_async())
        loop.close()

    async def _run_async(self):
        async with aiohttp.ClientSession() as session:
            # in reality there are many more URLs
            urls = ['https://cnn.com', 'https://fsfe.org']

            # create the "jobs" (futures)
            futures = [self._fetch(url, session) for url in urls]

            # run the "jobs" asynchronously
            print(await asyncio.gather(*futures))


if __name__ == '__main__':
    obj = FetchAsync()
    obj.run()
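As a side note, if you are on Python 3.7+ you could also drop the manual loop handling entirely and let asyncio.run() create, run and close the loop for you. A minimal sketch of that variant of run() (everything else stays the same):

    def run(self):
        # alternative to _get_loop()/run_until_complete()/close():
        # asyncio.run() sets up a fresh event loop, runs the coroutine
        # and closes the loop afterwards (Python 3.7+)
        asyncio.run(self._run_async())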