I have a program that collects various log files from multiple computers into a single ZIP file on the client machine. It currently uses Fabric, and I'd like to rewrite it with AsyncSSH so that all the servers send their files in parallel.
What I can't figure out is how to get the files over as streams. My current Fabric-based code is:

```python
result = io.BytesIO()
connection.get(remote=path, local=result)
result.seek(0)
return result
```

What's the equivalent when the connection comes from `asyncssh.connect()`?
The hint is the `echo` in the example: just execute a `cat` of the log file and capture its output. Assuming your log files are text and aren't huge: `result = io.StringIO((await conn.run('cat logfile')).stdout)`. You can fire multiple such requests in parallel using `asyncio.gather` or `asyncio.wait` (see here), then feed the results to `zipfile` -- in chunks, if necessary...