
I am having a problem... does anyone know why this code hangs in the while loop? The loop doesn't seem to catch the last line of stdout.

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.PIPE)

line = working_file.stdout.readline()
working_file.stdout.flush()
while working_file != "":
    print(line)
    line = working_file.stdout.readline()
    working_file.stdout.flush()

The script hangs with the cursor just blinking when readline() is encountered. I don't understand why. Can anyone shed some light?

  • Going on a guess here, but I've found that p.stdout.read*() will always be a blocking call, and if there isn't any data to return, it keeps blocking. A nonblocking read may help you out. Commented Dec 13, 2011 at 20:43
  • At the risk of sounding like a massive idiot, could you explain what you mean by a nonblocking read? Thanks :) Commented Dec 13, 2011 at 20:46
  • read() will hang if there's no data to read. It will only return once there is data (or enough of it) to return. A nonblocking read returns immediately, and if there is no data, none is returned. Thus it doesn't hang (it's not a blocking call: it's a nonblocking call). (A short sketch of the difference follows these comments.) Commented Dec 13, 2011 at 20:49
  • Does the program want input on stdin? Commented Dec 13, 2011 at 21:15
  • I'm not so hot on program I/O, streams, etc. Does anyone know where I could find a place to read up on the concepts? I still don't quite understand what flush() does and why. Commented Dec 13, 2011 at 21:40
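For illustration (this sketch is not part of the original thread): a minimal example of the blocking vs. nonblocking difference described in the comments above, assuming Python 3.5+ on Unix so that os.set_blocking is available. The child command here is just a stand-in.

import os
import subprocess

# A stand-in child process that prints one line, then stays silent for a while.
proc = subprocess.Popen(
    ["python3", "-u", "-c", "import time; print('one line'); time.sleep(30)"],
    stdout=subprocess.PIPE,
)

# Blocking (the default): readline() waits until a full line has arrived.
print(proc.stdout.readline())        # b'one line\n'

# Nonblocking: flip the descriptor to non-blocking; a read with no data returns at once.
os.set_blocking(proc.stdout.fileno(), False)
try:
    chunk = os.read(proc.stdout.fileno(), 4096)
except BlockingIOError:
    chunk = b''                      # nothing waiting; a blocking read would have hung here
print(repr(chunk))

proc.kill()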

2 Answers


Does doing a nonblocking read help you out?

import fcntl
import os
import subprocess

def nonBlockReadline(output):
    # put the pipe's file descriptor into non-blocking mode, then try to read a line
    fd = output.fileno()
    fl = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)
    try:
        return output.readline()
    except:
        return ''

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.PIPE)

line = nonBlockReadline(working_file.stdout)
working_file.stdout.flush()
while working_file.poll() is None:
    print(line)
    line = nonBlockReadline(working_file.stdout)
    working_file.stdout.flush()

I'm not sure exactly what you're trying to do, but will this work better? It just reads all the data, instead of reading only one line at a time. It's a little more readable to me.

import fcntl
import os
import subprocess

def nonBlockRead(output):
    # put the pipe's file descriptor into non-blocking mode, then read whatever is available
    fd = output.fileno()
    fl = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)
    try:
        return output.read()
    except:
        return ''

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.PIPE)

stdout = ''
while working_file.poll() is None:
    stdout += nonBlockRead(working_file.stdout)

# we can probably save some time and just print it instead...
#print(stdout)

stdout = stdout.splitlines()
for line in stdout:
    print(line)

Edit: A generalized script which should be more suited for your use case:

import fcntl
import os
import subprocess

def nonBlockRead(output):
    # put the pipe's file descriptor into non-blocking mode, then read whatever is available
    fd = output.fileno()
    fl = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)
    try:
        return output.read()
    except:
        return ''

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.PIPE)

while working_file.poll() is None:
    stdout = nonBlockRead(working_file.stdout)

    # make sure it returned data
    if stdout:
        # process data
        working_file.stdin.write(something)

4 Comments

You get an A for effort. I hope this works. (Thanks by the way, I learned quite a bit from this.)
Wow, I'll try it out; I can't say I fully understand it all though :) I'm actually trying to write a Python script that launches an external process and listens to it, then based on its output writes some values to its stdin. I'm a little out of my depth though; I feel like there must be a more elegant way than what I'm trying. You live and you learn :)
Based on your use case, it's possible that the blocking may have nothing to do with your issue, but hopefully this'll still help out. I added a script more suited to your problem; however, it's still nonblocking (though that can easily be changed with working_file.stdout.readline()).
Actually, yes. Apparently fcntl is Unix-only. The equivalent to fcntl is win32api, but apparently it's non-trivial to port... :/

In addition to the previous answer, there is another case which might be blocking your stdout.readline(). You have redirected both stdout and stderr to subprocess.PIPE, but you only ever read stdout; stderr is never read.

If your program writes enough errors to fill up the stderr pipe buffer (which has a limited, fixed size) and still has more to write, the child process will block until the buffer is drained, which in this case is never. Your stdout.readline() then completes the deadlock. Consider this scenario:

  1. you read the last line in stdout using stdout.readline()
  2. the subprocess writes to stderr and fills up the pipe buffer
  3. you attempt the next stdout read
  4. readline() blocks, waiting for data to be written to stdout; the child, in turn, is waiting for its stderr write to complete, which is waiting for the buffer to be emptied, and nothing ever empties it

If you have no use for stderr, simply send it to /dev/null instead of a pipe:

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.DEVNULL)
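Another option, not from the original answer but plain subprocess usage, is to merge stderr into stdout with stderr=subprocess.STDOUT, so a single readline() loop drains both streams and neither pipe can fill up unread. A rough sketch:

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.STDOUT)   # error output is interleaved into stdout

for line in working_file.stdout:
    # lines from both stdout and stderr arrive here, in the order the child flushed them
    print(line)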

Otherwise, if stderr is important, you can handle it separately, for example:

if working_file.stderr.readline():
    # do something with the error line

You could also spawn a thread to read stderr, or perform a non-blocking read as suggested above.
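As a rough sketch of that threaded approach (my own illustration, not from the original answer, reusing the Popen call from the question), a daemon thread keeps stderr drained so the main stdout loop can never deadlock on a full stderr pipe:

import subprocess
import threading

def drain_stderr(pipe):
    # Read stderr until EOF so the pipe buffer never fills up.
    for err_line in pipe:
        pass  # or log / inspect the error output here

working_file = subprocess.Popen(["/pyRoot/iAmACrashyProgram"],
                                stdout=subprocess.PIPE,
                                stdin=subprocess.PIPE,
                                stderr=subprocess.PIPE)

threading.Thread(target=drain_stderr, args=(working_file.stderr,), daemon=True).start()

# The main loop can now block on stdout safely.
for line in working_file.stdout:
    print(line)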





