subprocess.Popen handling stdout and stderr as they come

I'm trying to process both stdout and stderr from a subprocess.Popen call that captures both via subprocess.PIPE, and I would like to handle the output (for example, printing it to the terminal) as it comes.

All the solutions I've seen so far wait for the Popen call to complete, ensuring that all of stdout and stderr has been captured before anything is processed.

This is an example Python script with mixed output whose ordering I can't seem to reproduce when processing it in real time (or as close to real time as I can get):

$ cat mix_out.py

import sys

sys.stdout.write('this is an stdout line\n')
sys.stdout.write('this is an stdout line\n')
sys.stderr.write('this is an stderr line\n')
sys.stderr.write('this is an stderr line\n')
sys.stderr.write('this is an stderr line\n')
sys.stdout.write('this is an stdout line\n')
sys.stderr.write('this is an stderr line\n')
sys.stdout.write('this is an stdout line\n')

The one approach that seems like it might work is threads: the reading would then be asynchronous, and the output could be processed as the subprocess yields it.

My current implementation processes all of stdout first and all of stderr last, which can be deceiving if the output originally alternated between the two:

import subprocess
import sys

cmd = ['python', 'mix_out.py']

process = subprocess.Popen(
    cmd,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    close_fds=True,
    universal_newlines=True,
)

if process.stdout:
    while True:
        out = process.stdout.readline()
        if out == '' and process.poll() is not None:
            break
        if out != '':
            print('stdout: %s' % out, end='')
            sys.stdout.flush()

if process.stderr:
    while True:
        err = process.stderr.readline()
        if err == '' and process.poll() is not None:
            break
        if err != '':
            print('stderr: %s' % err, end='')
            sys.stderr.flush()

If I run the above (saved as out.py) against the mix_out.py example script, each stream is drained in full before the other:

$ python out.py
stdout: this is an stdout line
stdout: this is an stdout line
stdout: this is an stdout line
stdout: this is an stdout line
stderr: this is an stderr line
stderr: this is an stderr line
stderr: this is an stderr line
stderr: this is an stderr line

I understand that some system calls might buffer, and I am OK with that. The one thing I am looking to solve is respecting the order of the streams as they happened.

Is there a way to process both stdout and stderr as they come from the subprocess without having to use threads? (The code gets executed on restricted remote systems where threading is not possible.)

Differentiating stdout from stderr is a must (as shown in the example output).

Ideally, no extra libraries would be best (e.g. I know that pexpect solves this).

A lot of examples out there mention the use of select, but I have failed to come up with something that preserves the order of the output with it.

Sorry if I misunderstand the question... but if you are looking for a way to have subprocess.Popen write to stdout/stderr in real time, you should be able to achieve that with:

import sys, subprocess
p = subprocess.Popen(cmdline,
                     stdout=sys.stdout,
                     stderr=sys.stderr)

Passing stderr=subprocess.STDOUT may also simplify your filtering, if that is acceptable.

If that is not what you are looking for, sorry. But hopefully it will fit others' needs.
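If differentiating the streams is not required, merging stderr into stdout gives a single pipe whose line order matches what the child actually wrote. A minimal sketch of that idea (the inline child script below is a hypothetical stand-in for mix_out.py, and -u is assumed to disable the child's buffering):

```python
import subprocess
import sys

# Inline stand-in for mix_out.py: writes to both streams in a known order.
child = (
    "import sys\n"
    "sys.stdout.write('out 1\\n')\n"
    "sys.stderr.write('err 1\\n')\n"
    "sys.stdout.write('out 2\\n')\n"
)

# stderr=subprocess.STDOUT merges both streams into one pipe, so the
# pipe sees a single write order; -u keeps the child unbuffered.
p = subprocess.Popen(
    [sys.executable, "-u", "-c", child],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True,
)

lines = [line.rstrip("\n") for line in p.stdout]
p.wait()
```

The trade-off is that you can no longer tell which stream a line came from, which the question explicitly requires.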

I found a working example here (see the listing of capture_together.py). Compiled C++ code that mixes cerr and cout, executed as a subprocess on both Windows and UNIX OSes. Results are identical.

I was able to solve this by using select.select():

from select import select
import subprocess
import sys

cmd = ['python', 'mix_out.py']

process = subprocess.Popen(
    cmd,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    close_fds=True,
    universal_newlines=True,
)

while True:
    # block until at least one of the two pipes has data to read
    reads, _, _ = select(
        [process.stdout.fileno(), process.stderr.fileno()],
        [], []
    )

    for descriptor in reads:
        if descriptor == process.stdout.fileno():
            read = process.stdout.readline()
            if read:
                print('stdout: %s' % read, end='')

        if descriptor == process.stderr.fileno():
            read = process.stderr.readline()
            if read:
                print('stderr: %s' % read, end='')
        sys.stdout.flush()

    if process.poll() is not None:
        break

Both file descriptors are passed to select() in its reads argument (the first argument), and the loop reads from whichever descriptor is ready, for as long as process.poll() indicates the process is still alive.

No need for threads. The code was adapted from this Stack Overflow answer.
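On Python 3, the same idea can be expressed with the selectors module, which wraps select() behind a portable API. This is a sketch of the technique rather than the answer's original code (the inline child script is a hypothetical stand-in for mix_out.py). One caveat: when both pipes become readable between two select() calls, the event list reflects registration order, not arrival order, so strict ordering still depends on the child flushing line by line.

```python
import selectors
import subprocess
import sys

# Inline stand-in for mix_out.py.
child = (
    "import sys\n"
    "sys.stdout.write('this is an stdout line\\n')\n"
    "sys.stderr.write('this is an stderr line\\n')\n"
)

p = subprocess.Popen(
    [sys.executable, "-u", "-c", child],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

sel = selectors.DefaultSelector()
sel.register(p.stdout, selectors.EVENT_READ, "stdout")
sel.register(p.stderr, selectors.EVENT_READ, "stderr")

captured = []
while sel.get_map():  # loop until both pipes hit EOF
    for key, _ in sel.select():
        # Note: readline() on a buffered pipe can consume more than one
        # line into its buffer, which select() will not see; that is fine
        # for this tiny example but matters for chattier children.
        line = key.fileobj.readline()
        if not line:  # EOF: the child closed this stream
            sel.unregister(key.fileobj)
            continue
        captured.append((key.data, line.decode().rstrip("\n")))
p.wait()
```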

Comments
  • I really can't think of any solution that doesn't use threads. Is there any particular reason you want to avoid them?
  • If all you want to do is print to the terminal, that's the default behavior if you take out the PIPEs. It wouldn't guarantee exactly the right order, but could you put the stdout and stderr reads in the same while True loop?
  • The order needs to be preserved.
  • @dano I can't use threads because this code would get executed in remote (restricted) systems where threads cannot be spawned
  • related: Subprocess.Popen: cloning stdout and stderr both to terminal and variables
  • Yes, I've tested it: it does not preserve the order. To disable buffering, use the -u flag (if the child is Python), use the stdbuf utility (or its analogs), or use a pseudo-tty (pty, pexpect modules). Some programs provide a special flag, e.g., grep's --line-buffered.
  • you cannot control buffering of other tools/shells that buffer. The code in this answer does work and preserves the order.
  • you can control the buffering; my previous comment mentions several methods. Try your code with mix_out.py from your question. Your code won't preserve the order unless you pass the -u flag to the Python executable, or run it with stdbuf or a pseudo-tty.
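The buffering point in this last comment is easy to demonstrate: with the two streams merged into one pipe, the same child produces a different line order depending on whether -u is passed. A small sketch (the inline child script is a hypothetical stand-in; the exact order in the buffered case can vary across Python versions, so only the unbuffered run is deterministic):

```python
import subprocess
import sys

# Child that interleaves stdout and stderr writes.
child = (
    "import sys\n"
    "print('out 1')\n"
    "sys.stderr.write('err 1\\n')\n"
    "print('out 2')\n"
)

def run(flags):
    # Merge stderr into stdout so arrival order is observable on one pipe.
    p = subprocess.Popen(
        [sys.executable, *flags, "-c", child],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,
    )
    out, _ = p.communicate()
    return out.splitlines()

unbuffered = run(["-u"])  # child flushes as it writes: order preserved
buffered = run([])        # child's stdout is block-buffered when piped
```

With -u the merged pipe shows out 1, err 1, out 2; without it, the child's stdout is typically flushed only at exit, so the stderr line can jump ahead of the stdout lines.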