Python Popen: Write to stdout AND log file simultaneously

I am using Popen to call a shell script that is continuously writing its stdout and stderr to a log file. Is there any way to simultaneously output the log file continuously (to the screen), or alternatively, make the shell script write to both the log file and stdout at the same time?

I basically want to do something like this in Python:

cat file 2>&1 | tee -a logfile #"cat file" will be replaced with some script

This pipes stderr and stdout together into tee, which writes them both to stdout and to my logfile.

I know how to write stdout and stderr to a logfile in Python. Where I'm stuck is how to duplicate these back to the screen:

subprocess.Popen("cat file", shell=True, stdout=logfile, stderr=logfile)

Of course I could just do something like this, but is there any way to do it without tee and shell file descriptor redirection?

subprocess.Popen("cat file 2>&1 | tee -a logfile", shell=True)

You can use a pipe to read the data from the program's stdout and write it to all the places you want:

import sys
import subprocess

logfile = open('logfile', 'w')
proc = subprocess.Popen(['cat', 'file'], stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)
for line in proc.stdout:  # on Python 3 these are bytes; see the update below
    sys.stdout.write(line)
    logfile.write(line)
proc.wait()
logfile.close()

UPDATE

In Python 3, the universal_newlines parameter controls whether pipes are opened in text or binary mode. If False (the default), pipe reads return bytes objects that may need to be decoded (e.g., line.decode('utf-8')) to get strings. If True, Python does the decoding for you.

Changed in version 3.3: When universal_newlines is True, the class uses the encoding locale.getpreferredencoding(False) instead of locale.getpreferredencoding(). See the io.TextIOWrapper class for more information on this change.
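
For example, here is a minimal sketch of the same loop with text-mode pipes, reusing the 'cat file' stand-in from above:

import sys
import subprocess

logfile = open('logfile', 'w')
proc = subprocess.Popen(['cat', 'file'], stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
for line in proc.stdout:  # lines are str, already decoded
    sys.stdout.write(line)
    logfile.write(line)
proc.wait()
logfile.close()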

To emulate subprocess.call("command 2>&1 | tee -a logfile", shell=True) without invoking the tee command:

#!/usr/bin/env python2
from subprocess import Popen, PIPE, STDOUT

p = Popen("command", stdout=PIPE, stderr=STDOUT, bufsize=1)
with p.stdout, open('logfile', 'ab') as file:
    for line in iter(p.stdout.readline, b''):
        print line,  #NOTE: the comma prevents duplicate newlines (softspace hack)
        file.write(line)
p.wait()

To fix possible buffering issues (if the output is delayed), see links in Python: read streaming input from subprocess.communicate().
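
For example, one workaround mentioned in the comments below is to run the command under stdbuf so the child's stdout is line-buffered; a minimal sketch, assuming GNU coreutils' stdbuf is available and the child buffers via C stdio:

#!/usr/bin/env python2
from subprocess import Popen, PIPE, STDOUT

# stdbuf -oL forces the child's stdout into line-buffered mode,
# so each line reaches the pipe as soon as it is printed
p = Popen(["stdbuf", "-oL", "command"], stdout=PIPE, stderr=STDOUT, bufsize=1)
with p.stdout, open('logfile', 'ab') as file:
    for line in iter(p.stdout.readline, b''):
        print line,  # trailing comma avoids a duplicate newline
        file.write(line)
p.wait()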

Here's the Python 3 version:

#!/usr/bin/env python3
import sys
from subprocess import Popen, PIPE, STDOUT

with Popen("command", stdout=PIPE, stderr=STDOUT, bufsize=1) as p, \
     open('logfile', 'ab') as file:
    for line in p.stdout: # b'\n'-separated lines
        sys.stdout.buffer.write(line) # pass bytes as is
        file.write(line)
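
As a comment below points out, this version may show output on the screen only after the command finishes, because writing to sys.stdout.buffer bypasses the text layer's line buffering. A minimal variant that flushes after each line (assuming the child flushes its own stdout):

#!/usr/bin/env python3
import sys
from subprocess import Popen, PIPE, STDOUT

with Popen("command", stdout=PIPE, stderr=STDOUT) as p, \
     open('logfile', 'ab') as file:
    for line in p.stdout:
        sys.stdout.buffer.write(line)
        sys.stdout.buffer.flush()  # show each line on screen immediately
        file.write(line)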

Write to terminal byte by byte for interactive applications

This method writes any bytes it gets to stdout immediately, which more closely simulates the behavior of tee, especially for interactive applications.

main.py

#!/usr/bin/env python3
import subprocess
import sys

with subprocess.Popen(sys.argv[1:], stdout=subprocess.PIPE, stderr=subprocess.STDOUT) as proc, \
        open('logfile.txt', 'wb') as logfile:
    while True:
        byte = proc.stdout.read(1)  # read one byte at a time
        if byte:
            sys.stdout.buffer.write(byte)
            sys.stdout.flush()
            logfile.write(byte)
            # logfile.flush()
        else:
            break
exit_status = proc.returncode

sleep.py

#!/usr/bin/env python3
import sys
import time
for i in range(10):
    print(i)
    sys.stdout.flush()
    time.sleep(1)

First we can do a non-interactive sanity check:

./main.py ./sleep.py

And we see it counting to stdout in real time.

Next, for an interactive test, you can run:

./main.py bash

Then the characters you type appear immediately on the terminal as you type them, which is very important for interactive applications. This is what happens when you run:

bash | tee logfile.txt

Also, if you want the output to show up in the output file immediately, then you can also add a:

logfile.flush()

but tee does not do this, and I'm afraid it would kill performance. You can test this out easily with:

tail -f logfile.txt

Related question: live output from subprocess command

Tested on Ubuntu 18.04, Python 3.6.7.

Comments
  • related: Python subprocess get children's output to file and terminal?
  • You could also create a file like object that encapsulates this functionality and then use that in place of stdout/stderr in the call to Popen.
  • @sr2222 - I like that idea too.... except now that I think about it..., they are operating system pipes, not python objects, so does that even work?
  • @imagineerThis - The code reads stdout until it is closed and then waits for the program to exit. You read before wait so that you don't risk the pipe filling up and hanging the program. You wait after read for the final program exit and return code. If you don't wait, you'll get a zombie process (at least on linux).
  • you might need iter(proc.stdout.readline, '') (due to bug with a read-ahead buffer) and add bufsize=1 to print lines as soon as they are flushed by the child process. call proc.stdout.close() to avoid fd leaks.
  • @tdelaney: no, it is not fixed. try the script: import time; print(1); time.sleep(1); print(2). Your version won't print 1 until the script exits. The word flush in my comment refers to buffers inside a child process that you have no direct control over. If the child doesn't flush its stdout then the output will be delayed. It might be fixed using pexpect, pty modules or stdbuf, unbuffer, script commands.
  • you should mention that you can find the return code in p.returncode after it's done.
  • @kdubs: it is unrelated to the question. Why do you think I "should mention" it?
  • while I agree he didn't ask for that, it seems one ought to check the return status. I was hoping to find it here. would seem to make the answer complete. perhaps "should" was strong.
  • @kdubs I agree that it is a good idea to check the exit status (that is why subprocess.check_call(), subprocess.check_output() functions exist which do it for you). I could have added if p.wait() != 0: raise subprocess.CalledProcessError(p.returncode, "command") but it would distract from the main point: how to emulate tee utility in Python.
  • The Python 3 version above prints on screen only after execution finishes, not live