28

Using bash, how do I copy stderr and stdout to a log file and also leave them displayed on the console?

I would like to do this within the script itself using an exec.

I tried with

exec &>> log.out

echo "This is stdout"
echo "This is stderr" >&2

But the above prints nothing on the console. How can I achieve this in bash?

adarshr
  • There is a highly upvoted answer to a similar question on StackOverflow which answers this question quite thoroughly. https://stackoverflow.com/a/692407/208257 – Dan Burton Nov 12 '18 at 15:50

4 Answers

23

You are looking for tee.

See man tee for details.

To combine it with exec, you have to use process substitution. (See man bash for details.)

exec &> >(tee log.out)
echo "This is stdout"
echo "This is stderr" >&2
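As a self-contained sketch (demo.sh and demo.log are example names, not from the original answer), you can drop the three lines into a script and check that both streams reach the log file as well as the console:

```shell
#!/usr/bin/env bash
# Sketch of the tee + process substitution approach. The inner script
# duplicates both stdout and stderr to demo.log while still printing
# them on the console.
cat > demo.sh <<'EOF'
#!/usr/bin/env bash
exec &> >(tee demo.log)
echo "This is stdout"
echo "This is stderr" >&2
EOF
bash demo.sh
sleep 1   # bash does not wait for >(tee ...), so give it a moment to flush
```

The trailing sleep papers over the fact that the process substitution is not waited for; without it, the script can exit before tee has finished writing the log.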

11

I know this is an old post, but why not just do this?

echo "hi" >> log.txt #stdout -> log
echo "hi" | tee -a log.txt #stdout -> log & stdout
echo "hi" &>> log.txt #stdout & stderr -> log
echo "hi" |& tee -a log.txt #stdout & stderr -> log & stdout

And of course, if you want stdout you can just print regularly.

You can do this with any combination of streams you wish, just using those two basic commands.
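For instance, to copy only stderr to both a log and the console while leaving stdout untouched (errors.log is just an example name), process substitution combines with tee the same way:

```shell
#!/usr/bin/env bash
# Sketch: copy stderr (only) to errors.log and keep it on the console;
# stdout is not logged. errors.log is an example file name.
err_demo() {
    echo "normal output"
    echo "error output" >&2
}
err_demo 2> >(tee -a errors.log >&2)
sleep 1   # the >(tee ...) process is not waited for; let it flush
```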

I came here and did not find an easy-to-understand answer, so hopefully this will be of help to someone else who is struggling.

By the way, for noobs out there like my previous self: all the tee command does is write its stdin to both stdout and the file(s) given as arguments. -a stands for append, so you don't overwrite the file with every use of the command. If you have further questions, I find this to be a very helpful resource for quickly learning bash.
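A quick way to see what -a changes (notes.log is a throwaway example name):

```shell
#!/usr/bin/env bash
# Without -a, tee truncates the target file; with -a it appends.
echo "first"  | tee notes.log    > /dev/null   # creates/overwrites notes.log
echo "second" | tee -a notes.log > /dev/null   # appends a second line
cat notes.log   # prints both lines, because the second tee appended
```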

9

You can do:

: > log # empty log file if necessary
{ { {

  ...the script

} 3>&- | tee -a log >&3 3>&-
exit "${PIPESTATUS[0]}"
} 2>&1 | tee -a log >&2 3>&-
} 3>&1
exit "${PIPESTATUS[0]}"
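The exit "${PIPESTATUS[0]}" lines matter because a pipeline's exit status is normally that of its last command (tee, which almost always succeeds), not of the script. A minimal illustration of what PIPESTATUS recovers:

```shell
#!/usr/bin/env bash
# A pipeline's $? is the status of its last command, so the failure of
# `false` would be masked by tee; PIPESTATUS[0] recovers it.
false | tee /dev/null
echo "pipe status: ${PIPESTATUS[0]}"   # prints "pipe status: 1"
```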

You could also write it as:

: > log # empty log file if necessary
exec 2> >(tee -a log >&2) > >(tee -a log)

...the script

But because bash does not wait for the processes started with >(...), output can sometimes appear on the terminal after the command has returned, which can have even nastier effects (such as that output being silently discarded) if the terminal's "tostop" attribute is on.

In any case, both solutions turn stdout into a pipe, and because two separate commands write the output and error messages independently, this affects output buffering and the order in which the output and error messages are displayed.

5

One more way of doing it is using redirections within functions.

#!/bin/bash

function1 () {
    echo 'STDOUT from function 1'
    echo 'STDERR from function 1' >&2
}

function2 () {
    echo 'STDOUT from function 2'
    echo 'STDERR from function 2' >&2
}


function3 () {
    echo 'STDOUT from function 3'
    echo 'STDERR from function 3' >&2
}

main() {
    function1
    function2
    function3
}

main 2>&1 | tee log.txt

Here we have a main function that invokes all the other functions. We then redirect both STDOUT and STDERR of the main function to tee.
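One caveat with this pattern: after the pipeline, $? is tee's exit status, not main's. If the script's exit code matters, bash's pipefail option (not part of the original answer) propagates a failure from main; a sketch with a hypothetical failing main and an example log name:

```shell
#!/usr/bin/env bash
# Sketch: same main | tee pattern, with pipefail so a failure in main
# is not masked by tee's exit status. func.log is an example file name.
set -o pipefail

main() {
    echo 'STDOUT from main'
    echo 'STDERR from main' >&2
    return 3
}

main 2>&1 | tee func.log
echo "pipeline exit: $?"   # prints "pipeline exit: 3"
```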

Kannan Mohan
    IMHO this is the cleanest way of doing this, at least for simple cases. Thanks. – ACK_stoverflow Dec 03 '16 at 21:56
    While this is clean, it may cause commands that emit progress output (e.g. 7z) to not output anything. 7z displays progress on the same line, using Carriage Return (\r), and if run piped, it won't output any progress (maybe it detects the lack of an interactive terminal?). Output can be preserved though by using script – Dan Dascalescu Jun 01 '20 at 10:33