
In a bash script, I would like to capture the standard output of a long-running command line by line, so that each line can be analysed and reported while the command is still running. This is the complicated way I can imagine doing it:

# Start the long command in a separate process and redirect stdout to a temp file
longcommand > /tmp/tmp$$.out &

#loop until process completes
ps cax | grep longcommand > /dev/null
while [ $? -eq 0 ]
do
    #capture the last lines in temp file and determine if there is new content to analyse
    tail /tmp/tmp$$.out

    # ...

    sleep 1  # sleep in order not to clog the CPU

    ps cax | grep longcommand > /dev/null
done

I would like to know if there is a simpler way of doing so.

EDIT:

In order to clarify my question, I will add this: longcommand displays its status line by line, once per second. I would like to catch the output before longcommand completes.

This way, I can potentially kill the longcommand if it does not provide the results I expect.

I have tried:

longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

But whatever (e.g. echo) only executes after longcommand completes.

gfrigon

2 Answers


Just pipe the command into a while loop. There are a number of nuances to this, but basically (in bash or any POSIX shell):

longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

The other main gotcha with this (other than the IFS stuff below) is when you try to use variables from inside the loop once it has finished. This is because the loop actually runs in a sub-shell (just another shell process), whose variables you can't access from the parent; the sub-shell also exits when the loop does, at which point its variables are completely gone. To get around this, you can do:

longcommand | {
  while IFS= read -r line
  do
    whatever "$line"
    lastline="$line"
  done
  echo "The last line was: $lastline" # This won't work without the braces.
}

Hauke's example of setting lastpipe in bash is another solution.

Update

To make sure you are processing the output of the command 'as it happens', you can use stdbuf to set the process' stdout to be line buffered.

stdbuf -oL longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

This will configure the process to write one line at a time into the pipe instead of internally buffering its output into blocks. Beware that the program can change this setting itself internally. A similar effect can be achieved with unbuffer (part of expect) or script.
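For example, a minimal sketch using unbuffer, or script (which runs the command on a pseudo-terminal so it line-buffers its output as if writing to a terminal). The script syntax shown is the util-linux one; BSD/macOS versions differ, and because the output passes through a terminal it may contain carriage returns.

unbuffer longcommand |
  while IFS= read -r line
  do
    whatever "$line"
  done

# util-linux script; -q suppresses its start/done messages
script -qc 'longcommand' /dev/null |
  while IFS= read -r line
  do
    whatever "$line"
  done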

stdbuf is available on GNU and FreeBSD systems. It only affects stdio buffering and only works for dynamically linked, non-setuid, non-setgid applications (as it uses an LD_PRELOAD trick).
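To address the asker's follow-up goal of aborting longcommand when it prints something unexpected, a minimal sketch (check_line is a hypothetical per-line test, not part of any answer here): breaking out of the loop closes the read end of the pipe, so longcommand will be killed by SIGPIPE on its next write, provided it does not ignore that signal.

stdbuf -oL longcommand |
  while IFS= read -r line
  do
    if ! check_line "$line"; then   # check_line: hypothetical test of the status line
      echo "unexpected output, aborting" >&2
      break                         # closing the pipe lets SIGPIPE stop longcommand
    fi
    whatever "$line"
  done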

Graeme
  • @Stephane The IFS= is not needed in bash, I checked this after last time. – Graeme Feb 28 '14 at 16:30
  • yes it is. It's not needed if you omit line (in which case the result is put in $REPLY without the leading and trailing spaces trimmed). Try: echo ' x ' | bash -c 'read line; echo "[$line]"' and compare with echo ' x ' | bash -c 'IFS= read line; echo "[$line]"' or echo ' x ' | bash -c 'read; echo "[$REPLY]"' – Stéphane Chazelas Feb 28 '14 at 16:31
  • @Stephane, ok, never realised there was a difference between that and a named variable. Thanks. – Graeme Feb 28 '14 at 16:34
  • @Graeme I might not have been clear in my question but I would like to process the output line by line before the longcommand completes (in order to react quickly if the longcommand displays an error message). I will edit my question to make it clearer – gfrigon Feb 28 '14 at 16:40
  • @gfrigon, updated. – Graeme Feb 28 '14 at 16:58
  • @Graeme Thanks! It worked with the stdbuf. – gfrigon Feb 28 '14 at 17:10
#! /bin/bash
set +m # disable job control in order to allow lastpipe
shopt -s lastpipe
longcommand |
  while IFS= read -r line; do lines[i]="$line"; ((i++)); done
echo "${lines[1]}" # second line
Hauke Laging