4

I have a Perl program that prints to stdout, whose output I then redirect into a file, like this:

./skript.pl > file.txt 2>&1

This causes the script's output to be block-buffered. However, I would like to be able to see what the script is working on at the moment, so I am trying to find a way to enable line buffering in this setup, so that output is written out as soon as a \n appears. (This is the normal behaviour if I just print to the terminal.)

I need a solution that is available without installing anything on CentOS.

Changing things inside the Perl script is an option. I already tried $| = 1, but that does more than I want: it messes up the output of cat file.txt if the end of the current line has not been printed yet.

fifaltra
  • 625

3 Answers

6

In your script you can use:

STDOUT->flush;

to flush the output buffer.

You can even set STDOUT->autoflush(1); globally.

To flush on newlines only try:

STDOUT->autoflush(0);
open STDOUT, '>', '/tmp/script.out' or die "open: $!";
...
close STDOUT;
Lambert
  • 12,680
  • Hm, autoflush does the same as putting in $|, i.e. it also flushes when lines are not complete, which breaks my ability to grep out error messages. – fifaltra Dec 16 '15 at 10:05
  • Create a handle to STDOUT which points to a file. If it exists it will be overwritten, hence the '>'. Perl will flush automatically when a newline is printed to a file handle. – Lambert Dec 16 '15 at 10:31
  • 1
    This works also when using >/dev/tty for example so no, a regular file is not required. – Lambert Dec 16 '15 at 10:37
3

You may want to run your script with stdbuf, which has an option for line buffering:

stdbuf --output=L --error=L command

The advantage is that you don't have to modify the code. You can also use stdbuf with utilities whose source code is not available, or when rebuilding them is tricky.

1

How about just redirecting via stderr? It should be unbuffered by default:

./skript.pl 2> file.txt 1>&2
woodengod
  • 493