
I remember reading somewhere (though I forget where) about a rarely used type of bash pipe that could redirect output line-by-line. In other words, rather than redirecting the output once at the end, the output would be redirected at every newline, running the receiving process once per line. I've looked in all the bash docs I can get my hands on, but I can't figure out whether this 'pipe' really exists.

Is there such a thing as a line-by-line pipe? Of course the task could easily be accomplished other ways, but I'm curious if there is a more elegant way.
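(For comparison, one of those "other ways" is a plain `while read` loop, which runs its body once per input line:)

```shell
# The loop body executes once for every line the producer writes.
printf 'one\ntwo\nthree\n' | while IFS= read -r line; do
    echo "received: $line"
done
```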

bntser

2 Answers


I think you are looking for xargs. Say, for example, I want to find all the .bak files in a directory and delete them:

find . -name "*.bak" -type f -print | xargs /bin/rm -f

find prints each matching filename, and xargs collects those names and runs rm on them to remove the files.

http://www.cyberciti.biz/faq/linux-unix-bsd-xargs-construct-argument-lists-utility/
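Note that by default xargs batches as many arguments as it can into each invocation of the command. To run the receiving command once per input line, as the question describes, you can pass `-n 1` (a small sketch, not part of the original answer; `echo` stands in for a real command):

```shell
# -n 1 makes xargs invoke the command once per argument,
# i.e. once per input line here.
printf 'a.bak\nb.bak\n' | xargs -n 1 echo rm -f
# prints:
#   rm -f a.bak
#   rm -f b.bak
```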

Jeight
  • This is exactly what was needed, at least from the end result point-of-view. I chose this answer because xargs is a standard command, while unbuffer doesn't seem to be installed on my system. – bntser Oct 10 '13 at 23:36
  • @bntser - whichever works. Glad you solved your issue. – slm Oct 11 '13 at 00:21

I think you're thinking of the tool unbuffer. You can use it to disable the buffering that occurs when output is sent from one command to another through a pipe.

With a command like this, you won't see any output until a page's worth has buffered:

$ od -c /tmp/fifo | more

You can disable this automatic buffering as follows:

$ unbuffer od -c /tmp/fifo | more

Normally, unbuffer does not read from STDIN. This simplifies use of unbuffer in some situations. To use unbuffer in a pipeline, use the -p flag.

$ process1 | unbuffer -p process2 | process3
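As an aside (not from the original answer): if unbuffer (part of the expect package) isn't installed, GNU coreutils ships stdbuf, which can similarly force line buffering on a command's stdout:

```shell
# stdbuf -oL runs the command with line-buffered stdout, so each
# line travels down the pipe as soon as it is printed rather than
# waiting for a block-sized buffer to fill.
stdbuf -oL cat /tmp/fifo | more
```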


slm