
Is there a way to echo output that is being piped to the next command? For example, let's say I am reading filenames from a text file and then running a command on each file:

cat files.txt | xargs -I{} -d"\n" command 

The command runs and its output is displayed in the terminal; is there a way to print out the filename as well?

Let's say the input file contains:

file1.txt
file2.txt

Intended output:

file1.txt
[output of command with file1.txt as input]
file2.txt
[output of command with file2.txt as input]

Is there a way to get file1.txt and file2.txt on stdout as well?

Anthon

4 Answers


You could write a function:

Function()
{
    cat file1.txt | xargs -I{} -d"\n" command >> /dev/console
    cat file2.txt | xargs -I{} -d"\n" command >> /dev/console
}
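A minimal sketch of the interleaving itself, assuming GNU xargs (for `-d`) and using `wc -c` as a stand-in for `command`; the file names below are made up for the demo:

```shell
#!/bin/sh
# Demo input: a list of filenames, plus the files themselves.
printf 'file1.txt\nfile2.txt\n' > files.txt
printf 'hello\n' > file1.txt
printf 'world!\n' > file2.txt

# For each line, print the filename, then run the command on it.
# The filename is passed to sh as "$1" rather than spliced into the
# script text, so odd characters in names cannot break the command.
cat files.txt | xargs -d '\n' -I{} sh -c 'echo "$1"; wc -c < "$1"' _ {}
```

This prints each filename immediately followed by the command's output for that file, which matches the interleaved output asked for in the question.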

Using cat costs an extra subprocess; a while loop avoids that and should be faster:

while IFS= read -r filename
do
    echo "$filename"
    command "$filename"   # do something with "$filename"
done < file_to_be_processed
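A runnable sketch of that loop, with `wc -l` standing in for `command` and throwaway file names invented for the demo:

```shell
#!/bin/sh
# Demo input: a list file and the two files it names.
printf 'a.txt\nb.txt\n' > file_to_be_processed
printf '1\n2\n' > a.txt
printf '3\n' > b.txt

# Read one filename per line; print it, then run the command on it.
# IFS= and -r keep leading whitespace and backslashes in names intact.
while IFS= read -r filename
do
    echo "$filename"
    wc -l < "$filename"
done < file_to_be_processed
```

Redirecting the list into the loop (`done < file_to_be_processed`) keeps the whole thing in the current shell, so no pipeline subshell is needed.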
sjsam

To get the filename you can use echo; to get the content you can use cat. If you want to stay flexible (and e.g. exchange your cat files.txt with a find .... command), you should keep your invocation of xargs, but have it handle one file at a time:

 cat files.txt | xargs -L 1 /path/to/your_script

with your_script:

#!/bin/bash
echo "$1"
cat "$1"
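Put together, a self-contained run might look like this; the script is created inline here, and the file names are illustrative:

```shell
#!/bin/sh
# Create the helper script: print the filename, then its contents.
cat > your_script <<'EOF'
#!/bin/bash
echo "$1"
cat "$1"
EOF
chmod +x your_script

# Demo input: a list file and the two files it names.
printf 'f1.txt\nf2.txt\n' > list.txt
printf 'alpha\n' > f1.txt
printf 'beta\n' > f2.txt

# -L 1 makes xargs invoke the script once per input line.
cat list.txt | xargs -L 1 ./your_script
```

Because `-L 1` runs the script once per line, the filename and its output stay paired, which is exactly the interleaving asked for.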
Anthon

The tee command does what you are asking for: it echoes (copies) the data that is being piped to the next command. Just insert it into the pipeline (some_command | tee). Look up the man page for exact usage, examples, and other details.
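For illustration, here tee duplicates the stream: one copy lands in a file while the other continues down the pipe (to see the copy on your terminal instead, tee to /dev/tty):

```shell
#!/bin/sh
# tee writes its stdin both to seen.txt and to stdout, which wc -l reads.
printf 'file1.txt\nfile2.txt\n' | tee seen.txt | wc -l
```

After this runs, seen.txt holds an exact copy of what flowed through the pipe, and wc -l still received the same two lines.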

Lucas