I'm looking to write a script that takes a .txt filename as an argument, reads the file line by line, and passes each line to a command. For example, it runs `command --option "LINE 1"`, then `command --option "LINE 2"`, etc. The output of the command is written to another file. How do I go about doing that? I don't know where to start.

6 Answers
Another option is `xargs`.

With GNU `xargs`:

xargs -a file -I{} -d'\n' command --option {} other args

`{}` is the placeholder for the line of text.

Other `xargs` implementations generally don't have `-a` or `-d`, but some have `-0` for NUL-delimited input. With those, you can do:

< file tr '\n' '\0' | xargs -0 -I{} command --option {} other args

On Unix-conformant systems (`-I` is optional in POSIX and only required for UNIX-conformant systems), you'd need to preprocess the input to quote the lines in the format expected by `xargs`:

< file sed 's/"/"\\""/g;s/.*/"&"/' |
  xargs -E '' -I{} command --option {} other args

However, note that some `xargs` implementations have a very low limit on the maximum size of the argument (255 on Solaris for instance, the minimum allowed by the Unix specification).
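As an illustration of the GNU `xargs` form above, here is a minimal sketch using `printf` as a stand-in for `command --option`, with a hypothetical input file under `/tmp`:

```shell
# Hypothetical input: two lines, one of them containing a space.
printf '%s\n' 'first line' 'second line' > /tmp/lines.txt

# GNU xargs: one invocation per line, the whole line (spaces
# included) substituted at {}.
xargs -a /tmp/lines.txt -d '\n' -I{} printf 'got: %s\n' {}
# prints:
# got: first line
# got: second line
```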

Use a `while read` loop:

: > another_file  ## Truncate file.
while IFS= read -r line; do
    command --option "$line" >> another_file
done < file

Another option is to redirect output by block:

while IFS= read -r line; do
    command --option "$line"
done < file > another_file

The last is to open the output file explicitly:

exec 4> another_file
while IFS= read -r line; do
    command --option "$line" >&4
    echo xyz  ## Another optional command that sends output to stdout.
done < file

If one of the commands reads input, it would be a good idea to use another fd for input so the commands won't eat it (here assuming `ksh`, `zsh` or `bash` for `-u 3`; use `<&3` instead portably):

while IFS= read -ru 3 line; do
    ...
done 3< file

Finally, to accept arguments, you can do:

#!/bin/bash
file=$1
another_file=$2
exec 4> "$another_file"
while IFS= read -ru 3 line; do
    command --option "$line" >&4
done 3< "$file"

Which one could run as:

bash script.sh file another_file

Extra idea. With `bash`, use `readarray`:

readarray -t lines < "$file"
for line in "${lines[@]}"; do
    ...
done

Note: `IFS=` can be omitted if you don't mind having line values trimmed of leading and trailing spaces.
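To see why the separate input fd matters, here is a small sketch using `cat` as a stand-in for a command that reads its stdin, with a hypothetical three-line file:

```shell
printf 'a\nb\nc\n' > /tmp/in.txt

# Naive version: the inner command shares the loop's stdin and
# swallows the remaining lines, so only "a" is processed.
while IFS= read -r line; do
    echo "processing: $line"
    cat > /dev/null   # stand-in for a command that reads stdin
done < /tmp/in.txt
# prints only: processing: a

# With the file on fd 3 (portable <&3 form), the inner command
# keeps the original stdin and all three lines are processed.
echo 'unrelated stdin' |
while IFS= read -r line <&3; do
    echo "processing: $line"
    cat > /dev/null
done 3< /tmp/in.txt
# prints: processing: a / processing: b / processing: c
```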

Keeping precisely to the question:

#!/bin/bash
# xargs -n N sets how many whitespace-separated arguments from the
# input to pass to each invocation of the command (one per line only
# if the lines contain no spaces).
# Call command once per argument
[[ -f $1 ]] && cat "$1" | xargs -n1 command --option
# Call command with 2 arguments at a time, such as an openvpn
# credentials file with one word per line
# [[ -f $1 ]] && cat "$1" | xargs -n2 command --option
# Call command with all lines as args
# [[ -f $1 ]] && cat "$1" | xargs command --option
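A quick way to observe the `-n` grouping, with `echo` standing in for `command --option` (note that plain `xargs` splits on any whitespace, so the unit here is really a word, not a line):

```shell
# Four whitespace-separated items, passed two per invocation.
printf 'one\ntwo\nthree\nfour\n' | xargs -n2 echo args:
# prints:
# args: one two
# args: three four
```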

The best answer I found is:

for i in `cat`; do "$cmd" "$i"; done < $file

EDIT: ... four years later ... after several downvotes and some more experience, I'd recommend the following now:

xargs -l COMMAND < file
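As a sketch of the difference, `-l` (equivalently the standard spelling `-L 1`) feeds one whole line per invocation rather than one word, though `xargs` still performs its own word splitting and quote processing within the line:

```shell
# Two lines, the first containing a space: one invocation per line.
printf 'alpha beta\ngamma\n' | xargs -L 1 echo line:
# prints:
# line: alpha beta
# line: gamma
```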

- This will invoke the command once for each *word* in the file. Also, you should always quote references to shell variables (as in `do "$cmd" "$i";`) unless you have a reason not to; if the file contained a `*` as a word by itself, your code would run `$cmd *`, which, of course, would run the command with a list of the files in the current directory. – G-Man Says 'Reinstate Monica' Jun 07 '15 at 05:13
- @G-Man, except in `zsh`, the `cat` command substitution would already expand the `*` (the unquoted `$i` could still expand some wildcard (a second round) if the expansion of the `cat` substitution introduces some). In any case, that approach is wrong indeed. – Stéphane Chazelas Jul 18 '16 at 15:23
- +1. This will work fine if each line contains only the input needed for the command being used, without spaces. – C-- Mar 12 '17 at 06:02
- This acts on each word; is there a variant of this that acts on each line? – thebunnyrules Jan 03 '18 at 09:33
- An enhancement for when you need more than 1 argument: `xargs -l -a files.txt -i echo begin {} end`. In my case I needed to mv the files in the .txt to another dir – Syclone0044 Jun 23 '21 at 15:53
This shell-quotes each line with `sed` and hands the result to `sh`, which runs the command once per line with the quoted line as its last argument:

sed "s/'/'\\\\''/g;s/.*/\$* '&'/" <<\FILE |
sh -s -- command echo --option
all of the{&}se li$n\es 'are safely shell
quoted and handed to command as its last argument
following --option, and, here, before that echo
FILE

OUTPUT:

--option all of the{&}se li$n\es 'are safely shell
--option quoted and handed to command as its last argument
--option following --option, and, here, before that echo

ed file.txt
%g/^/s// /
2,$g/^/-,.j
1s/^/command/
wq
chmod 755 file.txt
./file.txt

Take all the lines of a file and pass them as arguments to a single command, i.e.,

command line1 line2 line3 ....

If you need the `--option` flag to precede each line, change the second command to:

%g/^/s// --option /
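A sketch of the whole recipe on a hypothetical three-line file, feeding the same `ed` commands via a here document, with `echo` in place of `command` and `sh` in place of the `chmod`/`./` step:

```shell
printf 'line1\nline2\nline3\n' > /tmp/args.txt

# Prefix every line with a space, join all lines into one,
# then prepend the command name.
ed -s /tmp/args.txt <<'EOF'
%g/^/s// /
2,$g/^/-,.j
1s/^/echo/
wq
EOF

cat /tmp/args.txt
# the file now contains: echo line1 line2 line3

sh /tmp/args.txt
# prints: line1 line2 line3
```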

- I didn't downvote you, but the person who did probably had these reasons: (1) This doesn't do what the question asks for. This invokes the command once with all the contents of the file on the command line, rather than once per line. (2) This does nothing to handle characters that are special to the shell (that might be in the file); e.g., `'`, `"`, `<`, `>`, `;`, etc. (3) This creates an unnecessary temporary file. (4) Things like this are generally done with "here documents". (5) Your `ed` commands are clumsy; the first two commands can be reduced to `%s/^/ /` and `%j`. – G-Man Says 'Reinstate Monica' Jun 07 '15 at 05:40
- `<file xargs -L 1 -I{} command --option {} other args` – iruvar Aug 12 '14 at 12:13