This solution assumes that the list of files is potentially longer than what can be handled by a single invocation of an external command (so the filenames in `input.txt` can't all be listed on the command line in one go).
Assume that the list of positional parameters (i.e., the arguments to some script) contains the pathnames of the files that we'd like to concatenate, and that we want to take the header from only the first of these if the output file does not already exist. If the output file already exists, we can assume that the header has already been written to it.
The minimal code for doing this in the shell would be something like

```sh
[ ! -e outfile ] && head -n 1 -- "$1" >outfile
awk 'FNR != 1' "$@" >>outfile
```
Here, `head -n 1` is used to get only the header from the very first file and write it to `outfile`, if `outfile` does not already exist. Then, `awk` is used to extract all but the first line from each file, appending these lines to `outfile`.
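As a quick sanity check, the two-line snippet above can be exercised with some throwaway data (a sketch; the filenames `a.csv`, `b.csv`, `c.csv` and the sample rows are hypothetical):

```shell
# Create three small CSV files that share the same header line.
printf 'id,name\n1,alice\n' > a.csv
printf 'id,name\n2,bob\n'   > b.csv
printf 'id,name\n3,carol\n' > c.csv

# Start from a clean slate so the header gets written.
rm -f outfile

# Simulate the script's positional parameters, then run the two lines.
set -- a.csv b.csv c.csv
[ ! -e outfile ] && head -n 1 -- "$1" >outfile
awk 'FNR != 1' "$@" >>outfile

cat outfile
# id,name
# 1,alice
# 2,bob
# 3,carol
```

Note that `FNR` in `awk` is the per-file line counter, so `FNR != 1` skips the first line of every input file, not just the first line overall.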
This tiny script could be executed with all the input files from your `input.txt` file, assuming the filenames in `input.txt` are properly quoted or otherwise "simple" (no embedded whitespace characters or quotes):
```sh
xargs sh -c '
    out=$1; shift
    [ ! -e "$out" ] && head -n 1 -- "$1" >"$out"
    awk "FNR != 1" "$@" >>"$out"
' sh Consolidate.csv <input.txt
```
This uses `xargs` to run a small in-line `sh -c` script, taking the input from `input.txt` as arguments. The output filename, `Consolidate.csv`, is given as the first argument and received into `out` in the in-line script.

If the list in `input.txt` is very long, `xargs` will arrange for our in-line script to be called several times, with batches of arguments read from the file.
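The batching can be observed directly by forcing small batches with `xargs -n` (a sketch; `-n 2` and the filenames `p1.csv`, `p2.csv`, `p3.csv` are only used here to make the repeated invocations visible on a tiny input):

```shell
# Three input files with a common header, and a file listing them.
printf 'id,val\n1,x\n' > p1.csv
printf 'id,val\n2,y\n' > p2.csv
printf 'id,val\n3,z\n' > p3.csv
printf '%s\n' p1.csv p2.csv p3.csv > input.txt

rm -f Consolidate.csv

# -n 2 makes xargs call the in-line script with at most two filenames
# per invocation, mimicking what happens automatically when input.txt
# is too long for a single command line.  The [ ! -e ... ] guard means
# only the first invocation writes the header; later ones just append.
xargs -n 2 sh -c '
    out=$1; shift
    [ ! -e "$out" ] && head -n 1 -- "$1" >"$out"
    awk "FNR != 1" "$@" >>"$out"
' sh Consolidate.csv <input.txt

cat Consolidate.csv
# id,val
# 1,x
# 2,y
# 3,z
```

Even though the script runs twice here (once for `p1.csv p2.csv`, once for `p3.csv`), the header appears exactly once, because `Consolidate.csv` already exists by the time the second batch runs.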