
I have a program that takes an arbitrary number of input files. It works like this:

./code output.txt file1 file2 file3

And I have a thousand input files for it: file1, file2, ..., file1000.

However, I can split the input files into smaller sets. For example, combining two consecutive input files:

./code output.txt file1 file2
./code output.txt file3 file4
...

(output.txt is appended to during each call of the code.) Or, combining three consecutive input files:

./code output.txt file1 file2 file3
./code output.txt file4 file5 file6
...

I found that xargs can be helpful here:

ls file* | xargs -n3

The input really is split into groups of three files.

But when I use xargs to run additional commands via the -I option, it passes just file1 to ./code:

ls file* | xargs -n3 -I {} sh -c './code output.txt {}; command2; command3'

Could you please point out what I am doing wrong?

qwaiqir
  • 23

1 Answer


Don't use {} as part of the sh -c script text: the filenames are substituted into the shell code itself before the shell parses it, which is a code-injection risk (the issue is similar to the one described in the accepted answer to Is it possible to use `find -exec sh -c` safely?). Note also that -I makes xargs consume one input line per invocation, overriding -n 3, which is why each call of ./code receives only a single file.
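To see why substituting {} into the script text is dangerous, consider a hypothetical input line containing shell syntax (the $(...) below stands in for attacker-controlled data):

```shell
# {} is pasted into the script before sh parses it, so shell syntax
# inside an input line gets *executed* instead of treated as data:
printf '%s\n' 'x$(echo INJECTED)' | xargs -I {} sh -c 'echo {}'
# prints: xINJECTED  -- the command substitution ran inside sh -c
```

With the "$@" approach below, the same input would be printed literally, because it is passed as an argument rather than as code.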

Instead:

printf '%s\n' file* | xargs -n 3 sh -c './code output.txt "$@"; command2; command3' sh

This works as long as no filename contains a newline character. If your xargs supports the non-standard -0 option (most common implementations do), the following also works for filenames containing newlines:

printf '%s\0' file* | xargs -0 -n 3 sh -c './code output.txt "$@"; command2; command3' sh

The "$@" (the quotes are important) expands to the list of positional parameters inside the sh -c script; these are the filenames that xargs passes to the script. The seemingly useless sh at the very end is assigned to $0 in the sh -c script and is used in any error messages that that shell produces (it is not part of "$@").
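The roles of that trailing sh and of "$@" can be seen with a standalone sh -c call (fileA and fileB here are just placeholder arguments):

```shell
# Arguments after the script text fill $0 first; the rest become the
# positional parameters $1, $2, ... that "$@" expands to:
sh -c 'echo "\$0=$0, args=$# ($@)"' sh fileA fileB
# prints: $0=sh, args=2 (fileA fileB)
```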

In the zsh shell (but not in e.g. bash or sh), you could instead do

for name1 name2 name3 in file*; do
    ./code output.txt $name1 $name2 $name3
    command2
    command3
done
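In bash or POSIX sh, a rough equivalent of that loop uses set -- and shift (a sketch with a fixed file list and echo standing in for ./code and the extra commands):

```shell
# Process positional parameters three at a time, shifting past each group.
set -- file1 file2 file3 file4 file5 file6
: > output.txt                               # start with an empty output file
while [ "$#" -ge 3 ]; do
    echo "processing: $1 $2 $3" >> output.txt
    # command2 and command3 would go here
    shift 3
done
```

Unlike the zsh version, the parameters must be quoted here ("$1" etc. when they are filenames), since sh and bash perform word splitting on unquoted expansions.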


Kusalananda
  • 333,661