This is (as you've noticed) rather complicated; I'll try to explain it. It's helpful to think in terms of the parsing/processing sequence the command(s) go through, and watch what happens at each step. In general, the process looks something like this:
- The shell parses the command line, breaking it into tokens ("words"), substituting variable references etc., and removing quotes and escapes (after their effect has been applied). It then (usually) runs the first "word" as the command name ("find" in these cases), and passes the rest of the words to it as arguments. (There's a quick demonstration of this step just after this list.)
- find searches for files, and runs the stuff between its "-execdir" and ";" as a command. Note that it replaces "{}" with the matched filename, but does no other parsing -- it just runs the first argument after "-execdir" as the command name, and passes the following arguments to that as its arguments.
- In the case where that command happens to be bash and it gets passed the -c option, it parses the argument right after -c as a command string (sort of like a miniature shell script), and the rest of its arguments as arguments to that mini-script.
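A quick way to see the result of that first step -- what the shell actually hands over after quote removal -- is to print each resulting word (printf here just stands in for whatever command would have received them; pargs is the little argument-printing script introduced below):
$ printf '<%s> ' find . -execdir pargs "{}" \; ; echo
<find> <.> <-execdir> <pargs> <{}> <;>
Note that the double-quotes around {} and the backslash before ; are already gone by the time the command (find, in the real case) sees its arguments.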
OK, a couple of other notes before I dive into this: I'm using BSD find, which requires that the directory to search be explicitly specified, so I'll be using find . -execdir ... instead of just find -execdir .... I'm in a directory that contains the files "foo.txt" and "z@$%^;*;echo wheee.jpg" (to illustrate the risks of using bash -c wrong). Finally, I have a short script called pargs in my binaries directory that prints its arguments (or complains if it didn't get any).
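(The exact contents of pargs don't matter, but for reference, a minimal version might look something like this -- my own sketch, written to match the output format you'll see below:)
#!/bin/bash
# pargs: report how many arguments we received, quoting each one
if [ "$#" -eq 0 ]; then
    echo "pargs didn't get any arguments"
else
    printf "pargs got %d argument(s):" "$#"
    printf " '%s'" "$@"
    echo
fi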
Question one:
Now let's try out the two commands from your first question:
$ find . -execdir pargs "{}" \;
pargs got 1 argument(s): '.'
pargs got 1 argument(s): 'foo.txt'
pargs got 1 argument(s): 'z@$%^;*;echo wheee.jpg'
$ find . -execdir "pargs {}" \;
find: pargs .: No such file or directory
find: pargs foo.txt: No such file or directory
find: pargs z@$%^;*;echo wheee.jpg: No such file or directory
This matches your expectation: the first worked (and BTW you could've left off the double-quotes around {}), and the second failed because the space and filename were treated as part of the command name, rather than as an argument to it.
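(For a rough shell analogue of what find is doing internally: quoting a space into a single word makes the whole thing get looked up as one command name, spaces and all.)
$ "pargs foo.txt"
bash: pargs foo.txt: command not found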
BTW, it's also possible to use -exec[dir] ... + instead of -exec[dir] ... \; -- this tells find to run the command as few times as possible, and pass a bunch of filenames at once:
$ find . -execdir pargs {} +
pargs got 3 argument(s): '.' 'foo.txt' 'z@$%^;*;echo wheee.jpg'
Question two:
This time I'll take the options one at a time:
$ find . -execdir bash -c "pargs" "{}" \;
pargs didn't get any arguments
pargs didn't get any arguments
pargs didn't get any arguments
"Huh", you say? What's going on here is that bash
is getting run with an argument list like "-c
", "pargs
", "foo.txt
". The -c
option tells bash
to run its next argument ("pargs") like a miniature shell script, something like this:
#!/bin/bash
pargs
...and sort-of passes that "mini-script" the argument "foo.txt" (more on this later). But that mini-script doesn't do anything with its argument(s) -- specifically, it doesn't pass them on to the pargs command, so pargs never sees anything. (I'll get to the proper way to do this in the third question.)
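You can reproduce this directly, without find involved at all (the "foo.txt" here is typed by hand, standing in for whatever filename find would have substituted):
$ bash -c "pargs" "foo.txt"
pargs didn't get any arguments
bash quietly stores "foo.txt" as the mini-script's $0, and nothing in the mini-script ever looks at it.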
Now, let's try the second alternative from your second question:
$ find . -execdir bash -c "pargs {}" \;
pargs got 1 argument(s): '.'
pargs got 1 argument(s): 'foo.txt'
pargs got 1 argument(s): 'z@$%^'
bash: foo.txt: command not found
wheee.jpg
Now things are sort of working, but only sort of. bash gets run with the arguments "-c" and "pargs " + the filename, which works as expected for "." and "foo.txt", but when you pass bash the arguments "-c" and "pargs z@$%^;*;echo wheee.jpg", it's now running the equivalent of this as its mini-script:
#!/bin/bash
pargs z@$%^;*;echo wheee.jpg
So bash will split that into three commands separated by semicolons:
- "
pargs z@$%^
" (which you see the effect of)
- "
*
", which expands to the words "foo.txt
" and "z@$%^;*;echo wheee.jpg
", and hence tries to run foo.txt
as a command and pass it the other filename as an argument. There's no command by that name, so it gives an appropriate error.
- "
echo echo wheee.jpg
", which is a perfectly reasonable command, and as you can see it prints "wheee.jpg" to the terminal.
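You can reproduce that whole breakdown by hand, without find, by handing bash the same -c string in the same directory (same two files):
$ bash -c 'pargs z@$%^;*;echo wheee.jpg'
pargs got 1 argument(s): 'z@$%^'
bash: foo.txt: command not found
wheee.jpg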
So it worked for a file with a plain name, but when it ran into a filename that contained shell syntax, it started trying to execute parts of the filename. That's why this way of doing things is not considered safe.
Question three:
Again, I'll look at the options one at a time:
$ find . -execdir bash -c "pargs \"$@\"" {} \;
pargs got 1 argument(s): ''
pargs got 1 argument(s): ''
pargs got 1 argument(s): ''
$
Again, I hear you say "Huh????" The big problem here is that $@ is not escaped or in single-quotes, so it gets expanded by the current shell context before it's passed to find. I'll use pargs to show what find is actually getting as arguments here:
$ pargs . -execdir bash -c "pargs \"$@\"" {} \;
pargs got 7 argument(s): '.' '-execdir' 'bash' '-c' 'pargs ""' '{}' ';'
Note that the $@ just vanished, because I was running this in an interactive shell that hadn't received any arguments (or set them with the set command). Thus, we're running this mini-script:
#!/bin/bash
pargs ""
...which explains why pargs was getting a single empty argument.
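To see what would happen if the calling shell did have positional parameters, here's what that double-quoted argument expands to (I'm using set -- to give my interactive shell some arguments, purely for demonstration):
$ set -- hello world
$ pargs "pargs \"$@\""
pargs got 2 argument(s): 'pargs "hello' 'world"'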
If this were in a script that had received its own arguments, those are what would get spliced into the command string, making things even more confusing. Escaping (or single-quoting) the $ solves this, but it still doesn't quite work:
$ find . -execdir bash -c 'pargs "$@"' {} \;
pargs didn't get any arguments
pargs didn't get any arguments
pargs didn't get any arguments
The problem here is that bash is treating the next argument after the mini-script as the name of the mini-script (which is available to the mini-script as $0, but is not included in $@), not as a regular argument (i.e. $1). Here's a regular script to demo this:
$ cat argdemo.sh
#!/bin/bash
echo "My name is $0; I received these arguments: $@"
$ ./argdemo.sh foo bar baz
My name is ./argdemo.sh; I received these arguments: foo bar baz
Now try this with a similar bash -c mini-script:
$ bash -c 'echo "My name is $0; I received these arguments: $@"' foo bar baz
My name is foo; I received these arguments: bar baz
The standard way to solve this is to add a dummy script-name argument (like "bash"), so that the actual arguments show up in the usual way:
$ bash -c 'echo "My name is $0; I received these arguments: $@"' mini-script foo bar baz
My name is mini-script; I received these arguments: foo bar baz
This is exactly what your second option does, passing "bash" as the script name and the found filename as $1:
$ find . -execdir bash -c 'pargs "$@"' bash {} \;
pargs got 1 argument(s): '.'
pargs got 1 argument(s): 'foo.txt'
pargs got 1 argument(s): 'z@$%^;*;echo wheee.jpg'
Which finally works -- for real, even on weird filenames. That's why this (or the first option in your first question) is considered a good way to use find -exec[dir]. You can also use this with the -exec[dir] ... + method:
$ find . -execdir bash -c 'pargs "$@"' bash {} +
pargs got 3 argument(s): '.' 'foo.txt' 'z@$%^;*;echo wheee.jpg'
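(As a purely illustrative aside, the same safe pattern works with any command in place of pargs; for instance, something along these lines would rename each found .txt file to .text, with the filename only ever handled as a real argument rather than pasted into shell code:)
$ find . -name '*.txt' -execdir bash -c 'mv -- "$1" "${1%.txt}.text"' bash {} \;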