Here is a way that combines the best of the above suggestions into a simple, efficient, robust command line:
find /path/to/files -iname '*.jpg' -exec mogrify -format pdf {} +
It works fine with filenames that begin with a - or contain spaces. Note the use of -iname, which is the case-insensitive version of -name, so it will work on .JPG just as well as .jpg.
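For example, here is a quick sanity check you could run (a sketch only; the /tmp/jpgtest directory and sample filenames are placeholders, and it assumes ImageMagick's convert and mogrify are installed):
mkdir -p /tmp/jpgtest && cd /tmp/jpgtest
convert -size 10x10 xc:white 'photo 1.jpg'      # name containing a space
convert -size 10x10 xc:white ./-leading.JPG     # leading dash, uppercase extension
find . -iname '*.jpg' -exec mogrify -format pdf {} +
ls                                              # photo 1.pdf and -leading.pdf now exist
Because find prints each match prefixed with the search path (here ./), the leading-dash name is never mistaken for an option.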
This uses find to get the file list instead of shell globbing with the *.jpg wildcard, which can result in an 'Argument list too long' error on some systems. Though as @enzotib points out in a comment, the behavior of globbing in a for loop is different from globbing on a command's argument list: the loop expands the glob inside the shell rather than onto a single command line, so it does not hit that limit.
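For comparison, a minimal sketch of that for-loop form (quoting "$f" so spaces survive); it avoids the argument-length error but only covers the current directory:
for f in *.jpg; do mogrify -format pdf "$f"; done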
Also, find will handle subdirectories, whereas shell globbing will not unless you happen to have shell-specific features like the **/*.jpg recursive globbing syntax in zsh.
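In zsh that recursive form would look roughly like the line below (a sketch; note that it still expands onto one command line, so a very large tree could again hit the argument-length limit):
mogrify -format pdf ./**/*.jpg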
EDIT: I thought I would add another useful feature of find that I thought of after reading a comment by @IlmariKaronen about re-running the command and only converting files that have changed since the first run. On the first pass, you can touch a timestamp file after the convert is finished.
find /path/to/files -iname '*.jpg' -exec mogrify -format pdf {} +; touch timestamp
Then add -newer timestamp to the find expression to operate on the subset of files whose last-modified time is newer than the timestamp file. Continue updating the timestamp file after each run.
find /path/to/files -iname '*.jpg' -newer timestamp -exec mogrify -format pdf {} +; touch timestamp
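If you run this often, a small wrapper along these lines keeps it to one command (a sketch only; the stamp path is an arbitrary example):
#!/bin/sh
# Incremental JPG-to-PDF conversion: only files newer than the stamp get reprocessed.
stamp=/path/to/files/.mogrify-stamp
if [ -e "$stamp" ]; then
    find /path/to/files -iname '*.jpg' -newer "$stamp" -exec mogrify -format pdf {} +
else
    find /path/to/files -iname '*.jpg' -exec mogrify -format pdf {} +
fi
touch "$stamp"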
This is an easy way to avoid having to resort to a Makefile (unless you're already using one), and it is another good reason why it is worth using find whenever possible... it has versatile expressiveness while remaining concise.