
I often want to edit the files resulting from find or fd like so:

fd myfile | my_script

In my script, vim would be run with all the files from stdin as arguments, like vim "myfile1" "myfile2". The arguments need to be passed individually, since they may include spaces and other special characters.

I've tried the following:

files=""
while IFS=$'\n' read -r line; do
    files="$files $line"
done

file $files

The example should run file with the resulting file names as arguments. This kinda works, except that whitespace in the file names breaks it, and I've tried quoting the variables multiple ways with no success.

How can I run a command with newline-separated inputs as arguments with spaces?
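The failure mode can be demonstrated directly (a minimal sketch, not from the original post): unquoted expansion of $files word-splits on every space, not just on the newlines that separated the input lines.

```shell
# Two file names, one containing a space, joined the way the loop above does.
files="my file.txt other.txt"

# Unquoted expansion splits on ALL whitespace: three words, not two.
printf '[%s]\n' $files
# [my]
# [file.txt]
# [other.txt]
```

Quoting as "$files" does not help either: that produces a single word containing the whole string, which is why neither quoting style fixes the loop.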

  • 1
    why not use xargs? – LL3 Apr 15 '21 at 11:41
  • 1
    you use mapfile/readarray or put them in an array manually, see https://mywiki.wooledge.org/BashGuide/Arrays and https://www.gnu.org/software/bash/manual/html_node/Arrays.html and https://unix.stackexchange.com/questions/131766/why-does-my-shell-script-choke-on-whitespace-or-other-special-characters – ilkkachu Apr 15 '21 at 11:53
  • Also, forget about the notion of individually quoting the names. You'd only do that if you need to create a shell script with the filenames embedded, and unless you're passing the result through SSH to another instance of the shell, you don't want to do that. What you want to do, is to just process the list of filenames, without them getting mangled. – ilkkachu Apr 15 '21 at 11:55
  • @LL3 xargs eats stdin, which will have unwanted effects on interactive programs like less or vi. e.g. echo foo | xargs vim gives you the warning message Vim: Warning: Input is not from a terminal (and then you have to press ^C to get back to the shell). Command substitution works, though - e.g. vim $(echo foo) – cas Apr 15 '21 at 12:41
  • @cas fair point, I just wondered whether OP knew about xargs at all. BTW, recent GNU's and BSD's xargs have -o to avoid that inconvenience for interactive programs. – LL3 Apr 15 '21 at 13:12
  • @LL3 That -o option sounds interesting and useful, I missed seeing that had been introduced. – cas Apr 16 '21 at 11:58

1 Answer


As you already use bash, use an array.

files=()
while IFS=$'\n' read -r line; do
    files+=( "$line" )
done

file "${files[@]}"

You can also look at xargs; with newline as the delimiter (xargs -d '\n'), the behavior should be similar.
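For the xargs route, a minimal sketch (note that -d is a GNU extension; BSD/macOS xargs may lack it, and for interactive programs like vim you would add -o where available, as the comments above mention). Using printf as a stand-in command makes it easy to see that each name arrives as a single argument:

```shell
# Newline-delimited names, one containing a space, fed to xargs.
# -d '\n' makes xargs split ONLY on newlines, so spaces survive.
printf '%s\n' 'my file.txt' 'other.txt' | xargs -d '\n' printf '[%s]\n'
# [my file.txt]
# [other.txt]
```

In the original use case this would be fd myfile | xargs -d '\n' -o vim.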

RalfFriedl
  • There's almost always a better alternative to a while read loop, or anything involving read, in a shell script. Use mapfile aka readarray; its entire purpose is to populate an array from stdin. By default, it'll use newlines as the separator. For NUL-separated input, use -d $'\0' - e.g. mapfile -t -d $'\0' files < <(find . -type f -iname '*.txt'). In bash, see help mapfile for details. – cas Apr 15 '21 at 13:06
  • damn. that example find command should end with -print0. – cas Apr 15 '21 at 13:14
  • You could use IFS= read -r line. The newline already separates the lines, so can't really act as a field separator inside one. It's not wrong with IFS=$'\n', but IFS= is perhaps a bit more idiomatic, and works even if you change the line delimiter (read -d) – ilkkachu Apr 15 '21 at 13:20
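Putting the comments together, a minimal bash sketch of the mapfile approach (bash 4+; the sample names are illustrative, not from the original post):

```shell
#!/usr/bin/env bash
# mapfile -t reads stdin into an array, one line per element,
# stripping the trailing newline from each (-t). Spaces are preserved
# because no word splitting is performed.
mapfile -t files < <(printf '%s\n' 'my file.txt' 'other.txt')

# "${files[@]}" expands to one word per element, each intact.
printf '[%s]\n' "${files[@]}"
# [my file.txt]
# [other.txt]
```

In the original use case, mapfile -t files then file "${files[@]}" (or vim "${files[@]}") replaces the whole read loop.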