
Can someone please explain to me what is happening here? This is what I have reduced my situation down to:

# make 20 test gifs out of the same source file.
for i in {1..20}; do cp -p ../some-random-source-file.gif "${i}.gif"; done

Grab them, then process them:

while read f; do echo "→ $f"; ffmpeg -i "$f" -y -loglevel quiet "same.mp4"; done < <(find . -maxdepth 1 -type f -iname "*.gif" -printf "%f\n")
→ 11.gif
→ .gif
→ 9.gif
→ .gif
→ 14.gif
→ 9.gif
→ 0.gif
→ 13.gif
→ 7.gif
→ 5.gif
→ 2.gif
→ .gif
→ 3.gif
→ 0.gif
→ 16.gif
→ .gif
→ 8.gif
→ 8.gif
→ .gif
→ 4.gif

I am trying to process all gifs in a directory with mixed files (gifs and non-gifs). But for some reason, as soon as I add the ffmpeg step, the content of the $f variable is sometimes cut off at the beginning.


Extra info:

  • I'm using process substitution because I'm also logging the files that didn't work: if ! ffmpeg …; then FAILED+=("$f"); fi (the fuller loop is sketched after this list).
  • I'm also generating the new filename via string substitution, ${f%.gif}.mp4, but it turns out that part isn't even relevant for the problem to occur.
  • bash version 5.1.16
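
For context, the fuller loop combining those pieces looks roughly like this (it misbehaves the same way as the reduced version above):

while read f; do
    echo "→ $f"
    if ! ffmpeg -i "$f" -y -loglevel quiet "${f%.gif}.mp4"; then
        FAILED+=("$f")
    fi
done < <(find . -maxdepth 1 -type f -iname "*.gif" -printf "%f\n")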

I understand that process substitution can cause timing issues, but then why does it randomly cut off the variable at the beginning and not at the end? And how is anyone supposed to use this construct if it is so unreliable? How else should I do this?

1 Answer


ffmpeg reads from stdin by default, so inside the loop it consumes bytes from the same stream that read is reading; whatever it swallows disappears from the front of the next filename. That's why the names are cut off at the beginning. Add the -nostdin flag:

ffmpeg -nostdin -i "$f" -y -loglevel quiet "same.mp4"
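
With -nostdin, the loop from the question behaves as intended. A minimal sketch that also folds in the failure logging mentioned there (read -r is an extra precaution so backslashes in filenames aren't mangled; it isn't part of the fix):

FAILED=()
while read -r f; do
    echo "→ $f"
    # -nostdin stops ffmpeg from consuming the filenames still queued on the loop's stdin
    if ! ffmpeg -nostdin -i "$f" -y -loglevel quiet "${f%.gif}.mp4"; then
        FAILED+=("$f")
    fi
done < <(find . -maxdepth 1 -type f -iname "*.gif" -printf "%f\n")
(( ${#FAILED[@]} )) && printf 'failed: %s\n' "${FAILED[@]}"

If you ever hit a tool that has no such flag, redirecting its stdin yourself has the same effect: somecommand … < /dev/null.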