In Linux, only two characters are forbidden in filenames: the slash and the null character. So every character with special meaning in any scripting language must be escaped, BUT every escape sequence is also allowed in file names! Even worse, in e.g. bash some escaping methods escape only some characters, so to escape a large set of different characters you must combine several escaping methods, BUT they interfere with each other! Even worse, some commands use certain characters for their own purposes while other commands use others, so for every single simple operation on files you must escape the file name differently! Even worse, only the null character can be used to separate filenames safely, but most commands cannot work with it. Even worse, in Linux basically everything is a file... So this seems not only a nuisance but a matter of security and stability, because a large portion of Linux is script-based and therefore very flawed!
So show me where I'm wrong... Is it even possible to handle all possible file names correctly?
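It is in fact possible, as long as filenames only ever travel NUL-delimited or as intact, quoted shell words. A minimal sketch, using a throwaway directory and made-up file names containing a space, a glob character, a quote, and a newline:

```shell
#!/usr/bin/env bash
# Sketch: arbitrary filenames survive as long as they travel NUL-delimited.
# The directory and file names below are made up for the demo.
set -u
dir=$(mktemp -d)
touch "$dir/plain.mp3" "$dir/with space.mp3" "$dir/star*.mp3" \
      "$dir/quo\"te.mp3" "$dir/new"$'\n'"line.mp3"

count=0
while IFS= read -r -d '' f; do      # -d '' makes read stop at each NUL byte
    count=$((count + 1))
done < <(find "$dir" -mindepth 1 -print0)

echo "$count"                        # 5: every name arrives intact
rm -rf "$dir"
```

The key is that the names are never word-split, never re-parsed, and never passed through an unquoted expansion.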
Clarification: originally I wanted to:

- list files and folders under a given path
- search the list for entries matching given criteria (age, file pattern, or size)
- move matched files and folders into categories, e.g. movies

Because of the complexity of the tests it was not possible (or practical) to do this in one command, so I had to pass file names between different commands. Bash globbing was the first thing to throw out because of spaces in filenames: globbing always split a filename with spaces into two list elements. Then I tried find. That was better, but much slower and harder to use.
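For what it's worth, globbing itself does not split names on spaces; each glob match expands to exactly one word. The splitting appears only when the results later pass through an unquoted expansion. A small sketch showing both behaviours (throwaway directory, made-up name):

```shell
#!/usr/bin/env bash
# Sketch: a glob match is always one word per file; the splitting only
# happens when the result goes through an unquoted expansion afterwards.
dir=$(mktemp -d); cd "$dir" || exit 1
touch "two words.mp3"

for f in *.mp3; do
    echo "glob gave: [$f]"            # glob gave: [two words.mp3]
done

list=(*.mp3)                          # 1 element: "two words.mp3"
unquoted=( ${list[@]} )               # unquoted: re-split into "two" "words.mp3"
echo "${#list[@]} vs ${#unquoted[@]}" # 1 vs 2
cd / && rm -rf "$dir"
```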
I cannot rely on any particular special character to escape file names, because I don't know which characters might appear in a name. After some testing I discovered that it is only a matter of time before any given character occurs.
I tried defining a filter like:
audio_ext=(*.mp3 *.wav *.ogg *.mid *.mod *.stm *.s3m *.it *.wma *.669 *.ac3)
Soon I realized that this way I cannot define filters for multiple uses, because globbing kicks in right away. So I disabled globbing with set -f (and history expansion with set +H). Without globbing I had to do the expansion by hand:
while IFS= read -r -d $'\0'; do
list+=("$REPLY")
done < <( find . -maxdepth 1 -mindepth 1 ${params[@]} -print0 2>/dev/null )
where params is an array like "-iname" "*.mp3" "-o" "-iname" "*.wav" etc.
This worked until a file had "(" in its name; find then returned an error about wrong usage.
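The likely cause is the unquoted ${params[@]}: it lets the patterns be re-expanded and re-split by the shell, and a name containing "(" then confuses find's expression parser. Quoting the expansion, and grouping the -iname alternatives in the parentheses find's own syntax expects around -o, avoids this. A sketch with made-up files:

```shell
#!/usr/bin/env bash
# Sketch: quoted array expansion plus explicit grouping of the -o
# alternatives, as find syntax requires. Directory and names are made up.
set -u
dir=$(mktemp -d)
touch "$dir/song (live).mp3" "$dir/track.wav" "$dir/notes.txt"

# The '(' and ')' are quoted so the shell passes them to find literally.
params=( '(' -iname '*.mp3' -o -iname '*.wav' ')' )

list=()
while IFS= read -r -d '' f; do
    list+=("$f")
done < <(find "$dir" -maxdepth 1 -mindepth 1 "${params[@]}" -print0 2>/dev/null)

echo "${#list[@]}"    # 2: the .mp3 (parenthesis and all) and the .wav
rm -rf "$dir"
```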
To tell the truth... until recently I had used a batch script for this task for 15 years. The time spent writing it was around one or two afternoons. It had drawbacks and an issue with ! in filenames, but generally it worked. Now I have been trying for almost two months to write it in bash. It's ugly, complicated, very buggy, and it seems it will never work well.
Comments:

${audio_ext[@]} should be written as "${audio_ext[@]}". If you use unquoted variable expansions, then yes, you are definitely going to have problems. – Kusalananda Apr 13 '18 at 07:48

Quote the patterns you use with -name as well. – Kusalananda Apr 13 '18 at 08:04

I quoted "*.mp3", "*.wav", etc., but even in a simple case, instead of one or two seconds it takes even minutes! – harvald Apr 13 '18 at 10:12

The @ variable expansion is special. You must quote it for it to work properly. Try n=('ten' 'forty two' 'one hundred'); for a in "${n[@]}"; do echo "> $a <"; done and then compare that without the double quotes around ${n[@]}. For bonus points repeat both attempts with * substituted for @. – Chris Davies Apr 13 '18 at 11:16

Your audio_ext (or, to be precise, what you say you were doing) would never have worked, even with the quotes. It isn't conforming to find syntax. What you have now, with params, still needs to be quoted (to allow things like params=(-iname '*star wars*')). And even then, it's a disaster waiting to happen, because the -o isn't going to do what you want unless you enclose it in parentheses (again, as per find syntax). – Scott - Слава Україні Apr 13 '18 at 21:04
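Putting that advice together, here is a sketch of the whole task: a quoted array expansion, the -iname alternatives grouped in parentheses, NUL-delimited transfer, and the final move into a category directory. All paths and names here are hypothetical:

```shell
#!/usr/bin/env bash
# Sketch combining the comments' advice; the source, destination and file
# names are made up for the demo.
set -u
src=$(mktemp -d); dest=$(mktemp -d)
touch "$src/a b.mp3" "$src/c(1).wav" "$src/keep.txt"

audio=( '(' -iname '*.mp3' -o -iname '*.wav' ')' )

# Collect matches first, NUL-delimited, so no name is ever word-split.
files=()
while IFS= read -r -d '' f; do
    files+=("$f")
done < <(find "$src" -maxdepth 1 -mindepth 1 -type f "${audio[@]}" -print0)

for f in "${files[@]}"; do
    mv -- "$f" "$dest/"     # -- protects against names starting with "-"
done

moved=("$dest"/*)
remaining=("$src"/*)
echo "moved ${#moved[@]}, left ${#remaining[@]}"   # moved 2, left 1
rm -rf "$src" "$dest"
```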