To avoid having to read the whole files (like wc does), since as soon as a second line is read you know a file has at least 2 lines, I'd do (on a GNU system):
    LC_ALL=C gawk -v ORS='\0' '             # NUL-delimited output, for xargs -0
      FNR == 2 {nextfile}                   # a second line exists: skip the rest of this file
      ENDFILE {if (FNR < 2) print FILENAME} # fewer than 2 lines: print its name
      ' /path/File_* |
      xargs -r0 rm -f                       # -r: run nothing if the list is empty
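If you'd rather see what would be removed before committing to it, you can swap the rm for something harmless (a dry run for illustration only; same pipeline otherwise):

    LC_ALL=C gawk -v ORS='\0' '
      FNR == 2 {nextfile}
      ENDFILE {if (FNR < 2) print FILENAME}' /path/File_* |
      xargs -r0 printf 'would remove %s\n'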
That's also more efficient in that it minimises the number of commands being run, and more reliable in that it works with arbitrary file names (the NUL delimiter copes even with names containing newlines).
One functional difference from wc-based solutions: it would not delete files that contain one delimited line followed by one non-delimited one (that is, extra text after the last newline character).
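For example (a throwaway demo file; the path is made up):

    printf 'one\ntwo' > /tmp/File_demo      # one delimited line + one non-delimited one
    wc -l < /tmp/File_demo                  # 1: wc only counts newline characters
    gawk 'END {print FNR}' /tmp/File_demo   # 2: awk also counts the unterminated last line

A wc-based check for a count of 1 or less would therefore remove that file, while the gawk version keeps it.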
As for exit status: the gawk pipeline only returns a non-zero exit status if a file could not be removed (and was there in the first place).
Your problem is that the exit status of that pipeline is the exit status of the right-most command in it (as long as you don't use the pipefail option). The right-most command here is the while loop. The exit status of a loop is that of the last command run in the body of the loop. In your case, that is the [ "$FN" != total ] command run on the last line of the input, which will be non-zero unless there is only one /path/File_* file (in which case wc doesn't print a total line).
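You can see both behaviours with a trivial pipeline (in a shell such as bash that supports the pipefail option):

    false | true; echo "$?"   # 0: only the right-most command's status counts
    set -o pipefail
    false | true; echo "$?"   # 1: now any failing stage fails the whole pipeline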
If you changed it to:

    [ "$CNT" -gt 1 ] || [ "$FN" = total ] || rm -f -- "$FN"

you'd only get a non-zero exit status if the last header file could not be removed.
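For context, here is a minimal sketch of the loop with that line in place (the wc | while read structure is inferred from your description, not quoted from it):

    wc -l /path/File_* | while read -r CNT FN; do
      [ "$CNT" -gt 1 ] || [ "$FN" = total ] || rm -f -- "$FN"
    done
    # still breaks on names containing newlines or leading blanks,
    # which is why the gawk approach above is more reliable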