Currently I'm using the following zsh snippet to select small batches of files for further processing:
for f in $(ls /some/path/*.txt | head -2) ; do
    echo unpacking $f
    ./prepare.sh $f && rm -v $f
done
Is there a better alternative to $(ls ... | head -2) in zsh?
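To illustrate one thing that makes me wary of parsing ls (using a made-up file name with a space in it, just for demonstration): the unquoted command substitution splits the output on whitespace, so the loop never sees the real path:

$ touch '/some/path/a b.txt'
$ for f in $(ls /some/path/*.txt | head -2) ; do echo "[$f]" ; done
[/some/path/a]
[b.txt]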
General overview of my task. I'm creating a data set to train a neural network; the details of that ML task are not important here. Creating the data set requires me to manually process a large batch of files, so I've copied them to a separate directory. Then I choose several files more or less at random (the first two from the ls output in this example), call some preprocessing routine, review its results, move some of them into the data set being created, and remove the rest. After this cleanup I run the command above again.
Additionally, I'd like to improve my skills in shell programming and learn something new :)
The order in which these "first" files are chosen does not matter, since all of them will be processed in the end.
In other words, I'm working together with the PC inside a for loop and want it to pause after several iterations and wait for me.
Pseudocode:
for f in /some/path/*.txt ; do
    echo unpacking $f
    ./prepare $f
    if human wants to review ; then
        human reviews and cleans up, and the PC waits
    fi
done
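To make the pseudocode concrete, here is a rough sketch of the kind of pause I have in mind (pausing after every file rather than after a couple, just to show the mechanism); the read -q prompt is only a stand-in for the "human wants to review" check, and prepare and the path are the same placeholders as above:

for f in /some/path/*.txt ; do
    echo "unpacking $f"
    ./prepare "$f"
    # stand-in for "human wants to review": the script simply waits here
    # while I inspect and clean up the generated files; answering anything
    # other than y stops the loop so I can rerun it later
    if ! read -q "REPLY?continue with the next file? [y/n] "; then
        echo
        break
    fi
    echo
done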
The reason for this odd procedure is that preprocessing one "source" .txt file creates several dozen other files; I need to view all of them and select a few samples (usually 1-2) suitable for training a network. I could run
for f in /some/path/*.txt ; do ./prepare $f ; done
but that command would create several hundred files at once, and that amount is overwhelming.
ls has its flaws - better avoid it. – FelixJN Jul 26 '22 at 13:48
I want for f in /some/path*.txt to pause after several iterations and wait while I allow it to continue. – wl2776 Jul 26 '22 at 13:58
"for f in /some/path*.txt to pause after several iterations". Do you want to continue with the next two files etc. later? – Bodo Jul 26 '22 at 14:00