Based on this question: Simultaneously calculate multiple digests (md5, sha256)?
I have a folder that has a large number of files that I want to compute the SHA256 hash for.
I currently use this code segment:
#!/bin/bash
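# launch one background sha256sum per file -- all files at once, with no job limit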
for file in *; do
    sha256sum "$file" > "$file".sha &
done
to compute the SHA256 hashes in parallel, but my computer only has 16 physical cores.
So my question is: how can I use GNU parallel to run this on only the 16 physical cores available on my system, such that once a hash completes, the next file is automatically picked up for hashing?
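For reference, something like the following is what I'm picturing, assuming I'm reading GNU parallel's -j option correctly (a minimal sketch, not tested):

#!/bin/bash
# cap concurrent jobs at 16; parallel starts hashing the next file
# as soon as one of the 16 job slots frees up
parallel -j 16 'sha256sum {} > {}.sha' ::: *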
Yeah, some of the folders I was processing contained more than 200 files, so the script above would try to run 200+ processes on a 16-core system (Hyper-Threading has been disabled).
It'll run, eventually, but it's massively oversubscribed at the start, so I was hoping it would cycle through the files instead of swamping the CPU.
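In case it's useful, here's a rough pure-bash sketch of the cycling behavior I'm after, assuming bash 4.3+ for wait -n (untested):

#!/bin/bash
# keep at most 16 hash jobs in flight; when one finishes, start the next
max_jobs=16
for file in *; do
    while (( $(jobs -r -p | wc -l) >= max_jobs )); do
        wait -n   # block until any one background job exits
    done
    sha256sum "$file" > "$file".sha &
done
wait   # let the remaining jobs finish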
Thank you.
– alpha754293 Mar 10 '21 at 05:39