This blog post compares several ways of deleting big directories, with relative timings for deleting a directory containing 500,000 files. The Perl script is reported to be the fastest:
cd yourdirectory
perl -e 'for(<*>){((stat)[9]<(unlink))}'
This answer is also posted on Unix & Linux Stack Exchange here. However, the script only deletes files one level under the directory. Can anyone provide an equivalent Perl
script that deletes all subdirectories and their contents recursively?
rm -Rf? – Jan Oct 06 '15 at 19:07

What is the stat call doing, and why is it in a boolean context with the unlink?? (A more legible Perl answer would quite possibly involve File::Path qw/remove_tree/.) – thrig Oct 06 '15 at 19:47

rm -rf can fail with the error "argument list too long" for a huge number of files. – cuonglm Oct 07 '15 at 01:34

Are you saying that rm -rf directory could fail if there are too many files in the directory? – Jeff Schaller Oct 07 '15 at 01:38

rm -rf * – cuonglm Oct 07 '15 at 01:42
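A note on that last exchange: the "argument list too long" error comes from the kernel's limit on exec arguments when the shell expands *, not from rm itself; rm -rf on the directory name passes a single argument and never hits that limit. A small sketch of one workaround, with placeholder names:

```shell
# rm -rf *  can fail: the shell expands * into one argument per entry,
# and execve() rejects argument lists larger than ARG_MAX.
# rm -rf yourdirectory  passes a single argument, so it is unaffected.
# printf is a shell builtin (no execve), so this batches the names safely:
printf '%s\0' * | xargs -0 rm -rf
```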