
This blog compares several ways of deleting big directories and gives their relative times for a directory containing 500,000 files. The Perl one-liner is reported to be the fastest:

cd yourdirectory
perl -e 'for(<*>){((stat)[9]<(unlink))}'

This approach is also reported on Unix & Linux Stack Exchange here. However, the script above only deletes files one level below the directory. Can anyone provide an equivalent Perl script that recursively deletes all subdirectories and their contents?
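
(One possible direction, untested and not benchmarked against the blog's numbers: the core File::Path module ships a remove_tree function that handles the recursion. A sketch only, with yourdirectory as a placeholder:)

# remove_tree descends into yourdirectory and unlinks everything under it;
# keep_root => 1 leaves the directory itself in place, matching what the
# one-liner above does with plain files. Run this from the parent directory.
perl -MFile::Path=remove_tree -e 'remove_tree(shift, { keep_root => 1 })' yourdirectory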

Abhinav
  • What's wrong with using rm -Rf? – Jan Oct 06 '15 at 19:07
  • agreed with Jan - the blog article referenced in the question seemed to give up too easily when "rm -f *" failed. cd .. and rm -rf the directory. – Jeff Schaller Oct 06 '15 at 19:40
  • What is that stat call doing, and why is it in a boolean context with the unlink ?? (A more legible Perl answer would quite possibly involve File::Path qw/remove_tree/.) – thrig Oct 06 '15 at 19:47
  • 1
    @thrig: Read this for more details. – cuonglm Oct 07 '15 at 01:33
  • 1
    @JeffSchaller: rm -rf can fail with error argument list too long for huge files. – cuonglm Oct 07 '15 at 01:34
  • I saw that @cuonglm, which is why I was unimpressed when the article didn't simplify with "cd ..; rm -rf the_directory" (only one argument to rm) -- or did you mean to imply that rm -rf directory could fail if there are too many files in the directory? – Jeff Schaller Oct 07 '15 at 01:38
  • @JeffSchaller: Ah, I mean rm -rf * – cuonglm Oct 07 '15 at 01:42

0 Answers