
I have a directory which contains millions of small files (output from a big data program). I need to delete those files, but when I run the standard rm * I get:

zsh: sure you want to delete more than 100 files in data/output [yn]? y
zsh: argument list too long: rm

The files all have the same prefix, with a unique number appended, like this:

data-12343
data-12344
... etc

So I can't even use a glob pattern to delete the files piecemeal. I'm looking for advice and tips on how to do this efficiently and in an automated way.

Thanks.

1 Answer


Use xargs to run your rm command. It will invoke rm repeatedly, passing as many arguments as the system allows each time, until it has worked through everything your original * would have covered.
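
For instance, here is a minimal sketch assuming the files are still in data/output (the path from your error message) and share the data- prefix; adjust the path and pattern to your layout:

# find prints each matching name terminated by a NUL byte;
# xargs -0 reads those names and runs rm in batches that fit the argument-length limit
find data/output -maxdepth 1 -type f -name 'data-*' -print0 | xargs -0 rm

This sidesteps "argument list too long" because the file names are streamed over a pipe rather than expanded onto a single command line.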

In the words of the man page:

The command line for command is built up until it reaches a system-defined limit (unless the -n and -L options are used). The specified command will be invoked as many times as necessary to use up the list of input items. In general, there will be many fewer invocations of command than there were items in the input. This will normally have significant performance benefits. Some commands can usefully be executed in parallel too; see the -P option.
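
If your xargs supports them, the -n and -P options mentioned above can be combined; the batch size and job count below are arbitrary examples, and since deleting files is mostly disk-bound, running rm in parallel may not help much:

# pass at most 1000 names to each rm invocation, running up to 4 rm processes at once
find data/output -maxdepth 1 -type f -name 'data-*' -print0 | xargs -0 -n 1000 -P 4 rm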

Jeff