Update: all the target files have been deleted. Of course, when I try to run any of the deletes again, the files aren't there to delete. Sorry to take up your time.
I'm using bash on Cygwin.
I have the output of fdupes in a file. I'm grepping the output to exclude a directory I want to keep intact, and I want to delete the rest of the files listed.
I have some entries with spaces:
./NewVolume/keep/2009/conference/conference/conference 004.jpg
Which trips up xargs:
$ cat real-dupes.txt |xargs rm {}
...
rm: cannot remove ‘2009/conference/conference/conference’: No such file or directory
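That failure is consistent with how xargs tokenizes its input: by default it splits on any whitespace, so a path containing a space reaches rm as two separate arguments. A minimal demonstration with a made-up path (the -n1 flag makes xargs run the command once per parsed argument, which exposes the split):

$ printf '%s\n' './conference 004.jpg' | xargs -n1 echo
./conference
004.jpg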
When I try the -0 switch, it looks like the lines get globbed together:
$ cat real-dupes.txt |xargs -0 rm
xargs: argument line too long
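As I understand it, -0 makes xargs split on NUL bytes rather than whitespace, and since real-dupes.txt is newline-delimited there are no NULs to split on, so the whole file becomes one oversized argument, which would explain the error above. A sketch of a workaround, assuming no filename contains an embedded newline, is to translate the newlines into NULs first:

$ tr '\n' '\0' < real-dupes.txt | xargs -0 rm --

(The -- guards against any path that happens to begin with a dash.)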
Other questions have answers where the asker is advised to use find to feed the arguments into xargs. That's not helpful in my scenario, because I don't believe that I can easily use find to identify the duplicates I want to get rid of. Also, the fdupes job ran some 12+ hours, so I really want to use this data set.
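For reference, the pattern those answers usually give pairs find's -print0 with xargs -0 so the paths stay NUL-delimited end to end; the -name test here is only a placeholder:

$ find . -name '*.jpg' -print0 | xargs -0 rm --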
As far as I know, fdupes cannot exclude a directory from its automated delete, so I can't use it out of the box, either.
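Putting the grep exclusion and the delimiter handling together, a sketch of the full pipeline might look like the line below; ./NewVolume/keep is a stand-in for the directory being preserved, and GNU xargs' -d '\n' (Cygwin ships GNU findutils) treats each input line as exactly one argument, spaces included. This assumes real-dupes.txt holds one path per line with no blank lines (fdupes separates its groups with blank lines, so those may need filtering out):

$ grep -v '^\./NewVolume/keep/' real-dupes.txt | xargs -d '\n' rm --

Swapping rm -- for echo rm -- first gives a harmless preview of exactly what would be deleted.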
You could try sed 's/ /\\ /g' real-dupes.txt | xargs rm to escape the spaces. This won't handle the many other special characters that can be present in filenames, though; for some examples: * $ <tab> " '. See this question for more details. – Wildcard Dec 29 '15 at 23:04

rm: cannot remove ‘2009/photos’: No such file or directory -- there may be other characters that are problematic, but the majority of the errors thrown are from this apparent space. Pretty sure I used the normal space bar to name these files, although some originated on NTFS filesystems. – user394 Dec 29 '15 at 23:07

So the sed solution deleted the files that remained and had spaces in them? – Wildcard Dec 29 '15 at 23:20