I've moved a lot of files and folders from my old NAS to my new SMB share. The problem is that someone made complete copies of nearly every folder on that thing, and I need the space.
I'm currently running fdupes in a screen session and outputting everything to a file.
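The invocation is essentially this (the share path and output file name are placeholders):

    screen -S dedupe                        # keep the job alive if the connection drops
    fdupes -r /mnt/share > duplicates.txt   # recurse and save each group of duplicate files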
But this will give me a huge list of duplicate files, which I would then have to condense into duplicate folders myself.
Are there any existing utilities which can find matching folders? Or can someone suggest a shell script that might do the job?
I'm running Ubuntu 14.04.
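The rough approach I've considered is an untested sketch like the one below: fingerprint each directory by the sorted checksums of the files directly inside it, then report directories whose fingerprints collide (/mnt/share is a placeholder, and md5sum is just a convenient hash here):

    # Untested sketch: compares only the files directly inside each
    # directory, not whole subtrees, and assumes no newlines in names.
    # Note: all empty directories will collide with each other.
    find /mnt/share -type d | while read -r dir; do
        # Hash the content of every file in this directory, sort the
        # hashes (so file names and order don't matter), and hash that
        # list to get a single fingerprint for the directory.
        fp=$(find "$dir" -maxdepth 1 -type f -exec md5sum {} + 2>/dev/null |
             awk '{print $1}' | sort | md5sum | awk '{print $1}')
        printf '%s %s\n' "$fp" "$dir"
    done | sort | uniq -D -w32   # print every directory whose 32-char fingerprint repeats

Even a one-level compare like this would shrink the problem from a million duplicate files to a much shorter list of candidate folders.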
What does 'there' (in "there you have to specify") refer to? – Anthon May 28 '16 at 12:55

I'm running fdupes -r -o output.txt right now, and the result is that I have a file containing the locations of 1.06 million duplicate files. That's far too many to work through by hand to find the duplicate folders and remove one copy of each. – blipman17 May 29 '16 at 17:26

Try rmlint --types=dupedirs --progress <path>. It will generate a shell script rmlint.sh which you can then inspect and/or run to delete the duplicate dirs. – thomas_d_j Jun 01 '16 at 23:57
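That rmlint workflow looks roughly like this (a sketch assuming rmlint 2.x, which may need to be installed from a PPA or built from source on Ubuntu 14.04; /mnt/share is a placeholder):

    rmlint --types=dupedirs --progress /mnt/share
    less rmlint.sh   # inspect the proposed deletions first
    sh rmlint.sh     # then run the generated script to remove the duplicate dirs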