Are you hunting for duplicate files to save disk space?

If you are like me, you end up with a mess of a file system, with duplicate files everywhere.

The “I’ll copy this here because I’m working on it, there for backup, and there because I might need it for another project” mentality creates too many useless duplicates.

As an example, my Dropbox folder had approximately 5000 duplicate files occupying 2GB. Keeping just one copy of each would free roughly 1GB of that. Ouch!

Enter dupd, a fast CLI (command-line) tool for finding duplicate files. It is both a time saver and a space saver. My usual workflow for finding duplicates is something like this:

dupd scan; dupd report | tail -20
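
This scans the current directory by default. To check some other folder, the scan can be pointed at it explicitly; if I read the dupd documentation correctly, scan accepts a --path option (the Dropbox path here is just an example):

dupd scan --path ~/Dropbox    # scan that folder instead of the current one
dupd report | tail -20        # the biggest offenders are at the end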

dupd sorts the offending files by how much space they waste, so I usually run this a couple of times in the folder I want to check and then delete or symlink whatever I don’t want.
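
For example, if the report shows the same file living in both a project folder and a backup folder, I keep one copy and replace the other with a symlink (both paths below are made up for illustration):

rm ~/Dropbox/backup/thesis.pdf                              # remove the redundant copy
ln -s ~/Dropbox/projects/thesis.pdf ~/Dropbox/backup/thesis.pdf    # point it at the one I kept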

dupd also has two scan modes, one tuned for SSDs and one for traditional HDDs. Check the documentation for those advanced topics.