I’ve got a whole bucket full of old hard drives, CDs, and DVDs, and I’m starting the process of backing up as much of it as still works to a 4TB drive.

It’s gonna be a long journey with lots of files, many of them likely duplicates across some of the drives.

What sorts of software do you Linux users recommend?

I’m on Linux Mint MATE, if that matters much.

Edit: One of the programs I’m accustomed to from my Windows days is FolderMatch, which is a step above simple duplicate file scanning: it scans for duplicate or semi-duplicate folders as well, and breaks down individual file differences when comparing two folders.
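
To show the kind of folder comparison I mean, here’s just a rough sketch using Python’s standard filecmp module (the folder paths are made-up examples, not a recommendation of any particular tool):

```python
#!/usr/bin/env python3
# Rough sketch of FolderMatch-style two-folder comparison using only
# Python's standard library; the example paths below are placeholders.
import filecmp
import os

def report(left, right):
    """Print files unique to each side and files whose contents differ."""
    cmp = filecmp.dircmp(left, right)
    for name in cmp.left_only:
        print(f"only in {left}: {name}")
    for name in cmp.right_only:
        print(f"only in {right}: {name}")
    # shallow=False forces a byte-for-byte check, not just size/mtime.
    _same, differ, trouble = filecmp.cmpfiles(
        left, right, cmp.common_files, shallow=False)
    for name in differ:
        print(f"differs: {os.path.join(left, name)}")
    for name in trouble:
        print(f"could not compare: {name}")
    # Recurse into folders present on both sides.
    for sub in cmp.common_dirs:
        report(os.path.join(left, sub), os.path.join(right, sub))

if __name__ == "__main__":
    report("/mnt/old_drive/Photos", "/mnt/4tb_backup/Photos")
```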

I see I’ve already gotten some responses, and I thank everyone in advance. I’m on a road trip right now, so I’ll be checking out the software you folks recommend later this evening, or as soon as I can anyway.

  • over_clox@lemmy.world (OP)

    Block-level dedupe doesn’t account for random data at the end of the last block. I want byte-for-byte, hash-level file and folder comparison, with the file slack space nulled out. I also want to consolidate all related files into logically organized folders, not just a bunch of random folders titled ‘20250505 Backup Turd’.
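
    For what I mean by byte-for-byte hashing, here’s a rough sketch using just Python’s standard library (the backup path is a made-up example). Hashing the file contents themselves means slack space never enters the comparison:

    ```python
    #!/usr/bin/env python3
    # Minimal sketch: group files under a root by SHA-256 of their contents.
    # The root path and chunk size are placeholder assumptions.
    import hashlib
    import os
    from collections import defaultdict

    def sha256_of(path, chunk_size=1 << 20):
        """Hash a file's contents in 1 MiB chunks so big files don't eat RAM."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def find_duplicates(root):
        """Return {hash: [paths]} for every hash that maps to 2+ files."""
        by_hash = defaultdict(list)
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    by_hash[sha256_of(path)].append(path)
                except OSError:
                    pass  # unreadable file (dying drive, permissions) -- skip it
        return {h: p for h, p in by_hash.items() if len(p) > 1}

    if __name__ == "__main__":
        for digest, paths in find_duplicates("/mnt/4tb_backup").items():
            print(digest)
            for p in paths:
                print("  ", p)
    ```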

    I also have numerous drives with similar folder structures, some just trimmed down to fit smaller drives. I also have archives from friends, based on my original structure from about 10 years ago, but their file system structures have diverged from mine over the years.