I’ve got a whole bucket full of old hard drives, CDs and DVDs, and I’m starting the process of backing up as much as still works to a 4TB drive.

It’s gonna be a long journey with lots of files, and many of them are likely duplicates across the drives.

What sorts of software do you Linux users recommend?

I’m on Linux Mint MATE, if that matters much.

Edit: One of the programs I’m accustomed to from my Windows days is FolderMatch, which is a step above simple duplicate file scanning: it scans for duplicate or semi-duplicate folders as well, and breaks down the individual file differences when comparing two folders.
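A rough stand-in for that kind of folder comparison, using only stock GNU tools (the /tmp/demo paths below are made-up stand-ins for two drives' mount points):

```shell
# Set up two toy folder trees (stand-ins for two old drives)
mkdir -p /tmp/demoA /tmp/demoB
echo "same"    > /tmp/demoA/keep.txt
echo "same"    > /tmp/demoB/keep.txt
echo "old rev" > /tmp/demoA/notes.txt
echo "new rev" > /tmp/demoB/notes.txt

# Folder-level comparison: identical files stay silent; differing and
# one-sided files get listed. diff exits nonzero when trees differ,
# so '|| true' keeps a script running under 'set -e'.
diff -rq /tmp/demoA /tmp/demoB || true

# Duplicate detection across whole trees: hash every file, then print
# groups of byte-identical files (lines sharing the same 32-char md5).
find /tmp/demoA /tmp/demoB -type f -exec md5sum {} + | sort | uniq -w32 -D
```

Dedicated tools like fdupes or rdfind do the hashing step faster and more safely, but the idea is the same.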

I see I’ve already gotten some responses, and I thank everyone in advance. I’m on a road trip right now, so I’ll be checking out the software you folks recommend later this evening, or as soon as I can anyway.

  • solrize@lemmy.world · 1 day ago

    I’m using Borg and it’s fine at that scale. I don’t know if it would still be viable with 100TB or whatever. The initial backup will be kind of slow but it encrypts everything, and deduplicates it too if I’m not mistaken. In any case, it deduplicates the common situation where you back up another snapshot later. Only the differences get written in the second backup. So you can save new snapshots fairly quickly and without much additional space.
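The snapshot dedup described here is, at heart, content-addressed chunk storage: each chunk is keyed by its hash, and a chunk that is already in the repo is stored only once, so a second snapshot costs only its new chunks. A toy Python sketch of that idea (not Borg's real chunker or on-disk format):

```python
import hashlib

class ToyDedupStore:
    """Toy content-addressed store; illustrative only, not Borg's format."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}     # hash -> bytes; each unique chunk stored once
        self.snapshots = {}  # snapshot name -> ordered list of chunk hashes

    def backup(self, name, data):
        hashes = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            h = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(h, chunk)  # duplicate chunks cost nothing
            hashes.append(h)
        self.snapshots[name] = hashes

    def restore(self, name):
        return b"".join(self.chunks[h] for h in self.snapshots[name])

store = ToyDedupStore()
store.backup("monday", b"AAAABBBBCCCC")
store.backup("tuesday", b"AAAABBBBDDDD")  # only the changed chunk is new
print(len(store.chunks))  # 4 unique chunks stored, not 6
```

Borg does this with variable-sized content-defined chunks, which is why later snapshots of mostly-unchanged data are fast and small.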

    • over_clox@lemmy.world (OP) · 24 hours ago

      I don’t even want this data encrypted. Quite the opposite actually.

      This is mostly the category of files getting deleted from the Internet Archive every day. I want to preserve what I got before it gets erased…

      • solrize@lemmy.world · 22 hours ago

        You can turn off Borg encryption but maybe what you really want is an object store (S3 style). Those exist too.
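For Borg 1.x that looks like the sketch below (repo path and archive name are placeholders; check `borg init --help` on your version, since newer releases rename some of these commands):

```shell
# Create an unencrypted repo on the backup drive
borg init --encryption=none /mnt/4tb/borg-repo

# First snapshot is slow; later ones only write new/changed chunks
borg create --stats /mnt/4tb/borg-repo::drives-snapshot-1 ~/rescued-files
```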