I’ve got a whole bucket full of old hard drives, CDs and DVDs, and I’m starting the process of backing up as much as still works to a 4TB drive.
It’s gonna be a long journey with lots of files, many of them likely duplicated across the drives.
What sorts of software do you Linux users recommend?
I’m on Linux Mint MATE, if that matters much.
Edit: One of the programs I’m accustomed to from my Windows days is FolderMatch, which is a step above simple duplicate file scanning: it scans for duplicate or semi-duplicate folders as well, and breaks down the individual file differences when comparing two folders.
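For anyone curious what that folder-level comparison looks like under the hood, here’s a minimal sketch of the idea (not FolderMatch’s actual algorithm, just one plausible approach): hash every file under each folder, keyed by relative path, then diff the two maps to find files unique to either side and files that exist in both but differ.

```python
import hashlib
from pathlib import Path


def file_hash(path: Path) -> str:
    """SHA-256 of a file's contents, read in chunks to handle big files."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def folder_signature(root: Path) -> dict:
    """Map of relative file path -> content hash for every file under root."""
    return {
        str(p.relative_to(root)): file_hash(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }


def compare_folders(a: Path, b: Path):
    """Return (only_in_a, only_in_b, differing) relative paths.

    Two folders are exact duplicates when all three lists come back empty;
    non-empty lists give you the per-file breakdown of a semi-duplicate.
    """
    sa, sb = folder_signature(a), folder_signature(b)
    only_a = sorted(set(sa) - set(sb))
    only_b = sorted(set(sb) - set(sa))
    differ = sorted(k for k in set(sa) & set(sb) if sa[k] != sb[k])
    return only_a, only_b, differ
```

The same folder signatures can also be hashed down to a single digest per folder, which lets you find duplicate folder trees across a whole drive without comparing every pair file by file.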
I see I’ve already gotten some responses, and I thank everyone in advance. I’m on a road trip right now; I’ll be checking out the software you folks recommend later this evening, or as soon as I can anyway.
Not everything is an individual file, though; a lot of the stuff needs to be stored and maintained as whole folders.
I mod operating systems and occasionally games, plus write software. I can’t just dump all text files into a single folder; that would throw every readme.txt into one TXT folder, losing the association with the project folders they came from.
Isn’t all the code in git somewhere? I would totally do that for code projects.
I do the same thing with arduino code so I know where you’re coming from.
Not my code, I didn’t even have internet access when I started programming.
I feel you. I started coding before the internet even existed (well technically it existed, just nobody had access to it)