So here’s the problem I have: several generations of backups are currently taking up huge amounts of space on my NAS. I want to go through and deduplicate all of the files on it, and possibly tag any files I find that are helpful. Is anyone aware of a good tool for this task? Because of the nature of the backups, I don’t want to use any software I’m not running locally.
Thanks in advance.
How are your backups currently stored? Simple copies of the files, like you would make with rsync? I assume you’re on a Linux NAS, in which case fdupes would likely fit the bill. meld would be another option, and it also has a GUI if your NAS isn’t headless.
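If it helps, a typical fdupes session looks something like this (the path is a placeholder; double-check `man fdupes` on your system before deleting anything):

```shell
# Scan recursively and list sets of duplicate files (read-only)
fdupes -r /mnt/nas/backups

# Summarize how much space the duplicates are wasting
fdupes -rm /mnt/nas/backups

# Interactively choose which copy to keep in each duplicate set
# (prompts before deleting anything)
fdupes -rd /mnt/nas/backups
```

Running the read-only scan first is a good sanity check before letting it delete anything.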
For future backups restic might be a nice option, as it deduplicates data each time you run a backup. You can set retention policies (e.g. 7 daily, 4 weekly, 2 monthly, etc.) so that only backups at regular intervals are kept.
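For example, that kind of retention policy would look roughly like this (repo path is a placeholder, and password handling is omitted):

```shell
# Back up a directory into an existing restic repository
restic -r /mnt/nas/restic-repo backup /home/user/data

# Keep 7 daily, 4 weekly, and 2 monthly snapshots; drop and reclaim the rest
restic -r /mnt/nas/restic-repo forget \
    --keep-daily 7 --keep-weekly 4 --keep-monthly 2 --prune
```

The `--prune` flag makes restic actually reclaim the space from the forgotten snapshots in the same run.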
Borg Backup would also fit the bill for backups going forward, especially if OP is still backing up to a local server (as opposed to cloud object storage).
I haven’t tried Borg, but have noticed it mentioned pretty often in data hoarder forums. What do you like about it?
It deduplicates aggressively at the block level. So if your files don’t change much, each additional backup takes very little space. And if a file changes a little, Borg only backs up what’s changed instead of the whole file again.
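As a rough sketch of what a Borg workflow looks like (repo path is a placeholder; see the Borg docs for the encryption options):

```shell
# One-time setup: create an encrypted, deduplicating repository
borg init --encryption=repokey /mnt/nas/borg-repo

# Each run only stores blocks the repo hasn't seen before,
# so unchanged files cost almost nothing
borg create --stats /mnt/nas/borg-repo::backup-{now} /home/user/data

# Retention policy, similar in spirit to restic's
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 2 /mnt/nas/borg-repo
```

The `--stats` output shows the deduplicated size of each archive, which makes the space savings easy to see.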
Borg also has a rich ecosystem of wrappers and tools (borgmatic, Vorta, etc.) that extend its functionality and make it easier to use.
Interesting, sounds like it’s worth checking out. Plus, as a Star Trek fan, I approve of the name 😄
I like Kopia. It has a similar feature set to Borg, but I prefer its UI.