
> I'll agree with the author partly on software such as rustic/duplicacy - they are not a good solution for long-term archiving, nor are they marketed as such.

I self-host a Nextcloud server and run Duplicity each night to back it up to an offsite location. I use incremental backups because I have a data cap. Each month I copy the most recent versions from the Duplicity directory onto a cold hard drive at the same location.
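
In sketch form the workflow is roughly this (the paths, remote URL, and GPG key ID below are placeholders, not my actual configuration):

    # Nightly: incremental Duplicity run against the existing chain.
    import subprocess

    SOURCE = "/srv/nextcloud/data"  # placeholder path
    TARGET = "sftp://backup@offsite.example.com//backups/nextcloud"

    subprocess.run(
        ["duplicity", "incremental", "--encrypt-key", "MY_GPG_KEY_ID",
         SOURCE, TARGET],
        check=True,
    )

    # Monthly, at the offsite location: plain copy of the resulting
    # Duplicity directory onto the cold drive.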

I was surprised to see a couple of people in this thread say not to rely on Duplicity for this. What could I be doing better?




My understanding is that there are two issues. First, modern deduplicating backup software like borg/restic/duplicacy stores data in a repository as unique chunks. That avoids the problem incremental tools like Duplicity have, where long chains of incremental changes build up that are slow to restore and more likely to produce errors on restore. Second, neither deduplicating nor incremental backup solutions are recommended for long-term archiving: both chop your files into lots of little pieces, and the chance of not being able to read the repository 10 years down the road is high. For that reason it's good to also keep a local backup in a simple, standard format like tar/zip or just a plain folder. As an example, see the criticism of Perkeep [1], which is marketed as long-term storage but uses chunking deduplication for no particularly good reason.
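
To illustrate the "simple, standard format" point, a flat tar archive of the data needs nothing but stock tooling to read back years later. A minimal sketch (paths are made up):

    # Flat tar.gz copy of the live data for cold storage; readable with
    # plain tar or Python's standard library alone.
    import tarfile

    with tarfile.open("/mnt/cold/nextcloud-snapshot.tar.gz", "w:gz") as archive:
        archive.add("/srv/nextcloud/data", arcname="nextcloud-data")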

[1] https://perkeep.org


Got it. So perhaps, after the Duplicity files have been incrementally uploaded to the remote datacenter, rather than simply copying those files for the cold-storage backup, I should unpack them and rearchive them into a single flat encrypted archive.
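
Something along these lines, as a sketch (the URL, paths, and key ID are placeholders, and the tar/gpg repack is just one way to produce a flat encrypted archive):

    import subprocess

    REMOTE = "sftp://backup@offsite.example.com//backups/nextcloud"

    # 1. Restore the latest state from the incremental chain.
    subprocess.run(
        ["duplicity", "restore", REMOTE, "/tmp/nextcloud-restore"],
        check=True,
    )

    # 2. Repack it as one flat, encrypted archive for cold storage.
    subprocess.run(
        "tar -cz -C /tmp nextcloud-restore | "
        "gpg --encrypt --recipient MY_GPG_KEY_ID "
        "--output /mnt/cold/nextcloud.tar.gz.gpg",
        shell=True,
        check=True,
    )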


> I should unpack them and then rearchive them into a single flat encrypted archive.

Which is essentially a Duplicity full backup (the kind one would normally do once a month or so).
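
If bandwidth allowed, the usual way to get that periodic full backup is to let Duplicity start a new chain on its own, e.g. (placeholder paths and URL again):

    import subprocess

    # --full-if-older-than makes Duplicity begin a fresh full chain once
    # the existing one is older than the given interval (here, one month).
    subprocess.run(
        ["duplicity", "--full-if-older-than", "1M",
         "--encrypt-key", "MY_GPG_KEY_ID",
         "/srv/nextcloud/data",
         "sftp://backup@offsite.example.com//backups/nextcloud"],
        check=True,
    )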

Deduplication is an additional potential point of failure.


I am hoping to do a Duplicity full backup only rarely because of my data cap. The plan is to unpack the incremental Duplicity archives, and only if that chain fails verification will I do a full backup.
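
Roughly this, as a sketch (same placeholder paths and URL as above; I'm assuming a non-zero exit code from verify signals a failed verification):

    import subprocess

    SOURCE = "/srv/nextcloud/data"
    REMOTE = "sftp://backup@offsite.example.com//backups/nextcloud"

    # Verify the existing chain against the live data; fall back to a new
    # full backup only if verification fails.
    verify = subprocess.run(["duplicity", "verify", REMOTE, SOURCE])
    if verify.returncode != 0:
        subprocess.run(
            ["duplicity", "full", "--encrypt-key", "MY_GPG_KEY_ID",
             SOURCE, REMOTE],
            check=True,
        )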



