I’m trying to find a good method of making periodic, incremental backups. I assume the most minimal approach would be a cron job running rsync periodically, but I’m curious what other solutions exist.
I’m interested in both command-line and GUI solutions.
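For reference, the minimal cron + rsync approach can be made incremental with rsync’s `--link-dest`: unchanged files in each new snapshot become hardlinks into the previous one, so extra snapshots only cost the changed files. A sketch (paths and schedule are hypothetical; assumes GNU coreutils and rsync):

```shell
#!/bin/sh
# Hardlinked incremental snapshots with rsync --link-dest.
snapshot() {
    src="$1"; dest="$2"
    stamp=$(date +%Y-%m-%d_%H%M%S)
    mkdir -p "$dest"
    # Newest existing snapshot (lexical sort == chronological here), if any.
    last=$(ls -1d "$dest"/*/ 2>/dev/null | tail -n 1)
    if [ -n "$last" ]; then
        # Unchanged files are hardlinked against the previous snapshot.
        rsync -a --delete --link-dest="$last" "$src" "$dest/$stamp/"
    else
        rsync -a "$src" "$dest/$stamp/"
    fi
}

# Example cron entry (hypothetical paths), nightly at 02:30:
# 30 2 * * * /usr/local/bin/snapshot.sh /home/alice/ /mnt/backup/alice
```

Pruning old snapshots is then just deleting their directories; the hardlinks keep shared files alive as long as any snapshot references them.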
I don’t. I lose my data like all the cool (read: fool) kids.
I too rawdog linux like a chad
Timeshift is a great tool for creating incremental backups. It’s basically a frontend for rsync, and it works great. If needed, you can also use it from the CLI.
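The CLI usage mentioned above is essentially the GUI workflow as commands (needs root; the comment text and tag below are arbitrary examples):

```shell
# Create a snapshot, tagged as a daily-level one (D).
sudo timeshift --create --comments "before upgrade" --tags D

# List existing snapshots.
sudo timeshift --list

# Interactively restore from a chosen snapshot.
sudo timeshift --restore
```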
I use Borg backup with Vorta for a GUI. Hasn’t let me down yet.
I use Pika Backup, which I think uses Borg. A super good-looking GNOME app that has worked well for me.
This is the correct answer.
Is it just me, or does the backup topic come up every few days on [email protected] and [email protected]?
To be on topic as well: I use the restic + autorestic combo. Pretty simple; I made a repo with a small script to generate configs for the different machines, and that’s it. Storing between machines and B2.
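For anyone curious, an autorestic config in the spirit of the setup described might look roughly like this (backend name, bucket, paths, and schedule are all hypothetical; check the autorestic docs for the exact schema):

```yaml
version: 2

backends:
  b2-offsite:            # hypothetical backend name
    type: b2
    path: my-bucket:backups

locations:
  home:
    from: /home/alice
    to:
      - b2-offsite
    cron: '0 3 * * *'    # picked up when "autorestic cron" is scheduled
```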
I have a bash script that backs all my stuff up to my home server with Borg. My servers have cron jobs that run similar scripts.
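A skeleton of that kind of script could look like the following (repo URL, paths, and retention counts are made-up placeholders; `{hostname}` and `{now}` are real Borg archive-name placeholders):

```shell
#!/bin/sh
# Push this machine's data to a Borg repo on the home server.
export BORG_REPO='ssh://backup@homeserver/./borg-repo'   # hypothetical repo

# Archive name: hostname plus timestamp, expanded by borg itself.
borg create --stats --compression zstd \
    "::{hostname}-{now}" \
    /home /etc

# Thin out old archives so the repo doesn't grow forever.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6

# Example cron entry, nightly at 01:00:
# 0 1 * * * /usr/local/bin/borg-backup.sh
```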
I like rsnapshot, run from a cron job at various useful intervals. Backups are hardlinked and rotated, so disk usage eventually reaches a very slowly growing steady state.
I also use it. A big benefit is that you don’t need any special software to access your backups.
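The rsnapshot setup described above boils down to a short config plus cron entries; a sketch with hypothetical paths and retention counts (note that rsnapshot requires literal tab characters between fields):

```
# /etc/rsnapshot.conf (fields separated by TABs)
snapshot_root	/srv/snapshots/
retain	daily	7
retain	weekly	4
backup	/home/	localhost/

# crontab: rotate daily at 03:00, weekly on Sundays at 03:30
# 0 3 * * *	/usr/bin/rsnapshot daily
# 30 3 * * 0	/usr/bin/rsnapshot weekly
```

The rotated snapshots are plain directory trees, which is why no special software is needed to read them back.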
Exactly like you think. A cron job runs a periodic rsync of a handful of directories under /home. My OS is on a different drive that doesn’t get backed up. My configs are in an Ansible repository hosted on my home server and backed up the same way.
Restic since 2018, both to locally hosted storage and to remote over SSH. I’ve got “stuff I care about” and “stuff that can be relatively easily replaced” fairly well separated, so my filtering rules are not too complicated. I used duplicity for many years before that, and afbackup to DLT IV tapes prior to that.
Used to use Duplicati but it was buggy and would often need manual intervention to repair corruption. I gave up on it.
Now use Restic to Backblaze B2. I’ve been very happy.
Restic to B2 is made of win.
The quick, change-only backups from a single executable intrigued me; the ability to mount snapshots to get at, e.g., a single file hooked me. The wide, effortless support for services like Backblaze made me an advocate.
I back up nightly to a local disk, and twice a week to B2. Everywhere. I have some 6 machines I do this on; one holds the family photos and our music library, and is near a TB by itself. I still pay only a few dollars per month to B2; it’s a great service.
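The restic-to-B2 workflow praised above is only a few commands; bucket name, repo path, and credentials below are hypothetical placeholders (restic reads the B2 keys from the environment):

```shell
export B2_ACCOUNT_ID='...'     # your B2 key ID
export B2_ACCOUNT_KEY='...'    # your B2 application key

restic -r b2:my-bucket:host1 init               # one-time repo setup
restic -r b2:my-bucket:host1 backup ~/documents # incremental, deduplicated
restic -r b2:my-bucket:host1 mount /mnt/restic  # browse snapshots, grab a single file
```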
I run ZFS on my servers and then replicate to other ZFS servers with Syncoid.
Just keep in mind that a replica is not a backup.
If you lose or corrupt a file and you don’t find out for a few months, it’s gone on the replicas too.
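That caveat is why snapshots matter in this setup: syncoid replicates ZFS snapshots, so keeping a retention of snapshots on both ends preserves history, not just the latest state. A sketch with hypothetical pool and dataset names:

```shell
# Point-in-time snapshot; costs almost nothing until data diverges.
zfs snapshot tank/data@manual-2024-01-01

# Replicate the dataset (snapshots included) to another ZFS box over ssh.
syncoid tank/data root@backuphost:backup/data

# In practice sanoid (same project) usually manages snapshot creation
# and pruning on both sides via its policy config.
```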
Pika Backup (a GUI for borgbackup) is a great app for backups. It has all the features you might expect from backup software and “just works”.
I’ve got an SMB server set up with a 12 TB drive; anything important gets put on there.
Edit: fixed spelling
I have scripts scheduled to run rsync on local machines, which save incremental backups to my NAS. The NAS in turn is incrementally backed up to a remote server with Borg.
Not all of my machines are on all the time so I also built in a routine which checks how old the last backup is, and only makes a new one if the previous backup is older than a set interval.
I also save a lot of my config files to a local git repo, the database of which is regularly dumped and backed up in the same way as above.
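The age-check routine described above needs nothing beyond coreutils: record the last successful backup in a stamp file and compare its mtime against the interval. A sketch (stamp path, interval, and the echoed placeholder are arbitrary; assumes GNU `stat`):

```shell
#!/bin/sh
# Skip the backup if the last one is recent enough. The stamp file's
# mtime records when the last successful backup finished.
backup_is_stale() {
    stamp="$1"; max_hours="$2"
    [ -f "$stamp" ] || return 0   # never backed up -> stale
    age=$(( $(date +%s) - $(stat -c %Y "$stamp") ))   # seconds since last run
    [ "$age" -ge $(( max_hours * 3600 )) ]
}

STAMP="${STAMP:-$HOME/.last-backup}"
if backup_is_stale "$STAMP" 24; then
    echo "backup is due"          # real rsync/borg invocation would go here
    touch "$STAMP"                # refresh the stamp on success
fi
```

Machines that are switched off simply catch up the next time they boot and the script runs.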
I use Timeshift. It really is the best. For servers I go with restic.
I use Timeshift because it was pre-installed, but I can vouch for it: it works really well and lets you choose and tweak every single thing in a legible user interface!
GitHub for projects, Syncthing to my NAS for some config files, and that’s pretty much it; I don’t care about the rest.