How I take backups
Two is one and one is none.
This post documents my current backup system, mostly as a reminder for my future self. My laptop is my primary machine; while it usually sits on my desk, it also accompanies me to work a few times a week. Here’s the two-tiered strategy I use to make sure my data is safe.
Tier 1: Clonezilla for Bare-Metal Recovery
My primary, full-system backup solution is Clonezilla. I run it from a bootable USB stick to create a complete disk image (by default it copies only the used blocks, so it’s not quite bit-for-bit, but it captures everything) on an external drive. This ensures that I don’t miss a single configuration file or dependency I might need later. If the laptop’s internal drive ever fails, I can simply swap in a new one, restore the latest Clonezilla image, and be back up and running in a matter of hours.
I don’t encrypt the Clonezilla backups. The external drive is kept physically locked away and is only connected during the backup process. This simplifies things and means there’s no master password for me to forget.
However, Clonezilla isn’t perfect for daily use. It comes with a few drawbacks:
- Its ncurses-based interface can be confusing, and I always worry about accidentally overwriting my source drive instead of the destination.
- Because it creates an identical clone, the image may not work correctly on different hardware. A drive replacement is fine, but restoring to an entirely new laptop could cause compatibility issues. (I can still mount the image and restore individual files, so it’s not a complete deal-breaker.)
- Clonezilla backups are not deduplicated. Since I’m imaging the entire drive, I can only store one or two versions on the external disk before running out of space.
- The backup process is time-consuming, and I can’t use my laptop while it’s running. While this guarantees a consistent state (no files are changing), it also means I don’t perform these backups as often as I should. My last one was four months ago, which is far from ideal.
These limitations led me to my secondary backup solution, which is faster, automated, and encrypted.
Tier 2: Restic for Automated, Incremental Backups
Restic is a modern, command-line backup tool. It creates versioned snapshots of specified directories and stores them in a “repository,” which can be a local folder or a remote system. The real magic is that it compresses, deduplicates, and encrypts all data by default. The initial backup can be slow, but subsequent snapshots are incredibly fast because they only copy the changed data blocks.
My restic repository lives on my Synology NAS, and the backups are sent securely over SFTP.
Setup and Configuration
1. Enable SFTP on the Synology NAS
In the Synology Control Panel, go to File Services → FTP. On that screen, find and enable the SFTP service. Be careful not to confuse it with FTPS: SFTP is file transfer over SSH, while FTPS is FTP over TLS, and they are unrelated protocols. I kept the default port 22. Next, grant your user permission to connect: go to Control Panel → User & Group, edit your user, and in the Applications tab, check the “Allow” box for SFTP.
2. Configure the SSH Client
You should now be able to connect to the NAS. If you have many SSH keys, you might get a “Too many authentication failures” error: the server gives up after your client offers too many keys. The fix is to point SSH at the one identity file it should use for this host and tell it to offer only that one. To make this permanent and simplify the command, I added the following to my ~/.ssh/config file (the key path is an example; use whichever key the NAS accepts):
Host nas
    HostName nas.example.com
    User me
    IdentityFile ~/.ssh/id_ed25519
    IdentitiesOnly yes
With that in place, ssh nas logs in directly.
3. Initialize the Restic Repository
With SFTP access configured, you can now initialize a new restic repository on the NAS. This command is run from the laptop; restic connects over SFTP and creates the necessary directory structure. It will also prompt you to choose a repository password. Everything in the repository is encrypted with that password, and the data is unrecoverable without it, so store it somewhere safe.
restic init --repo sftp:nas:/drop/restic-repo
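Typing the repository URL (and the password) for every command gets old fast. Restic reads both from environment variables, so I export them in my shell profile. The password-file path below is just where I happen to keep mine; the file is plain text containing only the repository password.

```shell
# Tell restic which repository to use and where to read its password,
# so individual commands don't need --repo or an interactive prompt.
export RESTIC_REPOSITORY=sftp:nas:/drop/restic-repo
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/password"
```

With these set, every command below could drop the --repo flag; I keep it in the examples for clarity.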
Common Restic Commands
Here is a list of commands I use most frequently.
Create a Backup
This command backs up my documents directory. Restic can store snapshots from multiple directories (and even multiple machines) in the same repository.
restic backup --repo sftp:nas:/drop/restic-repo ~/Documents/
List Snapshots
After a backup, you can confirm it worked by listing the snapshots in the repository.
restic snapshots --repo sftp:nas:/drop/restic-repo
Compare Snapshots
You can see the exact changes between any two snapshots using their IDs.
restic diff --repo sftp:nas:/drop/restic-repo d56ed9d5 1c100c47
Restore Files
Restic can restore a snapshot in place, but that overwrites the current versions of any files the snapshot contains. That’s a bit too YOLO for my taste, so I always restore to a temporary directory to be safe (restic recreates the original directory structure under the target).
restic restore --repo sftp:nas:/drop/restic-repo --target /tmp latest
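When I only need one file back, restoring the whole snapshot is overkill. Restic’s restore command accepts --include to limit what gets restored; the file path here is a made-up example:

```
restic restore --repo sftp:nas:/drop/restic-repo --target /tmp \
    --include /home/me/Documents/notes.txt latest
```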
Maintain the Repository
Over time, the repository will grow. I periodically run a check to verify its integrity. By default this validates the repository structure and metadata; adding --read-data makes restic verify every data blob as well, at the cost of reading the entire repository over the network.
restic check --repo sftp:nas:/drop/restic-repo
To clean up old snapshots, I use the forget command with a retention policy. The --prune flag is necessary to actually delete the orphaned data and free up disk space.
# This policy keeps 3 daily, 5 weekly, 7 monthly, and 9 yearly snapshots
restic forget --repo sftp:nas:/drop/restic-repo --prune \
--keep-daily 3 --keep-weekly 5 --keep-monthly 7 --keep-yearly 9
I automate the backup and forget commands using a systemd timer, which runs daily.
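For reference, the timer setup looks roughly like this, as systemd user units. The unit names, backup path, and password-file location are illustrative; adjust to taste.

```
# ~/.config/systemd/user/restic-backup.service
[Unit]
Description=Daily restic backup

[Service]
Type=oneshot
Environment=RESTIC_REPOSITORY=sftp:nas:/drop/restic-repo
Environment=RESTIC_PASSWORD_FILE=%h/.config/restic/password
ExecStart=/usr/bin/restic backup %h/Documents/
ExecStart=/usr/bin/restic forget --prune --keep-daily 3 --keep-weekly 5 --keep-monthly 7 --keep-yearly 9

# ~/.config/systemd/user/restic-backup.timer
[Unit]
Description=Run restic backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with systemctl --user enable --now restic-backup.timer. The Persistent=true setting makes systemd run a missed backup as soon as possible after the next boot, which matters on a laptop that isn’t always on at the scheduled time.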
Why Use Both?
This two-tiered system gives me the best of both worlds.
- Clonezilla is my disaster recovery plan. I am certain it backs up my whole drive, so no configuration is ever missed. The USB drive is stored offline, so while it’s disconnected it can’t be touched by accidental deletion or malware.
- restic is my daily, automated safety net. It’s always connected and running, making it easy to perform frequent backups and restore individual files quickly. However, being online makes it more vulnerable to user error or a security breach.
By using two different backup solutions, I’m not putting all my eggs in one basket. If a critical bug were ever discovered in Clonezilla or restic that could corrupt backups, I’d still have an alternative to fall back on.