Just my thoughts on the various backup packages - updated as I evaluate them more.
SpiderOak One
- Super simplistic interface.
- No way to start/stop backups manually, only able to run them on a schedule
- couldn't tell if it archives old versions of files
- proprietary storage server
- Linux install is just untarring a file onto / (NOT the way to go)
IDrive
- No UI for Linux; I don't feel like figuring out the scripts for it
Duplicity
- Slow as dirt
- incremental restores rely on downloading the last full?
- Linux only
Duplicacy
- CLI isn't very self-documenting
- CLI only (on Linux)
- couldn't figure out how to stop it prompting for a username/password
qBackup
- Java
- Just loading the UI used 300MB of memory. Scanning my home directory used over 1GB
- Expensive
- No built in scheduler
GoodSync
- Looks annoying to configure; no UI on Linux
CloudBerry Backup
- no free version
- filenames/paths are not encrypted on the destination
- no SFTP support
- ~300MB of memory used to back up 7GB
- completion and failure emails
- installs all of its directories as 777
- storage layout is the full path and filename as a directory (with a : appended), containing files named by timestamp (so filenames aren't encrypted)
UrBackup
- no crypto
- more enterprisey
- requires a server
- requires opening ports on the client so it can be discovered
CloudBacko
- Java
- refuses to install or run as a non-root user on Linux
- scheduler uses 120MB of memory with nothing configured
Duplicati
- Web UI
- .Net/mono
- "beta"
- 175MB of memory used to back up 7GB
- "proprietary" storage format (but open source)
Arq
- Windows/Mac only
- easy to configure; supports S3/B2/SFTP
- runs a background service to perform backups
Resilio
- primarily a sync tool; it used to be btsync, and I nixed that one a while ago
SyncBackPro
- windows only
- a pain to configure versioning
- couldn't get it to work with Minio (it would list buckets, but then fail to actually back up)
rclone
- simple rsync-like interface for S3/B2
- good for syncing from one 'server' to the cloud (quick sketch below)
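A quick sketch of the usage pattern, assuming a B2 remote has already been set up with `rclone config`; the remote and bucket names here are made up:

```
# list buckets on the (hypothetical) "b2" remote
rclone lsd b2:

# rsync-style one-way sync of a local directory up to a bucket
rclone sync /srv/backups b2:my-backup-bucket --progress
```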
restic
- requires a backup to complete before you can restore
- can back up multiple hosts to the same archive (dedup happens across all of them)
- operates a lot like git - files are referenced by snapshots (see the sketch below)
- able to saturate my 20Mbit uplink with a 7-year-old Intel Atom CPU
- no integrated scheduler
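A minimal sketch of how I expect restic-to-B2 to look; the bucket name, paths, and credentials are all placeholders:

```
# B2 credentials and repository location (placeholders)
export B2_ACCOUNT_ID="000xxxxxxxxxxxx"
export B2_ACCOUNT_KEY="K000yyyyyyyyyyyyyyyy"
export RESTIC_REPOSITORY="b2:my-backup-bucket:fileserver"
export RESTIC_PASSWORD="a-long-repository-passphrase"

# one-time repository initialization
restic init

# back up a directory; other hosts can point at the same repository
# and dedup happens across all of them
restic backup /home

# inspect snapshots and restore the most recent one
restic snapshots
restic restore latest --target /tmp/restore-test
```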
Result
I decided to go with Arq on Windows to just back up to local servers via SFTP. For my fileserver I decided on restic to B2; I tried Duplicati, but it was very slow at backing up. Setting up a cron job to run the backups won't be hard (see the sketch below), plus it will be nice to be able to run different levels of backups on different folders, all going to the same archive.
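Something like this is what I have in mind for the cron side; `restic-env` is a hypothetical wrapper script that just exports the repository variables from the sketch above, and the paths and schedule are only an example:

```
# /etc/cron.d/restic-backup
# hourly backups of /home, nightly backups of the media share
0 * * * *  root  /usr/local/bin/restic-env restic backup /home
30 3 * * * root  /usr/local/bin/restic-env restic backup /srv/media
# weekly cleanup of old snapshots
0 4 * * 0  root  /usr/local/bin/restic-env restic forget --keep-daily 7 --keep-weekly 8 --prune
```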
To provide redundancy I'm going to use rclone to push the local backups to B2, and once they're there, pull them down on the other side. Even though this will cost a bit of money the first time, it shouldn't be too much (maybe $30-40).
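For the redundancy piece, the round trip would look roughly like this (remote and directory names are placeholders):

```
# on the fileserver: push the local backup store up to B2
rclone sync /srv/backups b2:offsite-backups/fileserver

# on the other machine: pull the same data back down
rclone sync b2:offsite-backups/fileserver /srv/backups-mirror
```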