So I’ve been trying to set up a system for GPG-encrypted backups to B2, using tar to preserve permissions and zstd to compress and save space. Everything seems to be working fairly well, but I’m running into the (obvious) problem that encrypting the same file twice produces different ciphertexts - so the SHA-1 won’t match and the file gets re-uploaded even when nothing has changed.
Does anyone have a good idea as to how to deal with this? It’s clear that, for example, the unencrypted tar.zstd.nnnnnn chunks (I use split on the archive) should have the same checksum if the contents haven’t changed. But I wouldn’t be able to check that without downloading the whole file, which is time-, bandwidth-, and cost-intensive.
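One way around the download problem, building on that same observation: record the SHA-1 of each plaintext chunk in a small local manifest at backup time, then on the next run hash the freshly generated chunks and compare against it. Only chunks whose hash differs need to be re-encrypted and re-uploaded. A rough sketch in Python - the manifest filename and the chunk naming are my assumptions based on the split output described above:

```python
import hashlib
import json
from pathlib import Path

# Hypothetical manifest file kept next to the backup scripts.
MANIFEST = Path("chunk-manifest.json")

def sha1_of(path: Path) -> str:
    """SHA-1 of a plaintext chunk, streamed so big chunks don't fill RAM."""
    h = hashlib.sha1()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def changed_chunks(chunk_dir: Path) -> list:
    """Compare fresh plaintext chunks against the manifest from the last run.

    Returns the chunks whose plaintext hash is new or different - only
    these need re-encryption and re-upload. Everything else is unchanged
    by construction, no download required.
    """
    old = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    new, changed = {}, []
    for chunk in sorted(chunk_dir.glob("backup.tar.zst.*")):
        digest = sha1_of(chunk)
        new[chunk.name] = digest
        if old.get(chunk.name) != digest:
            changed.append(chunk)
    MANIFEST.write_text(json.dumps(new, indent=2))
    return changed
```

If you’d rather not keep any local state, B2 also lets you attach custom key/value file info (the `X-Bz-Info-*` headers) at upload time, so the plaintext SHA-1 could be stored alongside each encrypted chunk and read back from a file listing without downloading the content.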
I’d also like to delete the encrypted GPG files from my local cache after they’ve been uploaded - but I’m not sure that’s possible if I also want to regenerate tar files only for directories that have changed, since comparing against the previous backup might require decompressing each chunk and joining the chunks together…again a time- and space-intensive procedure.
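On the local-cache point: if change detection is keyed on hashes of the *plaintext* chunks, recorded before encryption, then the encrypted copies are never needed again once an upload succeeds, and they can be deleted immediately. A minimal sketch of that bookkeeping - `upload` here is just a placeholder for whatever actually pushes the file to B2 (CLI or SDK), not a real API:

```python
import hashlib
from pathlib import Path

def sha1_hex(path: Path) -> str:
    """Streamed SHA-1 of a file on disk."""
    h = hashlib.sha1()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def upload_and_discard(plain_chunk: Path, upload) -> str:
    """Upload the .gpg sibling of a plaintext chunk, then delete it locally.

    Because later runs compare plaintext hashes, the ciphertext never has
    to be kept around or downloaded again. Returns the plaintext SHA-1 so
    the caller can record it (e.g. in a manifest).
    """
    encrypted = plain_chunk.parent / (plain_chunk.name + ".gpg")
    upload(encrypted)              # push ciphertext to B2 (placeholder call)
    digest = sha1_hex(plain_chunk) # record the plaintext identity
    encrypted.unlink()             # local .gpg copy is no longer needed
    return digest
```

The plaintext chunk itself can also go once its hash is recorded; I kept it here only so the hash can be computed after the upload succeeds.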
I’m kind of at a loss here, and any tips would be appreciated.
Ideally, it would use GPG (I don’t want to have to manage yet another set of keys, and I already have a pretty sweet setup going with my YubiKey for both GPG and SSH) and let me specify the keys to use (or import my existing keys for use with restic).
I found this:
I’ve deliberately decided to stay away from GnuPG, although I use it regularly. The user interface is plain bad for new users and hard to use the right way. In addition, this introduces complexity and a dependency that I’d like to avoid. When you’re trying to restore a backup because your hard drive just failed, you wouldn’t want to fiddle around with your gpg setup and find a backup of your key.
So it seems restic doesn’t fit my use case, but thanks for the recommendation!
On another note, how does the rclone crypt backend handle deduplication? Is there some kind of local cache that keeps track of which files have changed since the last backup?