With a Google Workspace business account I have unlimited space and an unlimited number of Team Drives.
For each Team Drive folder, I created a second Team Drive, folder-bck.
A service account ("backup") has read-only permission on the source Team Drive.
The same service account has read + write permission on folder-bck, and no other user has any permission on it.
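In rclone terms the setup looks roughly like this (remote names, the service account path and the Team Drive IDs below are placeholders, not my real values):

```
# rclone.conf sketch - names, paths and IDs are placeholders
[teamdrive]
type = drive
scope = drive.readonly
service_account_file = /opt/rclone/sa-backup.json
team_drive = 0ABCdefGHIjklMN

[teamdrive-bck]
type = drive
scope = drive
service_account_file = /opt/rclone/sa-backup.json
team_drive = 0ZYXwvuTSRqponM
```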
Via crontab I run one or more syncs per day.
The sync uses several options, including --backup-dir so that old versions of files are not lost (ransomware protection); a sketch of the setup is below.
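The script called from cron looks more or less like this (remote names, paths and the schedule are just examples; Google Docs and Sheets are exported on the fly, with the format tunable via --drive-export-formats):

```
#!/bin/bash
# /usr/local/bin/drive-backup.sh - sketch, remote and path names are examples
set -euo pipefail

DATE=$(date +%Y-%m-%d)

# mirror the source Team Drive into folder-bck, moving files that were
# changed or deleted into a dated archive folder instead of losing them
rclone sync teamdrive: teamdrive-bck:current \
  --backup-dir "teamdrive-bck:archive/$DATE" \
  --drive-server-side-across-configs \
  --log-level INFO \
  --log-file /var/log/rclone-backup.log
```

A single crontab line such as `30 2 * * * /usr/local/bin/drive-backup.sh` then runs it once a night.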
I would like to ask your opinion on the options used and on the data loss prevention policy.
Is this the best solution to keep data in sync, including syncing Google Sheets and Google Docs files?
imho, for myself, i need multiple backups in multiple locations: a local server and multiple cloud providers.
i use cloud providers that support immutable locks, versioning, and MFA delete.
i also use --backup-dir but that just moves files around inside the same account. so if that account is compromised, all the files are at risk.
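for example, on a provider that speaks the s3 api, the versioning, object lock and mfa delete mentioned above can be switched on with the standard aws cli, roughly like this (bucket names, region and mfa serial are made up):

```
# sketch - bucket names, region and MFA serial are placeholders

# immutability (object lock) has to be enabled when the bucket is created
aws s3api create-bucket --bucket backups-locked --region us-east-1 \
    --object-lock-enabled-for-bucket
# default retention: nothing in the bucket can be deleted or overwritten for 30 days
aws s3api put-object-lock-configuration --bucket backups-locked \
    --object-lock-configuration \
    '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}}'

# versioning + MFA delete on another bucket (MFA delete can only be set by the root account)
aws s3api put-bucket-versioning --bucket backups-versioned \
    --versioning-configuration Status=Enabled,MFADelete=Enabled \
    --mfa "arn:aws:iam::123456789012:mfa/root-account 123456"
```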
seems like all your data and backups are inside google.
if your google account login was compromised, then all your data is at risk.
the hacker could lock you out in a few minutes and ransomware your data.
i use the combination of
wasabi, an s3 clone known for hot storage, for recent backups and fast recovery.
aws s3 glacier deep archive for older backups at about $1.00/TB/month.
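in rclone terms that combination looks roughly like this (remote names, buckets and paths are made up):

```
# sketch - remote names, buckets and paths are placeholders

# recent backups go to wasabi (hot storage, fast restore)
rclone sync /srv/backups wasabi:backups-hot \
    --backup-dir wasabi:backups-hot-archive/$(date +%Y-%m-%d)

# older backups go to s3 glacier deep archive (cheap, slow restore)
rclone copy /srv/backups/monthly s3:backups-cold/monthly \
    --s3-storage-class DEEP_ARCHIVE
```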
I was initially writing the script to copy to S3, but I couldn't make it work with multiple buckets, so I went back to Team Drive. The problem basically remains the same: if they breach the machine that runs the cron job, they still have access to the Google service account JSON and the AWS IAM keys. Encrypting the rclone configuration does not allow scheduled execution, as it would ask for the password every time.
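For what it's worth, rclone can read the config password non-interactively via the RCLONE_CONFIG_PASS environment variable or --password-command, roughly like the sketch below (paths are just examples); but that only moves the secret to another file on the same machine, so it does not really solve the compromised-machine problem.

```
#!/bin/bash
# sketch - paths are examples; the password file should be readable only by
# the user that runs the cron job
export RCLONE_CONFIG_PASS="$(cat /root/.rclone-pass)"

rclone sync teamdrive: teamdrive-bck:current \
    --backup-dir "teamdrive-bck:archive/$(date +%Y-%m-%d)"

# alternatively, without exporting the variable:
# rclone sync ... --password-command "cat /root/.rclone-pass"
```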