Thanks for taking the time to advise me. I appreciate that.
Part of what led me down the path I describe below is that I have a different use case in mind, for which I might want to download up to, say, half a gig of videos (screencasts) per day from Google Drive to my local SSD(s).
In about 15 years of using Google Workspace, I've "amassed" about 1.5 gigs of data on Google Drive. In other words, I have a very small amount of data on Google Drive. Mostly I create Google Docs and emails consisting entirely of text. Therefore, in my case, 16GB SSDs are actually relatively large.
The drives are SSDs; they are not flash drives. I am aware that flash drives are notoriously unreliable. Instead of using a service such as Backblaze or Wasabi, I figured I would simply store my backups offsite.
With 50 SSDs, I plan to store two drives offsite each week. At two drives per week, 50 drives works out to roughly a six-month supply of weekly backups, which I would rotate.
My plan is to run Google Takeout once a week to back up nearly all of my data from Google Workspace. I imagine I'd probably set up a systemd timer or a cron job to trigger a simple Python script using the PyAutoGUI module to run Google Takeout for me and download the resulting file to each of my SSDs. Yeah, that would be a kludge. But if I wake up in the morning and see that each of my two SSDs has a file such as takeout-20231001T150536Z-001.tgz on it, then I will be able to (more or less) verify that my kludge ran properly.
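To be clear, I haven't written any of this yet, but the morning check I have in mind would be something like the sketch below. The mount points are made-up placeholders, and the freshness window is just a weekly run plus a bit of slack:

```python
#!/usr/bin/env python3
"""Rough sketch: check that a recent Google Takeout archive exists on each SSD.

The mount points are placeholders; adjust to wherever the drives actually mount.
Meant to be run from a cron job or systemd timer after the download step.
"""
import sys
import time
from pathlib import Path

MOUNT_POINTS = ["/mnt/backup-a", "/mnt/backup-b"]  # hypothetical paths
MAX_AGE_DAYS = 8                                   # weekly run plus slack

def newest_takeout(mount: str) -> Path | None:
    """Return the most recently modified takeout-*.tgz under `mount`, if any."""
    archives = sorted(Path(mount).glob("takeout-*.tgz"),
                      key=lambda p: p.stat().st_mtime)
    return archives[-1] if archives else None

def main() -> int:
    ok = True
    now = time.time()
    for mount in MOUNT_POINTS:
        latest = newest_takeout(mount)
        if latest is None:
            print(f"{mount}: no takeout-*.tgz found")
            ok = False
            continue
        age_days = (now - latest.stat().st_mtime) / 86400
        if age_days > MAX_AGE_DAYS:
            print(f"{mount}: newest archive {latest.name} is {age_days:.1f} days old")
            ok = False
        else:
            print(f"{mount}: {latest.name} looks current ({age_days:.1f} days old)")
    return 0 if ok else 1

if __name__ == "__main__":
    sys.exit(main())
```

If that exits non-zero, I could have cron or the systemd timer mail me the output, which is about as much monitoring as this kludge deserves.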
That .tgz archive would contain nearly all of my Google Workspace data. I say "nearly" because Google Takeout doesn't back up Google Apps Script, which seems like an oversight to me. (No, I don't use Git to back up my half-dozen or so tiny Google Apps Scripts. Normally, I simply copy and paste each of them into its own Google Doc. But, obviously, I might forget to do that at some point in the future.)
Frankly, I could simply run Google Takeout daily "and be done with it," because 1.5GB/day * 7 days is only about 10.5GB, which fits comfortably on a 16GB SSD. However, that seems like overkill to me. But I would like daily backups of my data because, although Google Workspace has apparently never lost any of my data, I'm slightly paranoid about losing it.
See, I spend a lot of time and effort writing stuff up. Ideas float into my head, I put them into a Google Doc (or email), and later I often have a hard time remembering them. If, say, I spend a few hours crafting a Google Doc, I normally download it to my local SSD as an .odt file (LibreOffice Writer format) as a backup.
But I'm tired of doing that. Therefore, I think I'll probably set up a systemd timer or a cron job to run rclone daily, backing up all files that have changed within, say, the last 10 days. No, I don't actually need 10 days' worth of files, but my Google Docs files are so small, and my 16GB SSDs so relatively large, that I figured, "Why not?"
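In case it helps with the critique, this is roughly the rclone invocation I have in mind, wrapped in Python since that's what the timer or cron job would call. The remote name gdrive: and the destination path are placeholders, and I'd double-check the flags against the rclone docs before relying on this:

```python
#!/usr/bin/env python3
"""Rough sketch: pull recently changed files from Google Drive with rclone.

Assumes an rclone remote named "gdrive:" has already been set up with
`rclone config`; the destination path is a placeholder for one of the SSDs.
"""
import subprocess
import sys

REMOTE = "gdrive:"                        # hypothetical rclone remote name
DEST = "/mnt/backup-a/drive-incremental"  # hypothetical SSD mount point
MAX_AGE = "10d"                           # only copy files changed in the last 10 days

def main() -> int:
    cmd = [
        "rclone", "copy", REMOTE, DEST,
        "--max-age", MAX_AGE,
        # Export Google Docs/Sheets/Slides in OpenDocument formats,
        # matching my manual .odt habit.
        "--drive-export-formats", "odt,ods,odp",
        "--verbose",
    ]
    result = subprocess.run(cmd)
    return result.returncode

if __name__ == "__main__":
    sys.exit(main())
```

I'd run it by hand a few times first to sanity-check what it actually copies before handing it to a timer.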
So yeah, sure, I read about Borg, Restic, Duplicity, and Duplicati. I actually played around with Kopia, but I eventually realized that these tools move data to Google Drive, not from it. Therefore, they won't help me with the use case described above. And, really, given the very small amount of data I'm looking to back up, why should I even bother with a proper backup tool?
I'm not looking to implement a technically beautiful canonical solution; rather, like most people I simply want to avoid losing data.
I would be glad for you to critique my proposed solution.
Please bear in mind, I'm not a sys admin, er, uh, I mean DevOps guru, nor any sort of engineer whatsoever. I don't even like dealing with this sort of stuff. Unlike 100% of the engineers I've worked with for any significant length of time, who have obviously had their egos tied to the technical solutions they proposed, my ego is not tied to the solution I use.
Therefore, I'd gladly toss out my unconventional, obviously kludgy approach for something "cleaner." I'm not proud of my solution. Frankly, I'd be glad for you to propose something better.