Efficiently update (copy) only very few files to Google drive

Using --max-age is indeed another workaround, thanks for the suggestion. Since I cannot guarantee that a backup runs at least every 24 hours (or at least every x hours), this option is less attractive to me: I would have to set a rather generous --max-age, which would mean that most of the time too many files are considered for upload. Alternatively, I could keep track of the time of the last sync and adjust --max-age accordingly (a sketch of that idea is below). Or, since my script demonstrates that a fast file size and timestamp comparison is possible, I might implement that workaround for production use.
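Just to illustrate the "track the last sync time" idea, here is a minimal Python sketch. It is not my actual script; the state file path, source directory, and remote name are placeholders, and the only rclone-specific parts are the `copy` subcommand and the `--max-age` flag.

```python
#!/usr/bin/env python3
"""Sketch: derive --max-age for rclone from the time of the previous sync."""
import subprocess
import time
from pathlib import Path

STATE_FILE = Path.home() / ".last_rclone_sync"  # hypothetical state file
SOURCE = "/data"                                 # hypothetical source directory
REMOTE = "gdrive:backup"                         # hypothetical remote

def last_sync_age_seconds():
    """Return seconds since the recorded last sync, or None if unknown."""
    try:
        return time.time() - float(STATE_FILE.read_text().strip())
    except (FileNotFoundError, ValueError):
        return None

def main():
    started = time.time()
    age = last_sync_age_seconds()
    cmd = ["rclone", "copy", SOURCE, REMOTE]
    if age is not None:
        # Add an hour of safety margin so files modified while the previous
        # run was in progress are not missed; with no recorded timestamp we
        # fall back to a full scan without --max-age.
        cmd += ["--max-age", f"{int(age) + 3600}s"]
    subprocess.run(cmd, check=True)
    # Record this run's start time only after rclone finished successfully.
    STATE_FILE.write_text(str(started))

if __name__ == "__main__":
    main()
```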

Out of curiosity, how long does your weekly sync (without --max-age) take, and for how many files?