This is a planning question. I am preparing to run a migration with multiple servers, moving multiple on-prem drives to Google Shared Drives.
I will be creating service accounts to run these migrations, with domain-wide delegation so I can impersonate different user accounts to create the files on the target Shared Drives.
The relevant quotas are:
- Google Drive API queries per 100 seconds: 20,000
- Google Drive API queries per 100 seconds per user: 20,000
- Google Drive API queries per day: 1,000,000,000
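To sanity-check those numbers against my setup, here is a rough conversion (the ~10 API calls/s per rclone process is my assumption based on rclone's own rate limiting, and will vary with file sizes and --transfers/--checkers):

```shell
# Per-user quota expressed as a steady rate
PER_USER_QPS=$((20000 / 100))
echo "$PER_USER_QPS qps per impersonated user"   # 200 qps

# 10 servers x 4 rclone processes x ~10 calls/s each (assumed)
TOTAL_QPS=$((10 * 4 * 10))
echo "~$TOTAL_QPS qps total across the fleet"    # 400 qps
```

So even the whole fleet only needs the equivalent of 2 users' worth of per-user quota, if the load is spread across the impersonated accounts.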
I am trying to understand the critical path: will it be the Drive API calls made by rclone's service account, or the number of user accounts I specify with --drive-impersonate?
I have 10 servers, and each server has 6-8 drives to be copied to 6-8 matching Google Shared Drives (per server).
I am thinking of running all 10 servers over the same period, but doing each server's drives in serial.
That would mean rclone running 4 processes per server, x 10 servers.
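For context, each per-server job would look something like the dry-run sketch below. The SA key path, remote names, and impersonated user are placeholders, not real values (drop the leading echo to actually run it):

```shell
# One service-account key per server, impersonating a dedicated
# migration user for that server (placeholders throughout).
SA_KEY=/etc/rclone/sa-server01.json
IMPERSONATE=migrator01@example.com

# Serial over this server's 8 drives; each "gdriveN:" would be a
# configured rclone remote pointing at the matching Shared Drive.
for i in 1 2 3 4 5 6 7 8; do
  echo rclone copy "/mnt/drive$i" "gdrive$i:" \
    --drive-service-account-file "$SA_KEY" \
    --drive-impersonate "$IMPERSONATE" \
    --transfers 4 --checkers 8
done
```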
Would you match service accounts to servers, or use only a couple of SAs?
10 file_admins (fewer or more?)
My thinking is fewer SAs and at least one file_admin per server (so 10 users copying files), but that is just my first thought.
Note: the goal is to estimate the requirements and time for the migration window. I have looked at the time to do one server at different sizes.
Rclone will keep within the 20,000 queries per 100 seconds by itself due to its own rate limiting, so none of these 3 limits will be a problem in practice.
The problem you will hit is the 750 GB/day upload limit. I believe this is per user or service account.
You can work around that in various ways, but using --bwlimit 8M is the easiest. Using more service accounts will help here; to maximise this you'll need 1 SA per running rclone. Note that 8 MiB/s is about 64 Mbit/s, so how many rclones you run needs to take your upload bandwidth into account.
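To put rough numbers on why 8M works (assuming the 750 GB is decimal GB and the cap applies per uploading account):

```shell
# Sustained rate that exactly hits the daily cap
LIMIT_BYTES=$((750 * 1000 * 1000 * 1000))    # 750 GB/day
BYTES_PER_SEC=$((LIMIT_BYTES / 86400))
echo "$BYTES_PER_SEC bytes/s"                # 8680555, i.e. ~8.68 MB/s or ~8.3 MiB/s

# What --bwlimit 8M (8 MiB/s) can upload in a day
DAILY_AT_8M=$((8 * 1048576 * 86400))
echo "$DAILY_AT_8M bytes/day"                # 724775731200, ~724.8 GB, just under the cap
```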
Good points. We are doing a precheck for 400k objects and folders up to 20 levels deep.
I had not factored in the 750 GB/day limit, as they are separate sites but all part of the same primary domain.
Do you know if that limit is domain-centric? That is, if I created multiple GCP projects so that the API key and SA were in different projects, would it be per project? Each site has 8 drives in the order of 100-200 GB each, but yes, 10 sites would be well over 750 GB.
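For my own estimate, a back-of-envelope calculation for the migration window (assuming ~150 GB average per drive, and that the 750 GB/day cap applies per account):

```shell
# Hypothetical totals; real drive sizes are 100-200 GB each
SITES=10; DRIVES=8; AVG_GB=150
TOTAL_GB=$((SITES * DRIVES * AVG_GB))
echo "$TOTAL_GB GB total"                  # 12000 GB

# Days needed at the upload cap with N accounts (ceiling division);
# ignores upload bandwidth, which may be the real bottleneck
ACCOUNTS=10
DAYS=$(( (TOTAL_GB + ACCOUNTS * 750 - 1) / (ACCOUNTS * 750) ))
echo "~$DAYS day(s) against the upload quota alone"   # 2 days
```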