Google Drive for Business - slow

Hi there,

Am I the only one having trouble with Google Drive taking ages to run through the checks? I’m migrating from Amazon Cloud Drive (ACD); it’s a bit more expensive, but as of right now it seems to be unlimited even for a single user. Back with ACD the checks were much faster: even when there were no changes to the source folder, it would run through them in a few minutes, compared to 30-40 minutes now.
I’m on Windows.
The command I use normally is:
./rclone.exe sync --checkers=50 --transfers=25 "\\192.168.10.15\Backup" "gdrive:/Backup" --backup-dir $destinationHistoricFolder --suffix $suffix

$destinationHistoricFolder is just a PowerShell variable pointing to another folder in Google Drive.
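A possible tweak (a sketch only, not something from the original command): --fast-list cuts the number of directory-listing API calls on Google Drive at the cost of more memory, and -vv with --log-file captures what the checkers are doing. The log path is a placeholder.

```powershell
# Sketch of the same sync with extra diagnostics; the log path is a placeholder.
# --fast-list batches directory listings into fewer API calls (uses more memory),
# which often speeds up the checking phase on Google Drive.
./rclone.exe sync --checkers=50 --transfers=25 `
    "\\192.168.10.15\Backup" "gdrive:/Backup" `
    --backup-dir $destinationHistoricFolder --suffix $suffix `
    --fast-list -vv --log-file "C:\logs\rclone-backup.log"
```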

This also means that small files are transferred more slowly. I can see that it takes over 25 minutes before rclone actually starts going through the files…

The second issue, related to the one above, is with another part of the script that syncs one folder in gdrive to another folder in gdrive.
Imagine something like this:
gdrive\Offsite >> I manually add content to this that is ONLY in gdrive
gdrive\Backups\Offsite >> A sync from gdrive\Offsite
gdrive\Backups\HistoricOffsite >> --backup-dir destination

This is the command:
./rclone.exe sync --checkers=50 --transfers=25 $sourcefolder_offsitedata $destinationfolder_offsitedata --backup-dir $destinationHistoricFolder_offsitedata --suffix $suffix_offsitedata

This one can sit there for hours, sometimes 16 or 17 hours!
One more thing: when copying large files, the speed is great, even better than ACD; I can nearly saturate my bandwidth (500 Mbps upload).
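If the long waits are rate-limit retries, throttling the request rate might help. This is a sketch under the assumption that 403 "rate exceeded" errors are the cause; the specific numbers are starting points to tune, not recommendations:

```powershell
# Sketch: same gdrive-to-gdrive sync, capped at ~8 API transactions per second
# (--tpslimit) to stay under Google's per-user rate limit, with -vv so retries
# show up in the output. Checker/transfer counts are lowered for the same reason.
./rclone.exe sync $sourcefolder_offsitedata $destinationfolder_offsitedata `
    --backup-dir $destinationHistoricFolder_offsitedata --suffix $suffix_offsitedata `
    --checkers=25 --transfers=8 --tpslimit 8 -vv
```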

Any help/suggestion is appreciated.
Cheers

You’ll get a better response if you add -vv to the sync and capture a log of what it is doing.

I did, and that showed me that it was going through the checks (and sometimes stalling due to "rate exceeded" errors that were not displayed in the standard output). I lowered --checkers to 25 and I’ve also generated my own API client ID for it. I’m no longer getting 403 errors (for now), and I need to keep testing the gdrive-to-gdrive sync.
In the past I was able to run this script hourly, but I wouldn’t mind running it just twice a day if it completes :slight_smile:
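For anyone else setting up their own API credentials: besides storing them in the rclone config, they can also be passed as flags. A sketch with placeholder values (the paths reuse the illustrative folder layout from earlier in the thread):

```powershell
# Sketch: pass your own Google API client ID/secret per run instead of relying
# on rclone's shared defaults. Both credential values below are placeholders.
./rclone.exe sync "gdrive:/Offsite" "gdrive:/Backups/Offsite" `
    --drive-client-id "123456789.apps.googleusercontent.com" `
    --drive-client-secret "YOUR_CLIENT_SECRET"
```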

I’ve experienced this same behaviour.

At one time I was syncing the same content to both Amazon and Google Drive. Checking on Google Drive took at least twice as long to complete.

The only setting that helped was adding --size-only
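For reference, that would look something like this (a sketch reusing the command shape from earlier in the thread):

```powershell
# Sketch: --size-only compares only file sizes during the check phase, skipping
# modification-time comparison, which reduces per-file work on Google Drive.
./rclone.exe sync --size-only "\\192.168.10.15\Backup" "gdrive:/Backup"
```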

Thanks. I’d rather have it run slower than have it check only the file size, though.

Since you have a Windows system, have you tried Drive File Stream?

Edit: Here's the link

Yep, I’ve applied for it and waiting for an answer. I’m really curious to see how that is going to work.