Google Drive API

Hi everyone.

I’ve set up rclone with my own OAuth client, and when I try to sync two Google Drive remotes I get a 403 Forbidden error after about 1 hour. Does this happen to anyone else, or have I screwed up something in my config?

Regards.

You probably locked your account out, which is what the 403 errors indicate.

Rclone doesn’t play well with Google Drive, as it makes a lot of API calls.

It’s best to reduce the number of concurrent transfers to make sure you don’t get locked out.
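Something along these lines should roughly halve the API call volume versus the defaults (the remote names are hypothetical placeholders, and the values are just a starting point to tune):

    # Hypothetical remote names; halve the default concurrency (4 transfers, 8 checkers)
    rclone sync gdrive-src: gdrive-dst: --transfers 2 --checkers 4 -v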

Thanks.

I’m just using the default values: 4 transfers and 8 checkers.

Are you doing server-side copies? They now seem to be limited to 100 GB a day.

Yes. I’m trying to sync from Google Drive to Google Drive on an f1-micro instance.

Since when has Google limited that?

About 2 or 3 weeks ago. I think the limit for Google Drive to Google Drive (non-server-side copy) is around 10 TB/day as well.

Yup, 10 TB matches my experience as well, which is plenty, tbh.

I don’t have that much data, just ~1 TB, but I’m always getting the 403 after about 1 hour of uploading.

Using 4 transfers and getting 12 MB/s overall.
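As a rough sanity check (plain arithmetic, nothing rclone-specific), one hour at that rate is only about 42 GB, which is well under the daily figures mentioned above:

    # 12 MB/s sustained for one hour, in (approximate) GB
    echo $(( 12 * 3600 / 1024 ))   # prints 42, i.e. ~42 GB before the 403s start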

Howdy folks,

I can second that: I’m actually seeing ~127 GB/day (93h12m02.70s to copy just 493.064 GB) on a ~1 TB total server-side copy from Amazon.

What I see here (in rclone’s ‘-v -v’ log) is that files copy over quite speedily at first (~127 GB in the first hour), then stop copying entirely (the log shows rclone retrying on ‘Error 403: User rate limit exceeded, userRateLimitExceeded’ and no files being copied); then, after almost exactly 24 h, things start moving again (Transfers start incrementing once more) until ~127 GB more are copied over; rinse and repeat.
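If anyone wants to check their own logs for the same pattern, a quick grep over a debug log should bracket the blocked window (this assumes the log was captured with ‘--log-file rclone.log’; the filename is just an example):

    # Count the rate-limit errors, then find when they start and stop
    grep -c 'userRateLimitExceeded' rclone.log
    grep 'userRateLimitExceeded' rclone.log | head -n 1
    grep 'userRateLimitExceeded' rclone.log | tail -n 1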

This is very frustrating and almost incomprehensible: it’s almost as if Google were trying to limit disk space usage on their service. But heck, if they are using copy-on-write (COW) like everyone and their mother these days, these server-side copies should cost them basically nothing in terms of real storage.

Besides being a PITA for us users, it doesn’t make sense… :-/

Cheers,

Durval.

If you are sending data from Amazon Cloud to Google Drive, that is not a server-side copy. :slight_smile:

Server-side copying is when you copy data within the same Google Drive account (see the sketch below).
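To illustrate the difference (the remote and folder names here are hypothetical):

    # Server-side: source and destination are the same Google Drive remote,
    # so the data never leaves Google's servers
    rclone copy gdrive:src-folder gdrive:dst-folder -v

    # Not server-side: data is downloaded from Amazon and re-uploaded to Google
    rclone copy amazon:some-folder gdrive:some-folder -v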

If you are using your own Client ID and secret, don’t. It is better to use rclone’s built-in one for copying, IMO.
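For anyone wanting to try that: if client_id and client_secret are left blank in ‘rclone config’, rclone falls back to its built-in credentials, so the relevant section of rclone.conf would look roughly like this (remote name hypothetical, token elided):

    [gdrive]
    type = drive
    client_id =
    client_secret =
    token = {"access_token":"...","expiry":"..."}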

Hello @Qwatuz

This could be a good tip, and would make sense as long as the limit is per ClientID and not per Google account. I don’t know whether that is the case, but thanks for posting it.

I will create another remote with rclone’s Client ID, test it in parallel, and come back to report here on how it goes.

Cheers,

Durval.

Hello @Qwatuz,

I just did it, and another “rclone copy” running in parallel alongside the one that was rate-limited seems to be rate-limited as well:

    2017/05/16 09:18:39 DEBUG : pacer: Rate limited, sleeping for 16.821866378s (13 consecutive low level retries)
    2017/05/16 09:18:39 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
    2017/05/16 09:18:55 DEBUG : pacer: Rate limited, sleeping for 16.342231109s (14 consecutive low level retries)
    2017/05/16 09:18:55 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
    2017/05/16 09:19:12 DEBUG : pacer: Rate limited, sleeping for 16.797652072s (15 consecutive low level retries)
    2017/05/16 09:19:12 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
    2017/05/16 09:19:29 DEBUG : pacer: Rate limited, sleeping for 16.19383165s (16 consecutive low level retries)
    2017/05/16 09:19:29 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded, userRateLimitExceeded)
    2017/05/16 09:19:32 INFO  : 
    Transferred:      0 Bytes (0 Bytes/s)
    Errors:                 0
    Checks:                 0
    Transferred:            2
    Elapsed time:      3m4.2s

Therefore it would seem that the ~100-130 GB/day limit is per Google account, not per ClientID…

So, in what respect exactly would using rclone’s built-in ClientID be “better” than using one’s own (for server-side copies and the like)?

Cheers,

Durval

I created my own and asked Google to increase the per-user limit a bit, to 100 requests per 3 seconds. I never hit my “API” limits, so it shouldn’t matter whether I use rclone’s or my own. If you use rclone’s, though, you partly share your limits with other users, so you could be in a worse position. In my opinion, a known quantity (my limits) is better than an unknown one (rclone’s shared limit).

Most people have issues with download quotas, not really API limits, from what I’ve seen. You will see the “queries per 3 minutes” quota limit hit in the log, but rclone was designed to back off and push as much through as possible.
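If the default pacing is still too aggressive for an account, rclone exposes a couple of knobs worth knowing about; flag availability depends on your rclone version, and the remote names and values below are just illustrative:

    # Cap the overall API transaction rate and allow more low-level retries
    # before a transfer is marked as failed (--tpslimit exists in newer releases)
    rclone copy gdrive:src gdrive:dst --tpslimit 5 --low-level-retries 20 -v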

Hello @calisro,

Interesting! I tried asking Google for more, but they only upped it to 30 requests/second/user.

How did you manage to increase it to 100?

Thanks in advance for any tips,

Durval.

Sorry, typo on my part. I meant they increased it to 100 per 3-second interval. So really the same as yours.

Hi @calisro,

No problem, and thanks for the clarification.

So even as Google scrooges on the API limits, at least they scrooge consistently :wink:

Cheers,

Durval.
