API Errors "drive.files.copy: 100.00%" unsure why?

So I'm trying to copy the contents of "folder" to "folder2" within the same drive.

I'm using just the basic rclone copy "drive:/folder" "drive:/folder2" -vv -P

I'm getting errors which I assume are from the 750GB upload limit. The problem is I don't have any uploads yet. How long does this take to reset? It's been 12 hours since this started.

2019-07-09 06:53:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 06:53:33 DEBUG : pacer: Rate limited, increasing sleep to 1.949387217s
2019-07-09 06:53:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 06:53:33 DEBUG : pacer: Rate limited, increasing sleep to 2.121534515s
2019-07-09 06:53:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 06:53:33 DEBUG : pacer: Rate limited, increasing sleep to 4.541240062s
2019-07-09 06:53:34 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 06:53:34 DEBUG : pacer: Rate limited, increasing sleep to 8.844748845s
2019-07-09 06:53:36 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
Transferred: 0 / 15.266 TBytes, -, 0 Bytes/s, ETA -
Errors: 0
Checks: 4 / 4, 100%
Transferred: 1 / 9050, 0%
Elapsed time: 2m58.7s
Transferring:

You can only do 750GB per day, and it resets every 24 hours or so. I've heard it's on UTC time, but I'm not sure, as it isn't documented anywhere.
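If the window does reset at UTC midnight (an assumption, since it isn't documented anywhere), a quick shell snippet shows how long until the next one:

```shell
# Seconds remaining until the next UTC midnight, assuming (not confirmed
# by Google's docs) that the upload quota window resets then.
now=$(date -u +%s)
remaining=$(( 86400 - now % 86400 ))
printf 'Next UTC midnight in %dh %dm\n' $(( remaining / 3600 )) $(( remaining % 3600 / 60 ))
```

People report both fixed and rolling windows, so treat this only as a rough guide.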

So all the API errors are just because of the limit being reached?

I didn’t even know I had gone over the limit, as I don’t have 750GB worth of files that successfully transferred.

I’ll give it another day and see how things go

There is an upload quota of 750GB per 24 hours and a download quota of 10TB per 24 hours. The download quota is more apparent, as the error message says so explicitly.

You should also use your own API key if you are not:

https://rclone.org/drive/#making-your-own-client-id

That doesn't affect the daily quotas, though, just the other rate limits, as the default rclone key is definitely oversubscribed.
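Once created, the client ID and secret go into the drive remote's section of your rclone config file (`rclone config file` shows its location). A sketch with placeholder values, not real credentials:

```ini
# ~/.config/rclone/rclone.conf (placeholder values)
[drive]
type = drive
scope = drive
client_id = YOUR_ID.apps.googleusercontent.com
client_secret = YOUR_SECRET
token = {"access_token":"..."}
```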

I’m so confused as to how the download limit was breached. I haven’t been downloading any of the files; I’ve just been moving them from a shared folder to my own drive, both on the same account.

I’m using my own client_id but it doesn’t seem to be helping to correct this issue.

The source folder in your command is 'download' and the target is your 'upload'. So if you've copied 750GB, that's your daily quota.

It's been 24h since this started, it's past 12UTC, and I'm still getting this issue. I'm so confused.

2019-07-09 17:25:27 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:27 DEBUG : pacer: Rate limited, increasing sleep to 1.264975726s
2019-07-09 17:25:28 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:28 DEBUG : pacer: Rate limited, increasing sleep to 2.599576469s
2019-07-09 17:25:28 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:28 DEBUG : pacer: Rate limited, increasing sleep to 4.078556391s
2019-07-09 17:25:28 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:28 DEBUG : pacer: Rate limited, increasing sleep to 8.543298s
2019-07-09 17:25:28 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:28 DEBUG : pacer: Rate limited, increasing sleep to 16.356842719s
2019-07-09 17:25:30 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:30 DEBUG : pacer: Rate limited, increasing sleep to 16.642799346s
2019-07-09 17:25:46 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:25:46 DEBUG : pacer: Rate limited, increasing sleep to 16.308951028s
2019-07-09 17:26:03 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:26:03 DEBUG : pacer: Rate limited, increasing sleep to 16.171367608s
2019-07-09 17:26:19 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:26:19 DEBUG : pacer: Rate limited, increasing sleep to 16.459280386s
2019-07-09 17:26:35 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:26:35 DEBUG : pacer: Rate limited, increasing sleep to 16.336039447s
2019-07-09 17:26:52 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:26:52 DEBUG : pacer: Rate limited, increasing sleep to 16.028837939s
2019-07-09 17:27:08 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:27:08 DEBUG : pacer: Rate limited, increasing sleep to 16.014296301s
2019-07-09 17:27:24 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:27:24 DEBUG : pacer: Rate limited, increasing sleep to 16.875767352s
2019-07-09 17:27:40 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:27:40 DEBUG : pacer: Rate limited, increasing sleep to 16.650434505s
2019-07-09 17:27:57 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:27:57 DEBUG : pacer: Rate limited, increasing sleep to 16.591469929s
2019-07-09 17:28:14 DEBUG : pacer: low level retry 4/10 (error googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded)
2019-07-09 17:28:14 DEBUG : pacer: Rate limited, increasing sleep to 16.534737586s
Transferred: 0 / 1.313 TBytes, 0%, 0 Bytes/s, ETA -
Errors: 0
Checks: 0 / 0, -
Transferred:

Again, the reset is not a perfect science. You are hitting your quota still.

For what it is worth, even with service accounts I seem to be encountering these Drive errors consistently now...

Those are pacer errors, i.e. transactions-per-second throttling. They have nothing to do with the upload or download limits and are completely normal with Google Drive; it's just rclone negotiating an acceptable rate of transactions.

What happens when you let the command run?

Add --tpslimit=4 --tpslimit-burst=40 if you want to eliminate most of the pacer errors.
Stop using -vv. Use -v. It will freak you out less.
Add --drive-server-side-across-configs=true. You shouldn't need it on the same drive but it doesn't hurt.
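Putting those suggestions together, the invocation would look something like this (remote and folder names are from the original post; the flag values are just suggestions):

```shell
rclone copy "drive:/folder" "drive:/folder2" \
  --tpslimit=4 --tpslimit-burst=40 \
  --drive-server-side-across-configs=true \
  -v -P
```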

Thanks for the tip but unfortunately it didn’t help.

2019/07/10 21:32:41 INFO : Starting HTTP transaction limiter: max 4 transactions/s with burst 40
2019-07-10 21:33:10 INFO : Google drive root 'Folder/Files': Waiting for checks to finish
2019-07-10 21:33:10 INFO : Google drive root 'Folder/Files': Waiting for transfers to finish
Transferred: 0 / 762.574 GBytes, 0%, 0 Bytes/s, ETA -
Errors: 0
Checks: 130 / 130, 100%
Transferred: 0 / 210, 0%
Elapsed time: 3m6.3s
Transferring:
Xxxxxxxxx: transferring
Xxxxxxxxx: transferring
Xxxxxxxxx: transferring
Xxxxxxxxx: transferring

I redacted the file names etc., but that’s what the terminal is spitting out.

That's not correct. You also get those errors when you hit your quota for the day. That's the challenge with 403s: they can mean you're making too many transactions per second, and they can also mean you've hit a limit for the day / your quota.

The first flag would make things slower than you need, as the default Google quotas allow 10 tps. Bursting to 40 would do nothing, as the limit is 10 tps, so that isn't needed either. The default pacer values work fine and nothing else is really needed.

That flag allows you to do server-side copies, as they're off by default. Turning it on wouldn't change much other than doing the operation server-side instead of using your local bandwidth to make the copy; it uses the same quota.

If you are running with -vv and still seeing nothing upload, someone or something is using your daily quota. You'd want to figure out what, or your drive isn't unlimited and a limit is being reached.
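One way to check whether something else is eating into the drive (though it won't show the daily upload quota itself, which Google doesn't expose) is rclone's about command:

```shell
# Shows total/used/free storage for the remote; run it twice some time
# apart to see whether usage is growing while you aren't uploading.
rclone about drive:
```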

I stand corrected. I should have said "are not necessarily related to quota limits". I was in a rush and should have taken the time to be more accurate. Didn't mean to step on your toes.

I didn't suggest that it would make things faster or that it should be used permanently. Setting --tpslimit=10 or less generally suppresses extraneous rate limit errors that are related to tps quota, which can make it easier to see more meaningful error messages. And yes 10 is the default quota. If other running processes may be using quota then setting it to something well below 10 (like 3, 4, 5) during testing means, again, you don't get as many extraneous messages. In the OP's last paste the tps errors are absent.

I didn't say it would change much, nor did I suggest that it would change quota. It does, however, as you point out, move the transactions off of your local server. When running tests it can make iterative testing and transfers go a bit faster. On the rare occasion that something in the local setup is impeding download/upload, trying a server-side copy highlights that a local issue exists. It is also not a terrible flag for new users to know about (until/if server-side copy is turned back on as a default).

Hopefully by now the OP's issue has gone away. What prompted me to offer some alternatives was that the progress chart showed 0 transferred. Typically when you hit the 750GB limit there is still some upload activity, even if minimal. Here is an example where I intentionally used all the quota, then repeatedly ran rclone copy. Each time, it would show some upload, not 0. This is the 10th run of rclone copy after the full 750GB limit was hit:

Transferred:        1.377k / 98.693 MBytes, 0%, 16 Bytes/s, ETA 10w16h35m53s
Errors:                 0
Checks:                 3 / 3, 100%
Transferred:            1 / 196, 1%
Elapsed time:     1m23.2s