How do I automate the Gdrive web client's "make a copy" and "rename" workflow used to circumvent download quota limits?

Why doesn't rclone respond this way by default when it runs into download quota exceeded errors?
I have about 1600 files to download from another drive to my own drive, and simply trying to transfer them has not worked for weeks, yet I can still open the web client manually and do this process one set of files at a time. I just wouldn't be able to rename this many files easily, or make sure I've selected the right ones (the 1600 remaining files are part of a 4k+ file directory).

I'm not sure what you are asking, as the APIs have limits and bypassing them is not really a good way to interact with Google.

If you have a rclone related question, I'm sure we can help out.

This isn't an API-related issue, to my understanding. I'm trying to transfer files from a folder I added via "Add to my Drive" into a folder I created on my own drive. That used to work, but over the past few months it has stopped working altogether. I don't get an error from Gdrive about download quotas when this happens; rclone just spins its wheels and never downloads anything, then gives cryptic errors after a few hours and keeps going. It has been a while since I've run it long enough to see the exact error.

Is there a rclone related question in there? What are you running?

rclone.exe copy --dry-run --ignore-existing --size-only --verbose --transfers 6 --checkers 4 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s remoterino:DemFiles remoterino:DeezFiles

rclone version is what?

v1.46, windows/amd64

If you run the same command with -vv to get debug logs, we can see for sure, but I'd surmise you are hitting the 100GB daily server-side copy limit.

You can use --disable copy, which skips the server-side copy; that limits you to 750GB per day instead and uses your own bandwidth.

If you want, grab a log with -vv and share it; that would confirm it.
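Something along these lines should produce a shareable log (the --log-file name is just an example, and I've dropped --dry-run so it actually attempts the copies):

rclone.exe copy -vv --log-file=rclone-debug.log --ignore-existing --size-only --transfers 6 --checkers 4 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s remoterino:DemFiles remoterino:DeezFiles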

Are you saying that each file has server-side copy limits on Google's backend, and that if another user is using these server-side copies, that contributes to the limit total?

I've been running it with -vv for 5 minutes now, and it's still stuck on transferring these rather small files. If it's anything like before, it does this for over an hour before I ever get any sort of error.

Small files are particularly bad for Google Drive, as you can normally only create 2-3 files per second.

The general rules, so far as they go (they are not documented that well and can change), are 750GB of uploads per day and 100GB of server-side copies per day, and those limits are per user.

If you get a log with something in it to ask about, just post it.
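If the small files turn out to be the sticking point, one thing you could experiment with is capping the transaction rate with --tpslimit so you stay under that 2-3 creates per second pacing, e.g. (the value 2 is just a guess, not an official figure):

rclone.exe copy -vv --tpslimit 2 --transfers 4 --checkers 4 remoterino:DemFiles remoterino:DeezFiles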


Here is an excerpt from the log, while we're at it:
2019/05/23 20:12:33 DEBUG : pacer: Rate limited, sleeping for 16.311067344s (50 consecutive low level retries)

2019/05/23 20:12:33 DEBUG : pacer: low level retry 9/10 (error googleapi: Error 503: Backend Error, backendError)

Additionally, this error comes up with many files:
2019/05/23 20:14:11 DEBUG : [file]: Received error: googleapi: Error 500: Internal Error, internalError - low level retry 1/10

As I noted before, you can only do 100GB per day server side. Since you are doing server-side copies on Google Drive right now, switching to --disable copy would move you to the 750GB per day limit instead.

Ah, so those are the errors I should expect when hitting that limit? Has anyone tried contacting the Google team to get more descriptive errors?

Next question: is there a way to run this job so that I can be the first one to download the file when the download quota resets? Or how do download quotas work on Google Drive in the first place?

Google uses a lot of the same errors for different things, so unfortunately they are not always that descriptive.

For the second item, to my knowledge they don't publish or explain exactly when the reset times are or how it works. Mostly, folks wait 24 hours, use flags to limit the bandwidth so they don't cross the barrier, or just let it fail and pick it up the next day.

I personally just let it fail and upload the next day if I happen to need to upload more than 750GB per day.
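If you would rather stay under the barrier than let it fail, --bwlimit is the usual trick: 750GB spread over 24 hours works out to roughly 8-9 MB/s, so something like this should keep you under it (the 8M value is just that back-of-the-envelope estimate, not an official number):

rclone.exe copy --disable copy --bwlimit 8M --ignore-existing remoterino:DemFiles remoterino:DeezFiles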

I'm downloading, in this case. Other people are the ones downloading this particular file and tripping the quota.

Oh, download is 10TB per day, so that would be a heck of a lot of downloading from your friends.

That's crazy. Is there any way to race the so-called friends, in this case?

Update: Leaving rclone running for absolute ages and letting it error out on some files eventually leads to some of them getting through and actually copying. It's completely opaque and never tells you when, but you can just leave it going and hope.
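If you want to leave it retrying on a schedule rather than babysit it, a small .bat loop is one rough way to do it (this is only a sketch; the one-hour wait is arbitrary):

rem re-run the copy periodically; --ignore-existing/--size-only make rclone skip files that already made it over
:loop
rclone.exe copy --ignore-existing --size-only --transfers 6 --checkers 4 --retries 3 --low-level-retries 10 remoterino:DemFiles remoterino:DeezFiles
rem wait an hour (3600 seconds) before trying again
timeout /t 3600
goto loop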