Google Drive Server Side Daily Limit --cant-find-the-flag

I've used this flag in the past, but I cannot seem to find it. I read the docs, and couldn't find what I was looking for there either.

I'm doing a purely server-side copy, not across configs. I'm copying from one folder to another.

In the past I had seen a flag that would take me right up to the 750GB/day limit and, once the limit was hit, wait until the next day and continue the transfer. I can't seem to find it, so for now I'm doing a server-side transfer with --bwlimit 8.5M, but I'm not sure it's obeying this: rclone is reporting a 25MB/s average transfer speed.
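
(For what it's worth, my rough math behind the 8.5M figure, assuming the quota is counted in decimal gigabytes: 750GB/day ÷ 86,400 seconds/day ≈ 8.68MB/s, so 8.5M should keep a full day of copying just under the limit.)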

The main thing I'm looking for is the flag that resumes the transfer if I hit the limit, or prevents me from hitting it in the first place. I can't quite remember what it was; I looked through my past posts and couldn't find the topic I got the flag from.

Is this what you want?

https://rclone.org/docs/#max-transfer-size
https://rclone.org/docs/#cutoff-mode-hard-soft-cautious

I think you are looking for:

 --drive-stop-on-upload-limit
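
As a rough sketch of how those could fit together (gdrive:source and gdrive:dest are placeholders for your own paths, and I'm not certain --max-transfer counts server-side copy bytes in every version, so treat the numbers as an example):

rclone copy gdrive:source gdrive:dest -P --max-transfer 745G --cutoff-mode soft --drive-stop-on-upload-limit

With --cutoff-mode soft, files already in flight are allowed to finish rather than being aborted when the cap is reached.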

Maybe so.

I'm not sure I understand what this does.
From reading the docs, it seems that when it hits the upload limit, the error is treated as fatal and the transfer stops completely, so I'd need to run the command again to continue the copy. Is that right?

I'm trying to copy a semi-large group of files (~30TB) to a new folder within the same drive.

Will I need to run the command daily after hitting the limit? My goal is to copy the whole thing as fast as possible; 745GB/day would do the trick, I believe.

If it's all within the same drive, just use the Web GUI.

If you go through the API, you are limited to 750GB per day, so 30TB at 750GB/day is about 40 days.

I don't know how to copy folders in the Web GUI. I searched Google and found a bunch of answers that didn't work for me.

Sorry to be helpless, but could you help me with copying within the Web GUI?

EDIT: In case it helps, here's the command I'm currently running, which is hitting rate limits.

while read a; do rclone copy "gdrive:$a" "gdrive:Important/$a" -P --fast-list --use-mmap --bwlimit 8.5M --transfers 8 --drive-stop-on-upload-limit; done < important.gdrive

important.gdrive was generated using rclone lsf gdrive: > important.gdrive and I removed a few folders from the list.
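
If re-running daily turns out to be necessary, my rough plan (assuming a second rclone copy run simply skips anything already present in the destination) is to drop that loop into a script and fire it from cron until a run finishes without hitting the limit, e.g.:

# copy-important.sh contains the same while-read loop as above, unchanged
# hypothetical crontab entry; the path and time are placeholders
0 6 * * * /home/me/copy-important.sh >> /home/me/copy-important.log 2>&1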

Oh, that's right, the Web GUI doesn't let you copy a folder.

I guess you can't do that then, and you'd be stuck with the 750GB per day limit.

I guess you are doing a server-side copy? That uses no local bandwidth, which is why --bwlimit wouldn't do anything.

If you share a debug log of the error, that would be great, as it should stop. It usually has something like:

context canceled

on the error lines.

Yes, that's right. Well, that explains why I'm running into this problem.
I think the quick-and-dirty solution will be to just use --disable copy and set --bwlimit 8M so I still have a small amount of headroom left for uploading other stuff. I may ratchet that down some, but we'll see.
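
Concretely, something along these lines (same loop as before; the only changes are --disable copy, which forces the data through my machine so --bwlimit actually applies, and the lower 8M cap; still just a sketch):

while read -r a; do rclone copy "gdrive:$a" "gdrive:Important/$a" -P --fast-list --use-mmap --bwlimit 8M --transfers 8 --disable copy --drive-stop-on-upload-limit; done < important.gdrive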

Thanks for the help. While this isn't [SOLVED] necessarily, I think this workaround will be fine for what I'm doing.
