How to migrate/download 16 TB from GD school account ASAP without getting banned?

You aren't getting 'banned'. You are hitting daily quotas. Banned == your account is removed.

You can upload 750GB per day, so it's best to run with a low --bwlimit and let it run 24/7.
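For example, something along these lines (the source and destination are placeholders; 8.5M is roughly 750GB spread over 24 hours, so tune it to taste):

rclone copy remote: /some/local/dir --bwlimit 8.5M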

The problem with downloading locally is that I'm not getting 300 Mbps, more like 180 Mbps, sometimes 250 but inconsistently, with throttling and stopping... It wouldn't be fast enough to finish before June anyway, let alone if I keep getting limited. Isn't there a better way? Maybe I can add GSuite accounts to get more than 750 GB/day of upload.

If it is only a download to local storage, he can get 10TB per day.

750GB per day was supposed to be for UPLOAD, wasn't it? I have managed to get 1000-1500 GB before getting limited.

I'm not sure what you are trying to do. Download is 10TB per day.

What command are you running? What rclone version are you running? Do you have a log showing what you are talking about?

You can simply run:

rclone copy remote: /some/local/dir

That's all that's really needed.
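If you want a log to see what it's actually doing, something like this works (the log file name is just an example):

rclone copy remote: /some/local/dir -P --log-level INFO --log-file rclone.log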


And make your own API key/client ID if you have not:

https://rclone.org/drive/#making-your-own-client-id
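Once you have the client ID and secret, the usual route is to put them into the remote with rclone config, but you can also pass them as flags for a quick test (the values below are placeholders):

rclone copy remote: /some/local/dir --drive-client-id YOUR_CLIENT_ID --drive-client-secret YOUR_CLIENT_SECRET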

Agree. The newest beta will also multithread downloads, which should be able to saturate any bandwidth you have available.
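If you try the beta, something like this should give you multiple download streams per file (4 is an arbitrary starting point; check the docs for the current flag names):

rclone copy remote: /some/local/dir --multi-thread-streams 4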


I HAD made my own API key/client ID. I even tried changing the OAuth credentials; it didn't help. OK, I'll have to wait now and then try downloading locally via rclone. Thanks

You had your own API key, as in you aren't using it anymore?

Can you share what command you are running, what version of rclone you are running, and a log of the error you are getting now? I can't imagine you downloaded 10TB in 24 hours on a 300Mbps link.

I just saw this and thought I'd throw this in here as reference material for everyone. These are the default quotas on the GDrive API.

[screenshot of the default Google Drive API quota limits]

This is extremely useful, thank you. But please forgive my naive question: how is "one query" defined?

Basically, I have the same problem as the OP, and I decided to sync a directory with 44 files as a test. Somehow, that translated into 148 queries (I checked the metrics on my API key later)... So, although I understand I can control --bwlimit, I don't seem to be able to link bandwidth to the number of queries.

Thanks!

What problem are you trying to solve, though? 148 queries is fine; that's a non-issue.
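That said, if you ever do want to cap the query rate directly rather than the bandwidth, rclone has a --tpslimit flag (transactions per second); something like this, with the value picked arbitrarily:

rclone sync /local/dir remote:dir --tpslimit 5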

I'm trying to do the exact same thing as the OP. The difference is that I have roughly 50 TB on local disk that I want to upload, and my expectation was to be able to set up --bwlimit in a way that I don't exceed the limits set by the API...

Your use case is you want to download locally from your GD?

You don't need to do anything special.

rclone copy GDrive: /local/path

If you had a link that could pull 10TB in 24 hours, you'd want to limit that.

I just re-read the original question and noticed that my case is different. I have the files on a local server and want to upload all 50TB to a Team Drive. I know that I want to control --bwlimit, but I don't know how to relate it to the number of files / size of the upload in a way that gives me an optimal estimate for --bwlimit.

You just use --bwlimit 8M, I think, and let it go. At 750GB per day that's quite a while, ~67 days or so.
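The rough math: 8 MB/s x 86,400 seconds is about 691GB per day, which keeps you just under the 750GB cap. So the upload would look something like this (the local path and remote name are placeholders; newer rclone versions also have --drive-stop-on-upload-limit if you'd rather the run stop cleanly when it hits the daily quota):

rclone copy /local/data TeamDrive: --bwlimit 8M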

--bwlimit won't affect anything but your upload speed. I've spoken to GSuite support on the phone recently, as I migrated 35TB from Dropbox to Drive via rclone sync. They specifically said that it's fine to recursively let it hammer the API. I've also noticed that if you navigate into the API console from the dev console you CAN request an increase in API limits, but it requires a pretty detailed justification.

The fact of the matter is you're worrying about the wrong thing. GDrive has a hard-coded intake limit of 750GB/day. It's non-negotiable even on enterprise accounts. Corporate policy.

My advice: don't worry about the --bwlimit unless it affects you locally, personally. Hit it as hard as you can - they've given their blessing to do so. Do the transfer as a recursive sync to skip existing files and use --size-only to ensure you don't half-copy a file. They won't penalize your account for this action - I specifically asked about this before initiating my migration.
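Concretely, that looks something like this (the remote name and paths are placeholders):

rclone sync /local/data GDrive:backup --size-only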

Nah, folks have gotten the upload limits removed. I believe @calisro has. I haven't bothered to ask as I don't see it as an issue for me.

My response from them was only a month ago. @calisro do you mind elaborating on how you managed this?

Shoot. I think I misspoke. I think he was talking about the API daily limits when I look back at his post. I swear I read about folks getting the upload limits removed but not sure where I saw it.