Rclone copy Gdrive to Gdrive?

Guys, good morning! After much head-scratching, I think I have figured out how to move my files hosted on the company Gdrive over to my GSuite account.

  1. I shared my Gdrive folder, holding about 9 TB, with my GSuite account.

  2. I set up both drives with rclone on a VPS, tested copying from the shared folder to a folder in GSuite, and it worked (a rough sketch of the two remotes is just after this list).

  3. Some doubts have arisen. I have about 18 thousand files, roughly 9 TB in total, and they are encrypted. People say there is a daily limit on copying; how would I deal with that limit, or do I not need to worry about it since I am copying from one drive to the other? Should I also limit the number of files copied simultaneously?
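
For context, here is roughly what the two remotes look like in rclone.conf on the VPS (just a sketch; the remote names Gdrive and Gsuite match the commands below, and the token values are placeholders):

  [Gdrive]
  type = drive
  scope = drive
  token = {"access_token":"...","expiry":"..."}

  [Gsuite]
  type = drive
  scope = drive
  token = {"access_token":"...","expiry":"..."}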

I intend to use the following command:

rclone copy --drive-shared-with-me Gdrive: Gsuite:New

What could I add to improve?
And if the connection drops, can I resume where it left off?

rclone -P copy --drive-shared-with-me Gdrive: Gsuite:New

Note: -P lets you see the progress.
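
On the connection question: if the run is interrupted you can simply re-run the same command; rclone copy skips files that already exist at the destination, so it effectively resumes where it stopped. A sketch with a log file added so you can review what happened afterwards (the log path is just an example):

rclone copy -P --drive-shared-with-me --log-file /root/rclone-copy.log --log-level INFO Gdrive: Gsuite:New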

You can copy cloud to cloud (server-side) at up to 100 GB/day.
If you use your VPS to download and then upload to your new account, you can upload up to 750 GB/day.

Thanks for the suggestion, but how do I limit it to 700 GB daily? That is my doubt, because I can't sit there and stop the transfer by hand. Does rclone have an option to enforce that limit and then resume afterwards?

If you use server-side copy, the limit is 100 GB/day.

If you use your own bandwidth, the limit is 750 GB/day.

Read this thread

If you have 9 TB of bandwidth available, you can use your VPS to download from the original Gdrive and upload to the new Gdrive.
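
On limiting the daily volume: assuming your rclone is new enough to have --max-transfer, you can tell it to stop by itself after roughly a day's quota, or throttle the bandwidth so a full 24 hours stays under the cap (the numbers here are only examples):

rclone copy -P --drive-shared-with-me --max-transfer 700G Gdrive: Gsuite:New

rclone copy -P --drive-shared-with-me --bwlimit 8M Gdrive: Gsuite:New

8 MiB/s works out to roughly 675 GiB over 24 hours, which stays under the 750 GB/day limit. With --max-transfer the run stops once that much has been moved, and re-running it the next day continues from where it left off. If you also want to limit how many files are copied at once, --transfers controls that (the default is 4).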

I chose server-side copy, but I am noticing some errors. Will the retries happen automatically, or will I have to redo them manually afterwards?

05.03.2019 07:40:38 ERROR : Crypt/71t7dpujrnpnmlie6c6qs94dm8/aoa7lacukb5f5f97qkj5m9reo0/a08tkvlg0rp1cbv55bi0c89rr4/6c6l173qee26r9qsurqf0ddjt0/n218rfqd733n986kj37tn8gqeutja45skge1m4pleu5aea03vavikeo8am5uk61t1ls0cpe0qi8n4: Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
Transferred: 0 / 3.762 TBytes, 0%, 0 Bytes/s, ETA -
Errors: 241 (retrying may help)
Checks: 0 / 0, -
Transferred: 11292 / 21304, 53%
Elapsed time: 110h9m12.3s
Transferring:

  • Crypt/71t7dpujrnpnmlie…3rt1t3ff0efhg75kqllhfm: transferring
  • Crypt/71t7dpujrnpnmlie…adehkpjvsogrsvfe6faepg: transferring
  • Crypt/71t7dpujrnpnmlie…sgstf6hpuijp1rnagcp8se: transferring
  • Crypt/aud1eihu8jh78hnf…ug9uummbibicf0d1nqqfsg: transferring

If you are using server side, that error means you most likely ran out of upload quota for the day so retrying wouldn’t help until the next period when the quota resets.
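
If you want it to pick up again on its own after the quota resets, one option (a sketch, assuming a Linux VPS with cron; the schedule, binary path and log path are just examples) is a daily cron entry that re-runs the same copy:

0 4 * * * /usr/bin/rclone copy --drive-shared-with-me --max-transfer 700G --log-file /root/rclone-daily.log Gdrive: Gsuite:New

Since rclone skips files that are already there, each run just continues the job; you may want to make sure the previous run has finished before the next one starts.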

So these errors are nothing to worry about, then? I just wait until the next quota reset?

I mean, those errors mean you didn’t get a file copied so I think they are important, but not much you can do until the quota resets.

So they will be re-sent, then? Will rclone make further attempts on its own, or will I have to do this after the full copy finishes?

By default rclone retries low-level errors 10 times (and the whole run 3 times) before giving up. Once you hit quota though, you could retry 1000 times and it wouldn’t matter since you can’t send anything new.
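
For what it’s worth, the retry behaviour is tunable, and if the 403s come from API request rate rather than the daily quota, slowing the requests down can help (a sketch; the numbers are only examples):

rclone copy -P --drive-shared-with-me --retries 5 --low-level-retries 20 --tpslimit 5 Gdrive: Gsuite:New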

Why not use your own bandwidth? You’ll copy things much faster. You can rent a VPS for $3.99/month or $20/yr and get a 1 Gbps connection with no transfer limit. (vpscheap)

That way you’ll get everything transferred more quickly.
At your current rate, it’ll take 93 days to transfer the whole archive if you use server-side. If you use your own bandwidth, you could do it in 13 days.

You can also just use a Google Compute instance to do it. If you’re just copying data between Google Drives, you’re still in the FREE TIER as long as you use a micro instance. You’re still stuck with the 750 GB/day upload limit though (much higher than the server-side quota limit).

https://cloud.google.com/compute/pricing
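
If you go that route, spinning up the free-tier machine looks roughly like this (a sketch; the instance name, zone and image are just examples, and the free tier only covers specific micro machine types and regions):

gcloud compute instances create rclone-copy --machine-type=f1-micro --zone=us-east1-b --image-family=debian-10 --image-project=debian-cloud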

EDIT: This post contains incorrect information. See Animosity022's post below for the correction.
Below this line is my original post, preserved. Thanks for the good catch, @Animosity022 :grinning:



I would strongly advise against using a Google Compute instance to do this, unless your budget is okay with it. (Why not spend $3.99 with someone like vpscheap?)

Google Cloud Compute will cost you a decent amount of money if you choose to do so.

The Google Always Free tier only includes 1 GB of network egress.

The rate for 1-10TB is $0.11 / GB of network egress. At that rate, you're going to pay just over $1,000 for your "Always Free" Google Compute.

If you go from Google Drive to Google Drive you never ‘egress’, so there is no cost. If you copied from Google Drive to S3, you’d pay.

Here is the link to the relevant pricing table:

https://cloud.google.com/compute/pricing#internet_egress


Hmm, I searched quite a bit for some docs on that, but couldn’t find anything. Could you link something, or explain how we can be sure that Google Drive to Google Drive, using a GCC’s bandwidth (non-server-side copy) doesn’t count against network egress?

I added to my post above.

Data going to Google is FREE.

https://cloud.google.com/storage/pricing#network-buckets

Many thanks to everyone for the suggestions, but the fact is that my daily quota appears to be 750 GB even copying server-side. My biggest doubt is about the errors; I wonder if I can re-send those files after the copy finishes.

2019-05-06 12:05:05 ERROR : Crypt/71t7dpujrnpnmlie6c6qs94dm8/aoa7lacukb5f5f97qkj5m9reo0/mer33hgq3jl2855jljlhon9v8g/vugieoh3e08tlij6jajchfgur0/ri01o4ua0jir45ggotkk17nhaql9v6gdc71b99a9rsei66jsap7u4t435olp6ou39lbv6ar1e870k: Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
Transferred: 0 / 1.776 TBytes, 0%, 0 Bytes/s, ETA -
Errors: 405 (retrying may help)
Checks: 0 / 0, -
Transferred: 18611 / 24042, 77%
Elapsed time: 186h42m50.8s
Transferring:

  • Crypt/71t7dpujrnpnmlie…09i8cknag54ia8ldffsl90: transferring
  • Crypt/71t7dpujrnpnmlie…66qohecg6s1t445lsskf5m: transferring
  • Crypt/71t7dpujrnpnmlie…m8ed0c1fv1e0eur631ecu6: transferring
  • Crypt/71t7dpujrnpnmlie…ohr1qbrrih5puuoro8qs0q: transferring

You can rerun it afterwards and it will fix up anything that was missed. If you’re uploading more than 750 GB a day (or 100 GB server-side), then that is your issue. Also verify you are in fact using your own client_id.
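
On the client_id point: it is worth checking that the remote is configured with your own credentials from the Google API Console rather than rclone’s shared defaults (a sketch of the relevant section of rclone.conf; the values shown are placeholders):

  [Gsuite]
  type = drive
  client_id = 1234567890-example.apps.googleusercontent.com
  client_secret = your-client-secret
  scope = drive
  token = {"access_token":"..."}

Then just re-run the same copy command; anything that errored out this time should be picked up on the next pass.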