Recommendations for ACD->GSuite/Other transfers

Thanks for the heads up!

A direct sync between the two gdrive accounts is much faster, since the local disk is what limits your upload speed.

You can upload to one drive first, then sync the drives directly: once the first drive has a head start, start the direct sync so it runs alongside the upload.
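
A minimal sketch of that staged approach, assuming two remotes named gdrive1: and gdrive2: are already configured in rclone and the data lives in /srv/media (all names here are placeholders):

```bash
# Stage 1: seed the first drive from the local machine / dedicated server
rclone copy /srv/media gdrive1:backup -P

# Stage 2: once gdrive1 has a head start, run the drive-to-drive sync alongside it
rclone sync gdrive1:backup gdrive2:backup -P
```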

How can you direct sync?
I bought an additional GSuite account from eBay today, and so far I have filled both from my dedicated server, one by one.
But it would be great to have only one upload between my dedicated server and the first gdrive, with the second transfer being a sync between the legit gdrive and the eBay gdrive :smiley:

I was doing 500-700MB/s between gdrives on a couple of accounts, but as I said, keep it under 10TB. I have one legit account and one gsuites.org account, and both locked out at exactly 10TB.

Sorry, that was not explained very well. You have to download everything again, which causes traffic. It does not matter on GCE, though.

What I meant by "direct" was using one gdrive account as the source in rclone. That downloads from drive1 into host RAM and immediately uploads it to drive2 again, without any disk interaction.
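
As a rough sketch of that drive-to-drive transfer (remote names and paths are placeholders, and the flag values are just starting points, not settings from this thread):

```bash
# Pull from drive1 and push straight to drive2; the data only passes through
# RAM (each transfer buffers its current upload chunk), no local disk involved.
rclone copy gdrive1:backup gdrive2:backup --transfers 16 --checkers 16 --drive-chunk-size 64M -P
```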

Note: this is only interesting on GCE, since the network bandwidth is far greater than the disk I/O, which is limited to 120MB/s.

Note 2: you get 1GBit/s of bandwidth per vCPU, so it makes sense to add some cores (stop the instance, add cores, start it again). I am using 16 cores and it works like a charm.
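
If you drive this with the gcloud CLI instead of the web console, the resize looks roughly like this (instance name, zone and machine type are placeholders):

```bash
# Stop the instance, switch to a 16-vCPU machine type, start it again
gcloud compute instances stop acd-mover --zone us-central1-a
gcloud compute instances set-machine-type acd-mover --zone us-central1-a --machine-type n1-standard-16
gcloud compute instances start acd-mover --zone us-central1-a
```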

I suppose you do that with https://rclone.org/commands/rclone_sync/

That's also really good to know. Many thanks for the advice in these threads.

So with 16 vCPUs you would get 16GBit/s?

Probably a little below that due to overhead and their backend infrastructure; I was running 5-6GBit/s on an 8-core one.

The docs state that 10GBit/s is the cap on most host systems, so you usually max out around 10GBit/s, I guess.

Ok, so I'm trying the Compute route, but when I try to add a machine with more than 2TB of disk space I get an error saying I can't. Any idea what I'm doing wrong?

Google accepts only 2TB for the system partition. Create an empty disk instead.
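
For reference, a sketch of what "create an empty disk" looks like with gcloud (disk name, instance name, zone and size are placeholders, and the device path inside the VM may differ):

```bash
# Create a blank 5TB persistent disk and attach it to the instance
gcloud compute disks create acd-data --size 5TB --zone us-central1-a
gcloud compute instances attach-disk acd-mover --disk acd-data --zone us-central1-a

# Inside the VM: format and mount the new disk
sudo mkfs.ext4 -F /dev/sdb
sudo mkdir -p /mnt/data
sudo mount /dev/sdb /mnt/data
```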

Right! Tnx, I'll try that.

Tried that, I still get the max 2048GB size limit warning with an empty disk…

http://i.imgur.com/Y678ZjJ.png

I can confirm this :slight_smile:. I did a backup of ~10.5TB recently and got locked out at exactly 10TB - also using Google Compute :smile:. Insanely fast transfers though.

I think you have to add payment details, then the limit is raised to 10TB. If you need even more, select a quota in a region of your choice and send an upgrade request. Since you pay for every byte stored, this is just a formality. It took them 2 minutes to approve another 10TB for me.
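
If you want to see where you currently stand before sending that request, the per-region quotas (including DISKS_TOTAL_GB for persistent disks) are listed in the region description; the region name below is just an example:

```bash
# Prints the region's quotas, including the persistent disk limit
gcloud compute regions describe us-central1
```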

10TB would be fine though, since my Amazon library is only 5TB… But I can't seem to create a disk bigger than 2TB…

Tnx, that worked like a charm =)

I'm trying to install odrive on my Ubuntu VPS, but I get access denied downloading the odrive 64-bit client… Anyone else have the same problem?

What speed are you getting using this method?