Google Compute Engine transfer with the 750GB/day limit?

How is everyone backing up between different gdrive accounts? I currently sit on ~9TB and was trying to back up via GCE, but setting `--bwlimit=8M` will take forever and will most likely use up the $300 credit before the backup finishes.

A 4-core, 50GB instance is $0.134 per hour. It will take 12-13 days to transfer 9TB, so it should only cost about $40 of credit.
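A quick back-of-envelope check of those numbers (assuming rclone's `8M` means 8 MiB/s and the quoted $0.134/hour instance price):

```python
# Sanity check of the time/cost estimate above.
# Assumptions: "8M" bwlimit = 8 MiB/s, instance price = $0.134/hour.
TB = 10**12                      # 1 TB in bytes (decimal)
MIB = 1024**2                    # 1 MiB in bytes

data_bytes = 9 * TB              # ~9 TB to transfer
rate = 8 * MIB                   # --bwlimit 8M -> 8 MiB/s

seconds = data_bytes / rate
days = seconds / 86400
cost = (seconds / 3600) * 0.134  # $0.134 per instance-hour

print(f"{days:.1f} days, ${cost:.2f}")  # roughly 12.4 days, about $40
```

The transfer is bandwidth-bound, not CPU-bound, so the instance cost scales with the bandwidth cap rather than the machine size.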

The cheapest config (single core, 10GB disk) is already enough for an 8M limit.

I think I’m missing something here. How can you transfer 9TB with a 50GB instance? I thought the instance’s disk would need to be as large as the data being transferred?

I used this guide as a reference: https://www.reddit.com/r/PlexACD/comments/6e4cwl/tutorial_how_to_transfer_your_data_from_amazon/

My instance is the n1-standard-2 (2 vCPUs, 7.5 GB memory). I’m doing a gdrive-to-gdrive copy: `rclone copy gdrive: backup: -v -c --bwlimit 8M --transfers=15 --stats=10s`

The files never land fully on the local disk. Each chunk is downloaded and immediately re-uploaded, so the copy needs very little local space.
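For reference, here is that same command with each flag annotated (the remote names `gdrive:` and `backup:` come from the post above; yours will differ depending on your rclone config):

```shell
# gdrive-to-gdrive copy, streamed through the VM; data never persists on disk.
# At this cap, 8 MiB/s * 86400 s/day ≈ 725 GB/day, which stays under the
# 750 GB/day Google Drive upload quota mentioned in the title.
#   -v             verbose logging
#   -c             compare files by checksum rather than size/modtime
#   --bwlimit 8M   cap total bandwidth at 8 MiB/s
#   --transfers    number of files copied in parallel
#   --stats 10s    print progress every 10 seconds
rclone copy gdrive: backup: -v -c --bwlimit 8M --transfers=15 --stats=10s
```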

Bleh, I should’ve checked disk usage first. Thanks anyway, starting a new 1-vCPU instance :confused: