Hitting 750GB gdrive limit as fast as possible but without errors

What is the problem you are having with rclone?

I have a 30MB/sec upload connection, which hits the 750GB gdrive upload limit in about 7 hours. I have over 3TB of data to upload (at this rate it will take 4 days). I could set my upload rate limit to something like 8MB/sec and leave it running for a few days to upload all my data. But is there a way to make it so that I upload as fast as I can (30MB/sec) for the first 7 hours and then idle for the remaining 17 hours, until the 24-hour clock resets and I can do another 750GB in the next 7 hours?

rclone version

rclone v1.53.3

  • os/arch: linux/amd64
  • go version: go1.15.5

Which OS you are using and how many bits (eg Windows 7, 64 bit)

linux x64

Which cloud storage system are you using? (eg Google Drive)

gdrive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone --stats-log-level NOTICE --bwlimit 8M --order-by size,mixed sync "localdir" remote-drive:

There's no single magic way to do that.

I use:

--drive-stop-on-upload-limit

and you can use cron to schedule runs based on your needs.
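For example, a daily cron job could look like the sketch below (the schedule and paths are placeholders, not from the thread; `--drive-stop-on-upload-limit` makes rclone stop cleanly once Google Drive reports the daily quota error instead of retrying forever):

```shell
# Hypothetical crontab entry: start a sync every day at 02:00.
# --drive-stop-on-upload-limit aborts the run when Google Drive
# returns its 750GB/day upload-quota error, so the job just exits
# and the next day's cron run picks up where it left off.
0 2 * * * rclone sync "localdir" remote-drive: \
    --drive-stop-on-upload-limit \
    --stats-log-level NOTICE --order-by size,mixed
```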

You could use a bwlimit timetable which allows the transfer to proceed at 30MB/s for 7 hours then throttles it back to 0.1M/s for the other hours which should keep you under 750G/day.
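A bwlimit timetable is given as space-separated `HH:MM,BANDWIDTH` pairs; the exact times below are just an illustration of the idea:

```shell
# Hypothetical timetable: full speed (30M) from midnight for 7 hours,
# then throttle to 0.1M for the rest of the day, keeping the daily
# total under the 750GB quota.
rclone sync "localdir" remote-drive: \
    --bwlimit "00:00,30M 07:00,0.1M" \
    --stats-log-level NOTICE --order-by size,mixed
```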

If it was me, I'd use the 8M/s limit and fire and forget!

Aha, I didn't know I could do that with the --bwlimit setting. I'll give that a try. Thanks so much for the quick reply (and for making such awesome software!)


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.