v1.53.0: --bwlimit flag not working when combined with --drive-server-side-across-configs

I am attempting to copy several TBs of data from a personal Google Drive to a Team Drive. The server-side copy works fine, but it hits the 750 GB upload limit within an hour or two, as it transfers at somewhere between 200 MB/s and 400 MB/s. I tried adding --bwlimit 6.5M to restrict the speed, but this is not taking effect.

Can anyone shed any light on why this might be?

rclone v1.53.0

  • os/arch: linux/amd64
  • go version: go1.15

command being used:
rclone sync gdrive-media-mount: gdrive-shared-media: -P -v --bwlimit 6.5M --drive-stop-on-upload-limit --drive-server-side-across-configs --log-file "$LOGFILE"

There is no way to limit server-side copies since it doesn't download/upload those files. It's just a single API call to Google asking it to copy the file from the source to the destination and can't be throttled or rate-limited.

Why do you need to throttle it anyway?

@thestigma says differently in his post here: Google team drive server side copy, which led me to believe that you can throttle the speed.

I want to throttle the speed because I am transferring TBs of data and hit the 750 GB per-day upload limit within an hour or two, at which point the script stops due to the --drive-stop-on-upload-limit flag. I want to just run the command and let it continue at a pace that won't trip the limit until all the data is transferred, instead of having to rerun the command every day.
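For reference, a quick back-of-the-envelope check (assuming rclone's M suffix means MiB/s, since rclone documents its bandwidth units in binary KiB/MiB) that 6.5M would stay under the daily quota if it actually took effect:

```shell
# 6.5 MiB/s sustained for 24 hours, expressed in GB/day (integer arithmetic)
bytes_per_day=$(( 65 * 1048576 * 86400 / 10 ))
echo "$(( bytes_per_day / 1000000000 )) GB/day"   # ~588 GB, under the 750 GB quota
```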

I could remove the --drive-stop-on-upload-limit flag, but then the script would just keep failing once the limit had been tripped and eventually stop, having failed for most of its runtime, and would still need to be run again the next day.

Plus, running the script once each day gives me no sense of how much data has actually been transferred in total, without manually tracking each day's transfer and tallying it up.

Use a GSuite VPS without server-side copy and just let it run, or use your local connection and let it run with a bwlimit.
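A minimal sketch of the second option, reusing the remote names from the original post. The assumption is that dropping --drive-server-side-across-configs forces the data through your own machine, where --bwlimit can actually throttle it:

```shell
# Sketch: without --drive-server-side-across-configs the data is downloaded
# and re-uploaded locally, so --bwlimit takes effect.
rclone sync gdrive-media-mount: gdrive-shared-media: \
  -P -v --bwlimit 6.5M \
  --drive-stop-on-upload-limit \
  --log-file "$LOGFILE"
```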

Since the transfer happens on the provider, you can't throttle it.

GSuite VPS is an option I suppose. Is that VPS free forever or for a trial period?

I have a 7 TB transfer cap on my upload (a seedbox), so there's that to consider. If I transfer at 2.6 MB/s then this would keep my cap from being used up before it resets. But that means I am only transferring approx 225 GB a day, which will take approx 120 days to transfer 27 TB. Not appealing.

I think my easier option, if not a nice one, is to set up a cron job to run after midnight each day and let it upload until it reaches the limit (rinse and repeat). At least I'll have finished transferring the data in approx 37 days.
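A sketch of that cron setup, assuming the rclone command from earlier is wrapped in a script (the path /home/user/gdrive-sync.sh is hypothetical):

```shell
# Hypothetical crontab entry: run shortly after midnight each day.
# --drive-stop-on-upload-limit in the script makes rclone exit once the
# 750 GB daily quota trips, so each run picks up where the last one stopped.
5 0 * * * /home/user/gdrive-sync.sh >> /home/user/gdrive-sync-cron.log 2>&1
```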

I'm lazy at heart, so rather than doing anything that requires extra work, I'd go with your last option: set up cron and repeat the job every day at a set time until it's done.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.