Max transfers daily for server-side-copy

What is the problem you are having with rclone?

Hello everybody, and good night!

I'm trying to automate copying a shared folder in Google Drive.
My account and the shared folder are in the same G Suite domain.
The folders I want to copy are moved from the "Shared with me" menu to the GD root with Shift+Z (the same as the old "Add to My Drive") <---- this is not the question

... these are the flags and remotes I want to fix:

sudo rclone copy "remote1:(folder)/folder" "remote2:order/folder/(folder)" --progress --stats 15s --transfers 1 --tpslimit 8 --max-transfer 500G --retries 1 --low-level-retries 1 --ignore-case-sync --ignore-existing --drive-server-side-across-configs --log-level INFO --log-file=/home/user/rclonelogs/logsrclonecopy.txt

This is a server-side copy, I assume, and the process is faster: this morning, in about 4 hours, I copied nearly 750 GB (which is the daily limit).
The problem is, I only want to copy 500 GB of the 750 GB daily total ... and the --max-transfer 500G flag seemed not to work (in fact it doesn't).

In short: I only want to copy 500 GB of the daily total.
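One workaround that might be worth testing (a sketch, not verified on 1.51): rclone's --disable flag can turn off the backend's server-side Copy feature, forcing rclone to download and re-upload each file, which is a path that --max-transfer does account for. It will be much slower and use local bandwidth. The remote and folder names below are placeholders, not the real ones from the command above.

```shell
# Force a non-server-side copy so --max-transfer is honoured.
# --disable copy is a real rclone flag; "remote1:"/"remote2:" are placeholders.
rclone copy "remote1:folder" "remote2:folder" \
  --disable copy \
  --max-transfer 500G \
  --transfers 1 --tpslimit 8 \
  --progress --log-level INFO
```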

Thanks folks !

rclone 1.51.0
linux ubuntu 18.04 LTS x64
GDrive

What is your rclone version (output from rclone version)

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Which cloud storage system are you using? (eg Google Drive)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

sudo rclone copy "remote1:(folder)/folder" "remote2:order/folder/(folder)" --progress --stats 15s --transfers 1 --tpslimit 8 --max-transfer 500G --retries 1 --low-level-retries 1 --ignore-case-sync --ignore-existing --drive-server-side-across-configs --log-level INFO --log-file=/home/user/rclonelogs/logsrclonecopy.txt

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

Paste log here

I think you are right - server-side copies aren't affected by --max-transfer.

I think this would be reasonably easy to fix though - can you please make a new issue on GitHub - thanks.


Thanks for answering.
... I'll do it ASAP.
Great!
