I have two remotes, which are SharePoint sites (Microsoft OneDrive) on different subdomains. One of them holds 300 GB across 100K files; the other is empty. At the end of this process, the 300 GB of files should be copied to the new, empty site.
Important note: the 300 GB lives only in the cloud. I have no local copy of the files to work from.
So I found this little command, and it works: rclone copy "remote101:folderX" "remote202:folderY" -P
But it was slow on first use, so I changed the command as below: rclone copy "remote101:folderX" "remote202:folderY" -P --bwlimit off:off --transfers=50 --checkers=50 --max-backlog=50
Is this the right way to copy a huge amount of files from one cloud destination to another? If it is not, could you help me with a script?
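To make clear what I mean by a script, here is a minimal sketch of the kind of wrapper I have in mind; the remote names, folder names and log file name are just the placeholders from my example above, and the transfer/checker counts are the values I am currently experimenting with, not recommendations:

#!/usr/bin/env bash
# Minimal sketch: copy one cloud remote to another and keep a timestamped log.
# SRC and DST are placeholders for my two SharePoint remotes.
SRC="remote101:folderX"
DST="remote202:folderY"
LOG="rclone-copy-$(date +%Y%m%d-%H%M%S).log"

rclone copy "$SRC" "$DST" \
  -P \
  --transfers=50 \
  --checkers=50 \
  --log-file="$LOG" \
  -v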
Additionally, I ran into an issue: ERROR : Attempt 1/3 failed with 1 errors and: corrupted on transfer: sizes differ 73174739 vs 73174611. When I try to add --ignore-size to work around it, I instead get: Command copy needs 2 arguments maximum: you provided 3 non flag arguments:
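My guess is that the "needs 2 arguments maximum" error comes from quoting or flag placement splitting the source/destination into extra arguments, not from --ignore-size itself; with both paths quoted and the flag passed on its own, the command should look like this (using my folder names from above):

rclone copy "remote101:folderX" "remote202:folderY" -P --ignore-size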
Thank you, --max-backlog helps organize the queue and optimize the transfers. But my command was missing one parameter, so I now run it with --max-backlog=-1 --order-by size,desc
My latest command: rclone copy "remote101:folderX" "remote202:folderY" -P --ignore-checksum --ignore-size --ignore-existing --max-backlog=-1 --order-by size,desc
But SharePoint is really problematic, and it often closes my connection.
So, bringing it all together, the command looks like this: rclone copy "remote101:folderX" "remote202:folderY" --log-file=myLogFile.txt -P -vv --tpslimit=15 --ignore-checksum --ignore-size --ignore-existing --max-backlog=-1 --order-by size,desc
The --tpslimit=15 parameter makes sure the connection does not exceed SharePoint's rate limit (the limit is 20 transactions per second). If you still run into errors, decrease this limit further.
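If SharePoint keeps dropping the connection even with the lower --tpslimit, one possible extension, just a sketch on top of my command above (the retry counts and sleep time are guesses I have not benchmarked, not official recommendations), is to let rclone retry more patiently:

rclone copy "remote101:folderX" "remote202:folderY" --log-file=myLogFile.txt -P -vv --tpslimit=15 --retries=10 --retries-sleep=30s --low-level-retries=20 --ignore-checksum --ignore-size --ignore-existing --max-backlog=-1 --order-by size,desc

Here --retries re-runs the whole copy if it fails, --retries-sleep waits between those attempts, and --low-level-retries retries individual failed operations such as a single request.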