Fast Migration Config


I have two remotes, which are SharePoint sites (Microsoft OneDrive) on different subdomains. One of them holds 300 GB across 100K files; the other is empty. At the end of this process, the 300 GB of files will have been copied to the new, empty site.

The important detail is that the 300 GB is stored in the cloud; I have no local copy of the files to work from.

So, I found this little command, and it works:
rclone copy "remote101:folderX" "remote202:folderY" -P

But it was slow on first use, so I changed the command as below:
rclone copy "remote101:folderX" "remote202:folderY" -P --bwlimit off:off --transfers=50 --checkers=50 --max-backlog=50

Is this the right way to copy huge files from one cloud destination to another? If it is not, can you help me with a script?

Additionally, I encountered an issue, and I can't add --ignore-size:
ERROR : Attempt 1/3 failed with 1 errors and: corrupted on transfer: sizes differ 73174739 vs 73174611
Command copy needs 2 arguments maximum: you provided 3 non flag arguments:
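(In case it helps other readers: that "needs 2 arguments maximum: you provided 3 non flag arguments" message usually means the shell split a path containing spaces into extra arguments. A minimal sketch, using a hypothetical folder name with a space in it:)

```shell
# WRONG: the unquoted space makes the shell pass three non-flag arguments
#   rclone copy remote101:My Folder remote202:folderY --ignore-size

# RIGHT: quote any path that contains spaces, so rclone sees exactly
# two non-flag arguments (source and destination)
rclone copy "remote101:My Folder" "remote202:folderY" --ignore-size
```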

Thank you.

Hello and welcome to the forum,

--- OneDrive does a lot of rate limiting,
so it is best to start with the default settings, as in that little command,
and then tweak/add/remove flags.

--- this did not help

That is a common SharePoint issue, as noted in the rclone docs.
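For reference, the rclone OneDrive documentation describes a workaround for this SharePoint quirk: SharePoint can modify uploaded files, so the reported sizes and hashes no longer match the source. A minimal sketch of the workaround:

```shell
# SharePoint may rewrite files on upload, so size/hash comparisons can fail
# with "corrupted on transfer: sizes differ". Skipping both checks lets
# rclone fall back to modification time for deciding what to copy.
rclone copy "remote101:folderX" "remote202:folderY" -P \
    --ignore-checksum --ignore-size
```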

--- if you need additional help: when you posted, there was a template of questions to answer.


Oh, thank you for the information, but if there were no limits on SharePoint, what would the script look like?

And after I changed my command as below, the --ignore-size error disappeared.

rclone copy "remote101:folderX" "remote202:folderY" -P --ignore-checksum --ignore-size --ignore-existing --bwlimit off:off --transfers=20 --max-backlog=20

---- 100K files with a total size of 300 GB is a very small amount of data.

--- the commands look OK; you need to do some basic testing.
--bwlimit off:off - not needed; that is the default.
--max-backlog - why use that?


Thank you. --max-backlog helps organize the queue to optimize our transfers. But my script/command was missing one parameter, so I now use --max-backlog=-1 --order-by size,desc.

My latest script:
rclone copy "remote101:folderX" "remote202:folderY" -P --ignore-checksum --ignore-size --ignore-existing --max-backlog=-1 --order-by size,desc

But SharePoint is really problematic, and it often closes my connection :frowning:

OK, but this is the third request for that information.


Thanks to @asdffdsa for the great help.

Here are the key points of the solution that you should know:

  1. First things first: if you use SharePoint, there is no fast copying solution, because of server-side throttling; see Avoid getting throttled or blocked in SharePoint Online | Microsoft Docs.
  2. There is no reliable way to verify that files were copied safely by size or checksum, because Microsoft OneDrive (SharePoint) can modify files on upload, which is why the sizes differ in the error above.

So, putting them together, the script looks like this:
rclone copy "remote101:folderX" "remote202:folderY" --log-file=myLogFile.txt -P -vv --tpslimit=15 --ignore-checksum --ignore-size --ignore-existing --max-backlog=-1 --order-by size,desc

The --tpslimit=15 parameter ensures our connection does not exceed SharePoint's limit of 20 transactions per second. If you still encounter errors, you should decrease this limit.
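If SharePoint still drops the connection occasionally, one option is to let rclone retry more aggressively and pace requests strictly. These are standard rclone flags, but the exact values below are guesses to tune for your tenant, not a tested recipe:

```shell
# --tpslimit-burst 0 disables request bursting, so calls are strictly paced;
# --retries re-runs the whole copy on failure, and --low-level-retries
# re-attempts individual HTTP operations that the server cuts off.
rclone copy "remote101:folderX" "remote202:folderY" -P \
    --tpslimit=15 --tpslimit-burst 0 \
    --retries 5 --low-level-retries 20 \
    --ignore-checksum --ignore-size --ignore-existing
```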

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.