Make a copy of a folder on Google Drive using rclone

Hi,

I have a backup folder named B1 uploaded to Google Drive; it is 2.5 terabytes in size. I would like to make a copy of this folder on Google Drive under another name, B2, using rclone.

Note - I do not want to upload the 2.5 terabytes of data again to folder B2 on Google Drive; I would like to use the folder B1 that is already on Google Drive to make the copy named B2.

Is this possible using rclone? Does rclone provide any command to achieve this?

Sure, you should be able to do a server-side copy:

rclone copy -P drive:B1 drive:B2

I recommend using the latest beta, as we fixed some problems with server-side copying on Drive recently.
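(For anyone wondering how to get the beta: assuming the standard install script documented on rclone.org still accepts the beta argument, it can be installed with:

curl https://rclone.org/install.sh | sudo bash -s beta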

If you want to copy between different drive remotes then you'll need to use the --drive-server-side-across-configs flag, which should work (but may not - that is why it is an option).
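For example, with two hypothetical remotes named drive1 and drive2 (the remote names here are placeholders, not from your config), the cross-remote copy would look something like:

rclone copy -P --drive-server-side-across-configs drive1:B1 drive2:B2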

Thanks for the command. I have executed it and it looks like it is creating a copy of B1 in B2.

My only concern is the ETA. It is shown as 3 days, 12 hours, 25 minutes and 12 seconds.

If it's a server-side copy, then why is the ETA so high?

Is that change before 133? If so, I would like to know the issue number/link if you have it easily available.

Firstly, you don't need to speculate on whether it is server-side or not.
Rclone will tell you at the end of the copy line (server-side copy) if it used that.
I think you need to have -v (verbose output) enabled to see it.
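So to confirm what is actually happening, run the same command as above with verbose output added:

rclone copy -v -P drive:B1 drive:B2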

Secondly, despite having massive effective bandwidth, you still need to iterate through the files to copy them (basically sending a copy command for each file), so the speed is limited by the maximum API call rate (using your own OAuth key helps here), the general latency of commands to the cloud, and the fact that Google Drive in particular appears to have a server-side limit on how fast you can create new files (about 2 per second).
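If you want to use your own OAuth key, you can set it when configuring the remote, or pass it on the command line via the --drive-client-id and --drive-client-secret flags (the values below are obviously placeholders for your own credentials):

rclone copy -P --drive-client-id YOUR_CLIENT_ID --drive-client-secret YOUR_CLIENT_SECRET drive:B1 drive:B2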

So, TL;DR: if there are a LOT of files - tens of thousands or hundreds of thousands - then it's going to take a while and you can't do much about it (increasing --transfers to something like 8 may help a bit though; see the example below).
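That suggestion would look like this - the flag is standard rclone, only the value 8 is the suggestion from above:

rclone copy -P --transfers 8 drive:B1 drive:B2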

Lastly - be aware that server-side copies are also bound by quotas. The normal daily limit for uploads is 750GB/day. The limit for server-side copies appears to be lower than that, but I don't know the exact figure (if anyone else does, I'd like to know). I can usually only transfer 200-300GB or so via server-side copy before the transfer stalls out and stops progressing (easy to see if you use the -P flag). In any case, don't expect to be able to do the whole transfer in one go. Server-side or not, that much data will have to be scheduled over a few days at least.
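One way to schedule it, assuming rclone's --max-transfer flag (which stops the run once a given amount of data has been moved), is to cap each daily run and simply re-run the same command; rclone skips files that already exist in the destination, so each run resumes where the last one stopped:

rclone copy -P --max-transfer 200G drive:B1 drive:B2

The 200G cap is just a guess based on the stall point I described above; adjust it to whatever your account actually tolerates.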
