Suppose I duplicate a folder in a mounted Google Drive account by copying it to another location within the same remote. Would Rclone have to temporarily cache, then re-upload, all of that data? Or is it smart enough that it could simply instruct Google Drive to copy data it already has, thus saving my local bandwidth?
Rclone can do server-side copies with Google Drive, so the data will be copied at Google's end and doesn't have to be downloaded and re-uploaded.
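As a sketch, when the source and destination are on the same remote, rclone uses the server-side copy automatically (the remote name `gdrive:` and the folder paths here are placeholders, not from the thread):

```shell
# Preview what would be copied, without transferring anything
rclone copy --dry-run "gdrive:Source Folder" "gdrive:Backup/Source Folder"

# Do the copy; -P shows progress. Since both paths are on the same
# remote, rclone issues Drive's own copy API calls instead of
# downloading and re-uploading, so local bandwidth is not used.
rclone copy -P "gdrive:Source Folder" "gdrive:Backup/Source Folder"
```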
Thanks! What kind of rate limiting would I hit if I copied a massive folder (several TB, several thousand files)? Is this just not something I should even attempt? I know the docs refer to rate limiting being a concern but they don't go into specifics.
You get a 750 GB per day upload quota no matter how the data gets there (regular upload, server-side copy, etc.).
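For a multi-TB folder this means splitting the job across days. One hedged approach (paths are placeholders): cap each run near the daily quota and rerun it on following days, since `rclone copy` skips files that already exist at the destination. `--tpslimit` also eases the per-API-call rate limits the docs warn about:

```shell
# Stop cleanly after roughly 750 GB has been copied this run,
# and bail out early if Drive reports the upload limit is hit.
# Rerun the same command each day; already-copied files are skipped.
rclone copy -P \
  --max-transfer 750G \
  --drive-stop-on-upload-limit \
  --tpslimit 10 \
  "gdrive:Huge Folder" "gdrive:Backup/Huge Folder"
```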