Google Drive download to local drive best practices

Hi all,

I am very new to rclone, and I want to find the best way to download a large (5TB) Google Drive. I have a few questions.

  1. How do I configure rclone with the best possible settings to download from Google Drive? Where do I make these changes (e.g., chunk size), and which settings should I focus on for best performance?

  2. I've been struggling with this for a few days using Google Drive for Desktop and have just discovered rclone. I am getting "Error 403: The download quota for this file has been exceeded." Does anyone know how I can work around this error, or how to make rclone retry failed items until they succeed?

  3. I ran the command rclone copy Drive: J: . This started a copy job and began downloading files, but I started seeing errors stating that there were duplicate files in the source, and those files never ended up in the destination. This is a false error. How can I ignore duplicates in the source?

Thank you so much for your assistance! I have a huge migration this summer: 1,000 users moving to a new Workspace tenant. I look forward to learning rclone for this purpose.

  • Corey

hello and welcome to the forum,

help us to help you and answer all the questions in the help and support template.

  1. on the command line, using flags. the optimal values depend on the number and size of your files, etc.
  2. that message is from google, not sure what rclone can do about that.
  3. can run rclone dedupe. unlike gdrive, local file systems do not allow duplicate filenames in the same dir.
    what makes you think this is a false error? can you post a debug log?
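for example, something like this — the remote name Drive: is assumed, and running with --dry-run first is just a suggested safe workflow, not required:

```
# preview what dedupe would do, without changing anything
rclone dedupe --dedupe-mode rename --dry-run Drive:

# keep every copy, renaming duplicates so they can land on a local filesystem
rclone dedupe --dedupe-mode rename Drive:

# or keep only the newest copy of each duplicate
rclone dedupe --dedupe-mode newest Drive:
```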

Thank you.

  1. I have a lot of files of different sizes. Is there a general best practice, or should I go "flagless"?
  2. Does anyone here have experience downloading a large Google Drive to local storage, or doing a workspace to workspace migration on two separate Google Drive accounts? I'm getting roadblocked by Google and they're not being helpful.
  3. I ran some dedupe commands and I see that I have some dupes. In any case, I'm going to run dedupe rename. Thank you!

that is what i tend to do.
could tweak --transfers and --checkers
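for a drive-to-local copy, that tuning might look something like this — the values here are placeholders to experiment with, not recommendations:

```
# hypothetical tuning values -- adjust for your file mix and bandwidth
rclone copy Drive: J: \
  --transfers 8 --checkers 16 \
  --fast-list \
  --buffer-size 64M \
  --retries 5 --low-level-retries 20 \
  -P
# --transfers / --checkers: parallelism for copies and for listing/checking
# --fast-list: fewer API calls on big directory trees (uses more memory)
# --buffer-size: per-transfer read-ahead buffer
# --retries / --low-level-retries: re-attempt failed files and API calls
# -P: show live progress
```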

and from what i read, gdrive can only transfer 750GiB per day, so might try using --bwlimit to not hit that hard limit
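as a sanity check on that number: 750 GiB spread over 24 hours works out to roughly 8.9 MiB/s, so a --bwlimit a bit below that should stay under the cap. a quick calculation:

```python
# Spread Google Drive's 750 GiB/day transfer cap over 24 hours
# to find a safe --bwlimit value (rclone's M suffix means MiB/s).
GIB = 1024 ** 3
daily_cap_bytes = 750 * GIB
seconds_per_day = 24 * 60 * 60

rate_mib_s = daily_cap_bytes / seconds_per_day / (1024 ** 2)
print(f"{rate_mib_s:.2f} MiB/s")  # ~8.89 MiB/s, so --bwlimit 8M leaves headroom
```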

i would use service accounts, not oauth
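a service-account remote in rclone.conf looks roughly like this — the path and remote name are placeholders, and the account needs to be granted access to the drive (e.g. domain-wide delegation for a workspace migration):

```
[Drive]
type = drive
scope = drive
service_account_file = /path/to/service-account.json
# for per-user workspace migrations, can impersonate each user:
# impersonate = user@example.com
```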

beyond sharing all that basic info, hopefully someone else will stop by soon and offer more detailed advice.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.