Maximizing copy rate?

What is the problem you are having with rclone?

I am trying to move many terabytes of data from Dropbox to an enterprise shared Google Drive. I would like to avoid upload and upload-rate limits and otherwise set up my remotes and copy command for the highest transfer rate possible. The hardware I am using has 32 GB of RAM and is hard-wired into gigabit Ethernet. The processor runs at 3 GHz, if that is a limiting factor.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.2

  • os/version: darwin 12.7.2 (64 bit)
  • os/kernel: 21.6.0 (arm64)
  • os/type: darwin
  • os/arch: arm64 (ARMv8 compatible)
  • go/version: go1.21.6
  • go/linking: dynamic
  • go/tags: none

This is the newest version supported by the hardware I am using (a Mac Pro from 2013). If the version number is a limiting factor I could use other hardware, but this machine seems to do what I need so far.

Which cloud storage system are you using?

I am using Google Drive and Dropbox; the data is moving from a shared Dropbox to a shared Google Drive.

The command you were trying to run

rclone copy --transfers 32 --checkers 32 --verbose BoxTest:/ DriveThread1:

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[BoxTest]
type = dropbox
token = XXX

[DriveThread1]
type = drive
scope = drive
stop_on_upload_limit = true
metadata_owner = off
token = XXX
team_drive = XXX
root_folder_id = 
upload_cutoff = 3Gi
chunk_size = 512Mi

[DriveThread2]
type = drive
scope = drive
stop_on_upload_limit = true
metadata_owner = off
token = XXX
team_drive = XXX
root_folder_id = 
upload_cutoff = 3Gi
chunk_size = 512Mi

I am planning on using more remote "threads" simultaneously, if what I want is possible, but they will all be identical configs connected to separate accounts.

A log from the command that you were trying to run with the -vv flag

ERROR : Cancelling sync due to fatal error: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded

This log does not necessarily represent what I would like help with; the config and command will run without fatal errors fairly consistently until the user upload limit is reached, but I figured I should add it since the prompt asks.
I also get a "no buffer space available" error on some of the larger files, but for the most part the larger ones just get re-run at another time and their progress continues from there, if I understand correctly.

If someone reading this could help me understand how to get help from other people on the forum more effectively, I would really appreciate it! I understand that when a question isn't formatted or asked in a concise way people skip it, but if you read this post and don't want to answer the question above, could you help me understand how to fix it?

Thanks in advance to anyone who's willing to help me with this!

This is probably far too optimistic given the serious throttling both Google and Dropbox apply. I would stick to default values.

Google <-> Dropbox migration has been discussed many times on this forum. Just do a bit of searching to find out what people were doing.

Two key points always were:

You have to create and use your own Google and Dropbox app_ids. Then recreate your remotes using your customised parameters (there is a rough config sketch after these two points).

Add --tpslimit 12 --tpslimit-burst 0 to your copy command. These are the values which worked for most people to keep Dropbox throttling at bay.
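
As a very rough sketch, after creating your own apps the remotes might look something like this (the client_id/client_secret values below are just placeholders, and you will need to re-authorise the tokens, e.g. with rclone config reconnect):

[BoxTest]
type = dropbox
client_id = YOUR_DROPBOX_APP_KEY
client_secret = YOUR_DROPBOX_APP_SECRET
token = XXX

[DriveThread1]
type = drive
client_id = YOUR_GOOGLE_CLIENT_ID.apps.googleusercontent.com
client_secret = YOUR_GOOGLE_CLIENT_SECRET
scope = drive
stop_on_upload_limit = true
team_drive = XXX
token = XXX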

Forget --transfers 32 --checkers 32 and use the defaults. You might even have to go lower than that - YMMV.
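
Putting the two points together, something along these lines is a sensible starting point (the defaults are --transfers 4 and --checkers 8, so you simply leave those flags out):

rclone copy BoxTest:/ DriveThread1: --tpslimit 12 --tpslimit-burst 0 --verbose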

Overall, neither your hardware nor your Internet connection will be the limiting factor; the throttling from both providers will be. You can also only transfer 750 GB per 24h to Google, so whatever you do, that is your daily transfer ceiling.
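
If you want a run to stop cleanly at that quota rather than erroring out, you can also cap it yourself, e.g. (just a sketch, on top of the stop_on_upload_limit you already have in the config):

rclone copy BoxTest:/ DriveThread1: --tpslimit 12 --tpslimit-burst 0 --max-transfer 750G --verbose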
