Help on transferring files from Google Drive to Dropbox

What is the problem you are having with rclone?

I am trying to transfer ~3 TB of files from Google Drive to a Dropbox team account.

Run the command 'rclone version' and share the full output of the command.

rclone version
rclone v1.65.2
- os/version: darwin 14.3.1 (64 bit)
- os/kernel: 23.3.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.21.6
- go/linking: dynamic
- go/tags: none

Which cloud storage system are you using?

Google Drive to Dropbox (team account)

I use a client ID and secret for both accounts.

The command you were trying to run

I tried various commands and variants. Here is the latest I tried:

rclone -P --tpslimit=10 --order-by=size,mixed,75 --transfers=16 --checkers=128 copy gdrive:dir_src dropbox:dir_target

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

rclone config redacted
[gdrive]
type = drive
client_id = XXX
client_secret = XXX
scope = drive
token = XXX
team_drive = 

[dropbox]
type = dropbox
client_id = XXX
client_secret = XXX
token = XXX

A log from the command that you were trying to run with the -vv flag

Here is the current output with -P

Transferred:   	    2.851 GiB / 3.063 GiB, 93%, 414.419 KiB/s, ETA 8m54s
Checks:             64834 / 64834, 100%
Transferred:         5075 / 15219, 33%
Elapsed time:    1h9m20.9s
Transferring:
 * xxx:100% /1.447Ki, 296/s, 0s
 * xxx:100% /1.448Ki, 297/s, 0s
 * xxx:100% /1.452Ki, 297/s, 0s
 * xxx:100% /3.827Mi, 980.785Ki/s, 0s
 * xxx:100% /142.461Ki, 28.511Ki/s, 0s
 * xxx:100% /1.454Ki, 298/s, 0s
 * xxx:100% /1.454Ki, 297/s, 0s
 * xxx:100% /1.456Ki, 298/s, 0s
 * ...

I recently restarted rclone, so the numbers are still changing. This command copies only a small directory on Google Drive as a test, but even that directory of about 170 GiB cannot complete in a reasonable amount of time (e.g., 12 hours).

A bandwidth speed test on the computer running rclone shows around 1 Gbps for both uploads and downloads.

rclone reports a transfer speed of around 400-500 KiB/s, which seems very low. I would need roughly 10 MB/s to transfer 1 TB in a reasonable time, so the current speed is far too slow for that amount of data.

Any advice on how to do this? Am I limited by the Google Drive/Dropbox performance?

Thanks for any tips/advice/experience.

hi,

always use a debug log. often the answer is there and others in the forum can look at it.
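
for example, something like this (reusing the source/dest from your command; the log file name is just a placeholder) would write a debug log you can attach:

rclone copy gdrive:dir_src dropbox:dir_target -vv --log-file=rclone-debug.log -P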

sometimes, it is not about the total size, but about total number of files.

maybe this is the issue?
Drive has quite a lot of rate limiting. This causes rclone to be limited to transferring about 2 files per second only. Individual files may be transferred much faster at 100s of MiB/s but lots of small files can take a long time.
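
as a rough back-of-envelope using the numbers from your -P output: 15219 files at ~2 files per second is about 7600 seconds, i.e. over 2 hours of per-file overhead before bandwidth even matters.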

if that is correct, then most/all of those flags would not make a difference.
how did you come up with that command and so many flags, instead of defaults?

I would use:

rclone copy gdrive:dir_src dropbox:dir_target --tpslimit 12 --tpslimit-burst 0 -P

too many checkers and transfers won't work here as both gdrive and dropbox heavily throttle everything. Less is better in this situation.

Thank you for the advice.

I will try the options kapitainsky is suggesting.

But it seems that I am indeed limited by the number of file transfers per second rather than bandwidth (as suggested by asdffdsa). I retried on a smaller folder:

Transferred:        2.039 GiB / 2.039 GiB, 100%, 21.425 KiB/s, ETA 0s
Checks:               685 / 685, 100%
Transferred:         2437 / 2437, 100%
Elapsed time:      41m6.0s

41m6s is about 2,466 seconds for 2,437 transferred files, so that's only about 1 file transferred per second. Most of the files are small.

This is what I got looking at the size of the source folder:

rclone size "gdrive:src_dir"  
Total objects: 3.122k (3122)
Total size: 2.603 GiB (2794880872 Byte)

Note that some of the files (the 685 checks) had already been transferred in an earlier run.

if this is a one-time copy job, you might:

  1. download all the gdrive files as a single zip file to local
  2. uncompress it
  3. upload the files to dropbox (a rough sketch below)
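
a rough sketch of steps 2 and 3, assuming the zip ended up locally as archive.zip and gets extracted into ./extracted (all names here are placeholders, adjust to yours):

unzip archive.zip -d extracted
rclone copy ./extracted dropbox:dir_target -P

the idea being that the drive side becomes one big download instead of thousands of tiny per-file requests.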

lol, try --magic

If you have lots of small files, I'd suggest only adjusting transfers, a little at a time, to help out.

The default is 4, so try 8, 16, 32, 64 and find the sweet spot.

Google limits file creation to about 2 per second, but you are going the other way. I think Dropbox is pretty good in that regard.

You want --tpslimit 12 though; just add --transfers 8 and bump it up until you find that sweet spot, as in the example below.
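
for example, something like this (reusing the remote names from your earlier command) as a starting point, then raise --transfers and re-test:

rclone copy gdrive:dir_src dropbox:dir_target --tpslimit 12 --transfers 8 -P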

Hi,

Just to wrap up this thread: it turns out I am limited by the number of file transfers per second on both ends, between 1 and 2 per second, so transfers are very slow and I could not find a way around that. There is the option of downloading a zip file from Google and uploading it to Dropbox. This works, but the obvious caveat is that you would need to unzip the file at some point to access the content.

Thank you for your help!
