Slow upload with gdrive (max 6 Mbytes/s)

Hello

#### What is the problem you are having with rclone?
I have 1 Gbps broadband (both upstream and downstream), but when I upload to Google Drive, I'm stuck at 6 MBytes/s max.
I'm using a service account file, so I believe my client ID is set.
I'm using rclone from the official Docker repo (rclone/rclone).

Transferred: 196.555M / 10.355 TBytes, 0%, 5.691 MBytes/s, ETA 3w1d2h1m53s
Errors: 0
Checks: 110 / 110, 100%
Transferred: 2 / 2360, 0%
Elapsed time: 34.5s
Transferring:

  • Appleseed Ex Machina (…2007) Bluray-1080p.mkv: 0% /6.537G, 1.745M/s, 1h3m20s
  • Assassin's Creed (2016…sin's Creed (2016).mp4: 2% /1.768G, 1.183M/s, 24m56s
  • Asterix & Obelix Missi…(2002) Bluray-480p.mkv: 2% /1.366G, 1.607M/s, 14m5s
  • Asterix Conquers Ameri…1994) Bluray-1080p.mkv: 1% /2.506G, 1.516M/s, 27m38
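As a sanity check on the stats above (assuming rclone's units here are binary, i.e. TBytes means TiB and MBytes/s means MiB/s), the reported ETA is consistent with the reported rate:

```shell
# 10.355 TiB remaining at 5.691 MiB/s, converted to days
awk 'BEGIN {
  secs = 10.355 * 1024^4 / (5.691 * 1024^2)   # bytes / (bytes per second)
  printf "%.1f days\n", secs / 86400          # ~3 weeks 1 day, matching the ETA
}'
```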

#### What is your rclone version (output from rclone version)
rclone v1.50.2

  • os/arch: linux/amd64
  • go version: go1.13.4

#### Which OS you are using and how many bits (eg Windows 7, 64 bit)
Ubuntu 18.04.3 LTS
Linux NUC2 4.15.0-64-generic #73-Ubuntu SMP Thu Sep 12 13:16:13 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux

#### Which cloud storage system are you using? (eg Google Drive)
gdrive

#### The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone sync '/NAS/Media/_Movies/' gcrypt:/1 -v --drive-impersonate XXX@XXX.com -P --log-file /logs/sync_movies.log --fast-list --drive-chunk-size 32M

#### A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)
2020/01/03 12:51:15 DEBUG : 9t7qai28gtuurqi9tobjdggv9udgpbklracmcsqeq8m4g9sojmc0/2me1bpfhgsst86qo1k1gg67qkv6p9t6brjgp7gvvu87f30e92780: Sending chunk 536870912 length 33554432

2020/01/03 12:51:21 DEBUG : 1ueedd8tk7o0eu75hhtbskjabop14soovc3r6fi23bdd98ok0qqevaeaetpi4qnnmk1gj7b8eruq2/ulv7cqecl5c5gcticl03vvd7rac2dob5vm8cqnkb4j0morehi5h9vjikacuif305qvlu5148g16vp7tfuj0uvm277mjbvkiuarubd78: Sending chunk 536870912 length 33554432

2020/01/03 12:51:22 DEBUG : Cache remote gcache:crypt/ed1pst008fhj63c1pijln8l510: starting cleanup

2020/01/03 12:51:22 DEBUG : Google drive root 'backup/crypt/ed1pst008fhj63c1pijln8l510': Checking for changes on remote

2020/01/03 12:51:23 DEBUG : Cache remote gcache:crypt/ed1pst008fhj63c1pijln8l510: starting cleanup

2020/01/03 12:51:23 DEBUG : Google drive root 'backup/crypt/ed1pst008fhj63c1pijln8l510': Checking for changes on remote

2020/01/03 12:51:27 DEBUG : idjgs1c4k66so4514g3e0om4ec2o8s405b40bmjju174tria0kcg/imod12hr1jukefusdff9l7a48k525o6lmiobei2j4175ioobg7b9krat3rtuqfu3mtej0ct4p0f4c: Sending chunk 570425344 length 33554432

2020/01/03 12:51:29 DEBUG : vr7lu3rg899ocujh93l0fk2ardqpj3kgqlqg5rp7hsjs7i5n86k0/in46kv1nuhh39fjcj97c7i2p3ndejo8850jh7kv396cbbnr34915de6vlver582f2oldsv8l74n7k: Sending chunk 570425344 length 33554432

2020/01/03 12:51:32 DEBUG : 9t7qai28gtuurqi9tobjdggv9udgpbklracmcsqeq8m4g9sojmc0/2me1bpfhgsst86qo1k1gg67qkv6p9t6brjgp7gvvu87f30e92780: Sending chunk 570425344 length 33554432

2020/01/03 12:51:44 DEBUG : 1ueedd8tk7o0eu75hhtbskjabop14soovc3r6fi23bdd98ok0qqevaeaetpi4qnnmk1gj7b8eruq2/ulv7cqecl5c5gcticl03vvd7rac2dob5vm8cqnkb4j0morehi5h9vjikacuif305qvlu5148g16vp7tfuj0uvm277mjbvkiuarubd78: Sending chunk 570425344 length 33554432

Thank you

You can try increasing --drive-chunk-size; that may improve performance a bit.

You might get more performance by increasing --transfers.
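For reference, a retuned version of the original command might look like this (remote names and paths are the ones from the thread; the flag values are illustrative). Note that rclone buffers one --drive-chunk-size block in memory per transfer, so 128M chunks with 8 transfers can use around 1 GB of RAM:

```shell
# Sketch only: same sync as before, with larger chunks and more
# parallel transfers (tune the numbers to your RAM and link).
rclone sync '/NAS/Media/_Movies/' gcrypt:/1 -v -P \
  --drive-impersonate XXX@XXX.com \
  --log-file /logs/sync_movies.log \
  --fast-list \
  --drive-chunk-size 128M \
  --transfers 8
```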

Is the NUC running out of CPU when rclone is running? How about RAM?

What is your network connectivity to www.googleapis.com like (ping time etc)?

ping is excellent:

PING www.googleapis.com (216.58.200.10) 56(84) bytes of data.
64 bytes from hkg12s11-in-f10.1e100.net (216.58.200.10): icmp_seq=1 ttl=55 time=3.49 ms
64 bytes from hkg12s11-in-f10.1e100.net (216.58.200.10): icmp_seq=2 ttl=55 time=3.29 ms
64 bytes from hkg12s11-in-f10.1e100.net (216.58.200.10): icmp_seq=3 ttl=55 time=3.55 ms
64 bytes from hkg12s11-in-f10.1e100.net (216.58.200.10): icmp_seq=4 ttl=55 time=3.32 ms

When uploading to Google Drive via the web UI, the speed is very fast.

I increased the chunk size to 128M and set transfers to 10, but I'm still capped below 6 MBytes/s.
While doing this, htop shows 26% CPU usage and 3.6 GB / 7.38 GB RAM.

At 6 MBytes/s, I'm not far from the daily limit of around 8.5 MBytes/s, but I would like to reach it to minimise the sync time for my 15 TB of data...
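For context, that daily limit is Google Drive's server-side upload quota of roughly 750 GB per account per day; spread over 24 hours it works out to about the sustained rate mentioned above:

```shell
# Divide the ~750 GB/day quota by the seconds in a day to get the
# effective sustained rate, then compare how long 15 TB takes at that
# rate versus at the observed 6 MBytes/s.
awk 'BEGIN {
  quota = 750e9                                        # bytes per day
  printf "quota rate: %.2f MBytes/s\n", quota / 86400 / 1e6
  printf "15 TB at quota rate: %.1f days\n", 15e12 / quota
  printf "15 TB at 6 MBytes/s: %.1f days\n", 15e12 / (6e6 * 86400)
}'
```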

Thank you

Well, it seems my ISP was throttling my line. I contacted them and they "reset" my broadband router, and now I can upload at 50 Mbytes/s. Much better!

Ha!

I wonder what they did: whether it was just a reset of the router, some QoS reset, or...

Annoyingly, the web UI uses a different private API with different endpoints from the one rclone uses.

They told me they "remotely reset" my fiber router, which they did.
But I believe they simply unthrottled my broadband. I guess they limit the 1000M broadband for all customers, and only make it run at full speed if a customer raises a complaint...


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.