Slow download from encrypted GDrive

What is the problem you are having with rclone?

Downloading from GDrive is slower than expected.

What is your rclone version (output from rclone version)

rclone v1.52.1

  • os/arch: linux/amd64
  • go version: go1.14.4

Which OS you are using and how many bits (eg Windows 7, 64 bit)

CentOS 7

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone -P sync gdsecret: /srv/daten/Pool_2/ --transfers 2 --user-agent "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36" --drive-chunk-size 128M

The rclone config contents with secrets removed.

[gdrive]
type = drive
scope = drive
token = {"access_token":"secret"}
client_id = secret.apps.googleusercontent.com
client_secret = secret
root_folder_id = secret

[gdrivesecret]
type = crypt
remote = gdrive:NAS
filename_encryption = standard
directory_name_encryption = true
password = secret
password2 = secret

Hello everyone,

now for the first time I need to download everything from my GDrive. The problem is that it's not surpassing 70 Mbit/s and I don't know why. Before I updated from 1.47 to the latest version I had 100 Mbit/s. So now I'm wondering why the download speed is that low :frowning:

Would it be possible to help me?

Thanks in advance!

Cheers,
Gamie

A debug log would help, and --transfers 2 is a little low. I'd use the defaults personally.

@calisro Could you then please quickly tell me how to get the log?

And regarding transfers: I increased it to 4, 5, and 10 and nothing changed. The overall transfer speed was 70 Mbit/s.

Add -vv --log-file output.log

The output from rclone is typically in bytes, not bits. Just making sure you know that.

What is the bandwidth on your network? What speed are you expecting? Do you have lots of small files or large files? A mix?

Alright,

Yes, I'm aware and recalculated it :slight_smile: But here is an example line:

Transferred:      284.596M / 1.668 TBytes, 0%, 7.338 MBytes/s, ETA 2d18h11m12s
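(For clarity, the bytes-to-bits conversion for that progress line can be checked directly; only the 7.338 MBytes/s figure from the line above is used.)

```shell
# rclone reports MBytes/s; multiply by 8 for Mbit/s.
awk 'BEGIN { printf "%.1f Mbit/s\n", 7.338 * 8 }'
# prints: 58.7 Mbit/s
```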

Here you can see the log; maybe there is something you can spot:

2020/06/18 15:53:25 DEBUG : rclone: Version "v1.52.1" starting with parameters ["rclone" "-P" "sync" "gdrivesecret:" "/srv/daten/Pool_2/" "--transfers" "2" "-vv" "--log-file" "log.log" "--user-agent" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36" "--drive-chunk-size" "128M"]
2020/06/18 15:53:25 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/06/18 15:53:26 DEBUG : Beat_Video_Gamie_99bpm.mp3: Size and modification time the same (differ by 0s, within tolerance 1ms)
2020/06/18 15:53:26 DEBUG : Beat_Video_Gamie_99bpm.mp3: Unchanged skipping
[..]
2020/06/18 15:53:33 INFO  : Anime/info.txt: Copied (new)
2020/06/18 15:53:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:33 DEBUG : pacer: Rate limited, increasing sleep to 1.618958058s
2020/06/18 15:53:33 DEBUG : pacer: Reducing sleep to 0s
2020/06/18 15:53:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:33 DEBUG : pacer: Rate limited, increasing sleep to 1.235379897s
2020/06/18 15:53:33 DEBUG : pacer: Reducing sleep to 0s
2020/06/18 15:53:34 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:34 DEBUG : pacer: Rate limited, increasing sleep to 1.87934276s
2020/06/18 15:53:34 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:34 DEBUG : pacer: Rate limited, increasing sleep to 2.557996988s
2020/06/18 15:53:34 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:34 DEBUG : pacer: Rate limited, increasing sleep to 4.717463646s
2020/06/18 15:53:34 DEBUG : pacer: Reducing sleep to 0s
[..]
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-02202008-patch_1_2.bin: Size and modification time the same (differ by 0s, within tolerance 1ms)
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-04292009-patch_1_3.bin: Size and modification time the same (differ by 0s, within tolerance 1ms)
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-02202008-patch_1_2.bin: Unchanged skipping
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-04292009-patch_1_3.bin: Unchanged skipping
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-12172007.bin: Size and modification time the same (differ by 0s, within tolerance 1ms)
2020/06/18 15:53:37 DEBUG : VHDD/Server Files/ut3/UT3-linux-server-12172007.bin: Unchanged skipping
2020/06/18 15:53:40 DEBUG : VHDD/Server Files/CS1/linux64/steamclient.so: Size and modification time the same (differ by 0s, within tolerance 1ms)
2020/06/18 15:53:40 DEBUG : VHDD/Server Files/CS1/linux64/steamclient.so: Unchanged skipping
2020/06/18 15:53:41 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:41 DEBUG : pacer: Rate limited, increasing sleep to 1.96161409s
2020/06/18 15:53:41 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=605156308441, userRateLimitExceeded)
2020/06/18 15:53:41 DEBUG : pacer: Rate limited, increasing sleep to 2.826705787s
2020/06/18 15:53:42 DEBUG : pacer: Reducing sleep to 0s

What limit could be exceeded?
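(For context: the 403 userRateLimitExceeded retries above come from the Drive API's per-user request-rate quota, not from download bandwidth. If they were the bottleneck, rclone's pacing can be tuned; a hedged sketch using standard rclone flags, with illustrative values that are not tested against this setup:)

```shell
# Illustrative only: slow down API calls to reduce 403
# userRateLimitExceeded backoffs (flag values are guesses).
rclone sync gdrivesecret: /srv/daten/Pool_2/ \
  --tpslimit 8 \
  --drive-pacer-min-sleep 200ms \
  --drive-pacer-burst 100
```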

Regarding the network: I have a gigabit downlink from Vodafone, and other Google applications, like Drive on the web, reach that speed just fine.

Cheers,
Gamie

It's normal to have some of those messages, but are you sure you're using your own client ID?

https://rclone.org/drive/#making-your-own-client-id

I see it in your config, but that doesn't necessarily mean it was authorized after it was added. You should check your console and make sure you're seeing traffic from it.

https://console.developers.google.com/apis/dashboard

Also this:

What is the bandwidth on your network? What speed are you expecting? Do you have lots of small files or large files? A mix?

@calisro

Sorry, I forgot to answer all the questions.

My network is full gigabit, as is my ISP connection; I have a gigabit line from Vodafone and it's working fine. For example, via the web I reach 900 Mbit/s without any problems :slight_smile:

It's all big files, between 500 MB and 50 GB. And yes, my API client is being used:

Cheers,
Gamie

Can you pick one very large file and run this? Post both the output and the full log. You can use 3 backticks before and after to make it show better.

root@s163042:~# rclone copy robgs-cryptp:prexxxx.tar.gz . -P --drive-chunk-size 128M --log-file out -vv
Transferred: 1.598G / 2.797 GBytes, 57%, 95.376 MBytes/s, ETA 12s
Transferred: 0 / 1, 0%
Elapsed time: 17.1s
Transferring:

  • xxxx…-xx-xxx:xxxx.tar.gz: 57% /2.797G, 95.277M/s, 12s^C

Haha, just making sure. I've seen this mistake many times. :slight_smile:

Heyho,

It's finished :slight_smile:

Transferred:       25.377G / 25.377 GBytes, 100%, 10.359 MBytes/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:     41m48.5s

2020/06/18 16:34:50 DEBUG : rclone: Version "v1.52.1" starting with parameters ["rclone" "copy" "gdrivesecret:secret/secret.tar.gz" "/srv/daten/Pool_2/" "-P" "--drive-chunk-size" "128M" "--log-file" "out" "-vv"]
2020/06/18 16:34:50 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/06/18 16:34:53 DEBUG : secret.tar.bz: Need to transfer - File not found at Destination
2020/06/18 16:34:54 DEBUG : secret.tar.bz: Starting multi-thread copy with 4 parts of size 6.344G
2020/06/18 16:34:54 DEBUG : secret.tar.bz: multi-thread copy: stream 4/4 (20436615168-27248752483) size 6.344G starting
2020/06/18 16:34:54 DEBUG : secret.tar.bz: multi-thread copy: stream 2/4 (6812205056-13624410112) size 6.344G starting
2020/06/18 16:34:54 DEBUG : secret.tar.bz: multi-thread copy: stream 1/4 (0-6812205056) size 6.344G starting
2020/06/18 16:34:54 DEBUG : secret.tar.bz: multi-thread copy: stream 3/4 (13624410112-20436615168) size 6.344G starting
2020/06/18 16:45:36 DEBUG : rclone: Version "v1.52.1" starting with parameters ["rclone" "copy" "gdrivesecret:secret/secret.tar.gz" "/srv/daten/Pool_2/" "-P" "--drive-chunk-size" "128M" "--log-file" "out" "-vv"]
2020/06/18 16:45:36 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/06/18 16:45:39 DEBUG : secret.tar.bz: Sizes differ (src 27248752483 vs dst 20807417856)
2020/06/18 16:45:40 DEBUG : secret.tar.bz: Starting multi-thread copy with 4 parts of size 6.344G
2020/06/18 16:45:40 DEBUG : secret.tar.bz: multi-thread copy: stream 4/4 (20436615168-27248752483) size 6.344G starting
2020/06/18 16:45:40 DEBUG : secret.tar.bz: multi-thread copy: stream 2/4 (6812205056-13624410112) size 6.344G starting
2020/06/18 16:45:40 DEBUG : secret.tar.bz: multi-thread copy: stream 1/4 (0-6812205056) size 6.344G starting
2020/06/18 16:45:40 DEBUG : secret.tar.bz: multi-thread copy: stream 3/4 (13624410112-20436615168) size 6.344G starting
2020/06/18 17:26:04 DEBUG : rclone: Version "v1.52.1" starting with parameters ["rclone" "copy" "gdrivesecret:secret/secret.tar.gz" "/srv/daten/Pool_2/" "-P" "--drive-chunk-size" "128M" "--log-file" "out" "-vv"]
2020/06/18 17:26:04 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2020/06/18 17:26:04 DEBUG : gdrive: Loaded invalid token from config file - ignoring
2020/06/18 17:26:04 DEBUG : Keeping previous permissions for config file: -rw-rw-r--
2020/06/18 17:26:04 DEBUG : gdrive: Saved new token in config file
2020/06/18 17:26:08 DEBUG : secret.tar.bz: Sizes differ (src 27248752483 vs dst 20444741632)
2020/06/18 17:26:08 DEBUG : secret.tar.bz: Starting multi-thread copy with 4 parts of size 6.344G
2020/06/18 17:26:08 DEBUG : secret.tar.bz: multi-thread copy: stream 4/4 (20436615168-27248752483) size 6.344G starting
2020/06/18 17:26:08 DEBUG : secret.tar.bz: multi-thread copy: stream 2/4 (6812205056-13624410112) size 6.344G starting
2020/06/18 17:26:08 DEBUG : secret.tar.bz: multi-thread copy: stream 3/4 (13624410112-20436615168) size 6.344G starting
2020/06/18 17:26:08 DEBUG : secret.tar.bz: multi-thread copy: stream 1/4 (0-6812205056) size 6.344G starting
2020/06/18 17:43:47 DEBUG : secret.tar.bz: multi-thread copy: stream 2/4 (6812205056-13624410112) size 6.344G finished
2020/06/18 18:06:55 DEBUG : secret.tar.bz: multi-thread copy: stream 4/4 (20436615168-27248752483) size 6.344G finished
2020/06/18 18:06:59 DEBUG : secret.tar.bz: multi-thread copy: stream 3/4 (13624410112-20436615168) size 6.344G finished
2020/06/18 18:07:56 DEBUG : secret.tar.bz: multi-thread copy: stream 1/4 (0-6812205056) size 6.344G finished
2020/06/18 18:07:56 DEBUG : secret.tar.bz: Finished multi-thread copy with 4 parts of size 6.344G
2020/06/18 18:07:56 INFO  : secret.tar.bz: Multi-thread Copied (replaced existing)
2020/06/18 18:07:56 INFO  :
Transferred:       25.377G / 25.377 GBytes, 100%, 10.359 MBytes/s, ETA 0s
Transferred:            1 / 1, 100%
Elapsed time:     41m48.5s
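(An aside on the log above: the "multi-thread copy with 4 parts" lines come from rclone's multi-thread download feature, which splits a large file into byte ranges downloaded in parallel. The part count and size threshold are controlled by flags; the values shown below are rclone's documented defaults as of this version, not tuned for this setup:)

```shell
# Files above --multi-thread-cutoff are fetched in
# --multi-thread-streams parallel ranges (defaults shown).
rclone copy gdrivesecret:secret/secret.tar.gz /srv/daten/Pool_2/ \
  --multi-thread-cutoff 250M \
  --multi-thread-streams 4
```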

OK guys,

forget it. It's not rclone, it's not Google. The connection between my router and the switch where the NAS was connected was no longer gigabit; it was throttled to 100 Mbit/s :weary: ... no surprise it was that slow ...

After fixing that and increasing the transfers, I'm at 780 Mbit/s now.
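(That matches the numbers: a 100 Mbit/s link tops out at 12.5 MBytes/s, and the single-file test above averaged 10.359 MBytes/s. Converting that back to bits confirms it sat just under the throttled link speed:)

```shell
# 10.359 MBytes/s from the single-file test, in Mbit/s:
awk 'BEGIN { printf "%.1f Mbit/s\n", 10.359 * 8 }'
# prints: 82.9 Mbit/s -- consistent with a 100 Mbit/s link
```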

Thanks for all the help!


Argh! The other thing is when something turns on flow control - that is a great way of destroying your layer 2 network speed. I've spent ages diagnosing this twice - once at work and again at home!

:slight_smile:
