Finding the bottleneck in transfers

What is the problem you are having with rclone?

I am trying to discover where the bottleneck is when I run rclone copy. I currently have a 1 Gbps uplink, an Intel(R) Core(TM) i5-8400 CPU @ 2.80GHz, and 16GB of DDR4 RAM.
My CPU hardly breaks a sweat when running rclone, sitting at around 10%; my RAM is at about 7GB, and my network is barely used. Yet everything does seem to be running, albeit very slowly.

What is your rclone version (output from rclone version)

rclone v1.52.0-008-g8774381e-beta
- os/arch: linux/amd64
- go version: go1.14.3

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 20

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy /4TB/ encrypted:/4TB/ --log-level ERROR --max-backlog 9999999 --fast-list --transfers=50 --progress

The rclone config contents with secrets removed.

[encrypted]
type = crypt
remote = backup:encrypted
filename_encryption = off
directory_name_encryption = false
password = 

[backup]
type = drive
scope = drive
service_account_file = /root/.config/rclone/accounts/1.json
team_drive =
chunk_size = 128M

A log from the command with the -vv flag

2020/06/03 13:05:12 DEBUG : rclone: Version "v1.52.0-008-g8774381e-beta" starting with parameters ["rclone" "copy" "/4TB/" "encrypted:/4TB/" "--drive-stop-on-upload-limit" "--drive-service-account-file" "/root/.config/rclone/accounts/1.json" "--max-backlog" "9999999" "--fast-list" "--transfers=50" "--progress" "-vv"]

hello and welcome to the forum,
what makes you think there is a bottleneck?

Hey! Thanks! Just the fact that there is still data left to be transferred. I would therefore expect my bandwidth to be fully utilised, and if it isn't, I'd expect another bottleneck in RAM, CPU, or I/O, but none of them are fully utilised!
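One quick cross-check before blaming rclone (my suggestion, not from the thread): measure the raw uplink with a separate tool so you know what ceiling you are comparing against. A sketch assuming iperf3 is installed and you have a server to test against; the hostname is a placeholder:

# Client-to-server throughput test, i.e. the upload direction.
# Replace the host with an iperf3 server you control or a public one.
iperf3 -c iperf.example.net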

  • add --progress and see what the upload speeds are?
  • you have --transfers=50 but by default --checkers=8
  • change log level to DEBUG so you can understand what is going on behind the scenes.
  • gdrive can be very slow uploading lots of small files
rclone copy /4TB/ encrypted:/4TB/ --log-level DEBUG  --fast-list --transfers=8 --progress --drive-chunk-size 1G
Transferred:   	   71.639M / 1.277 GBytes, 5%, 200.780 kBytes/s, ETA 1h45m5s
Checks:             70835 / 70835, 100%

I am getting a lot of:

2020-06-03 13:55:18 DEBUG : pacer: Reducing sleep to 763.261472ms
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 760.005356ms
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 814.512576ms
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 912.345455ms
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 1.000867115s
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 1.060017735s
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 1.111338361s
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 1.083539915s
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 1.042045758s
2020-06-03 13:55:19 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:19 DEBUG : pacer: Rate limited, increasing sleep to 1.329489994s
2020-06-03 13:55:19 DEBUG : pacer: Reducing sleep to 983.715974ms
2020-06-03 13:55:20 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:20 DEBUG : pacer: Rate limited, increasing sleep to 1.682467879s
2020-06-03 13:55:21 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:21 DEBUG : pacer: Rate limited, increasing sleep to 2.368680878s
2020-06-03 13:55:23 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:23 DEBUG : pacer: Rate limited, increasing sleep to 4.842511624s
2020-06-03 13:55:26 DEBUG : pacer: Reducing sleep to 0s
2020-06-03 13:55:33 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:33 DEBUG : pacer: Rate limited, increasing sleep to 1.915599099s
2020-06-03 13:55:33 DEBUG : pacer: Reducing sleep to 0s
2020-06-03 13:55:34 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=878044099763, userRateLimitExceeded)
2020-06-03 13:55:34 DEBUG : pacer: Rate limited, increasing sleep to 1.705471844s

this is the reason.

error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota

Oh damn, is there a way to increase it? Create a new project?

you would need to log in to your google account.

I am using a service account!
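Whichever kind of account it is, the per-user API quota still applies, so another option is to throttle rclone itself so it never trips the pacer. A minimal sketch using rclone's standard pacing flags; the exact values are guesses to tune against your project's quota:

# Keep rclone under the Drive API's roughly 10 requests/second/user quota.
# --tpslimit caps total transactions per second globally;
# --drive-pacer-min-sleep sets the minimum delay between Drive API calls.
rclone copy /4TB/ encrypted:/4TB/ --tpslimit 8 --drive-pacer-min-sleep 120ms --progress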

I am not getting the error anymore, just very slow speeds:

Transferred:   	   60.837M / 337.303 MBytes, 18%, 41.823 kBytes/s, ETA 1h52m48s
Checks:             71973 / 71973, 100%
Transferred:         2867 / 12911, 22%

with this command:

rclone copy /4TB/ encrypted:/4TB/ --log-level DEBUG --fast-list --transfers=12 --checkers=32 --progress --drive-chunk-size 1024M

before each file is uploaded, rclone has to calculate its checksum,
so that appears to reduce upload speed.
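If you want a feel for how much time the hashing alone costs, you can ask rclone to hash the local files and time it; a rough sketch, where the subdirectory is just a placeholder:

# Time how long rclone takes to md5-hash a local directory,
# independent of any upload:
time rclone md5sum /4TB/some-subdir/ > /dev/null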

and I am sure you have read this
https://rclone.org/drive/#limitations
"limited to transferring about 2 files per second only"

in your config file you hard-coded
chunk_size = 128M
and in your command you have
--drive-chunk-size 1024M

You need to remove transfers and checkers and just use the defaults.

With Google, you can only do about 10 API hits per second and you are basically hammering it with your settings and killing your speed.
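For reference, here is roughly what "just use the defaults" looks like written out; the explicit flags are redundant, but they make the point (a sketch, not a prescription):

# rclone's defaults are --transfers 4 and --checkers 8, so spelling
# them out is equivalent to omitting them entirely:
rclone copy /4TB/ encrypted:/4TB/ --transfers 4 --checkers 8 --progress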

Ahh yes. Removed it from the config and I am trying:

rclone copy /4TB/ encrypted:/4TB/ --log-level DEBUG --fast-list --progress --drive-chunk-size 1024M

let's see what happens :slight_smile: thanks for your help guys!
Should I try --drive-chunk-size 3G seeing as I have 16GB of RAM?

1G is a good starting point, as making it bigger isn't always faster (rough memory math below).

Let's see what you get first and we can take a look at the logs.
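As a rough memory budget for that question, assuming each active transfer buffers up to one full chunk in RAM (my arithmetic, not from the thread):

# 4 transfers x 1G chunks ~=  4 GiB of buffers - comfortable with 16GB
# 4 transfers x 3G chunks ~= 12 GiB of buffers - risky on a 16GB box
rclone copy /4TB/ encrypted:/4TB/ --transfers 4 --drive-chunk-size 1G --progress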

Transferred:   	  640.042M / 762.373 MBytes, 84%, 74.936 kBytes/s, ETA 27m51s
Checks:             73209 / 73209, 100%
Transferred:        15298 / 25310, 60%

still painfully slow!

with

rclone copy /4TB/ encrypted:/4TB/ --log-level DEBUG --fast-list --progress --drive-chunk-size 1024M

You need to share a debug log.

It also looks like you are copying lots of little files and that's limited to about 2-3 files per second.

in your config file, chunk_size = 128M
in your command --drive-chunk-size 1024M

not sure which value takes priority.
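For what it's worth, rclone normally resolves options as command-line flag > environment variable > config file, so --drive-chunk-size 1024M should win; still, it is cleaner to set the value in one place. A sketch with chunk_size removed from the [backup] section, so the flag is the only source of truth:

# With no chunk_size in the config, the flag below is unambiguous:
rclone copy /4TB/ encrypted:/4TB/ --drive-chunk-size 1G --progress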

I changed that! And yes lots of little files!

With Google Drive, you won't ever get great performance with small files, so it's best to zip / compress them if that makes sense. If not, you can't do much other than bear with the speed, as you can only create / upload about 2-3 files per second.
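A hedged sketch of that zip-first approach; the paths and archive granularity are assumptions, and tar is used here but any archiver works:

# Bundle many small files into one archive so the upload becomes a
# single large object instead of thousands of per-file API calls:
tar -czf /4TB-archives/photos.tar.gz -C /4TB photos/
rclone copy /4TB-archives/ encrypted:/4TB-archives/ --progress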
