Slow upload speed

Before I start I will say that I’ve looked through recent topics on here and I can’t see anyone else with a similar problem, so I’m not currently thinking that rclone is the culprit. However, I don’t know what next steps to take nor where else I can ask for help.

I have a GSuite business account which I use for unlimited cloud storage. I use rclone to upload and download the data from the cloud. My home broadband speed is 100MB up and down, and I used to reliably get 10MB/s when uploading to my cloud account. However, for the past few weeks I have been getting unreliable and slow speeds when uploading. Sometimes I might get 2MB/s, sometimes 250KB/s, and occasionally it might go up to 7MB/s. Occasionally I'll have a slow upload and, if I end it and restart it, it will start uploading quicker, but this doesn't work every time. For example, here are some outputs from a file I recently uploaded:

A slow upload:

2019/02/19 17:59:15 INFO  : 
Transferred:   	   52.570M / 1.039 GBytes, 5%, 294.901 kBytes/s, ETA 58m30s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      3m2.5s

Restarting the same upload a few minutes later, with the speed increasing:

2019/02/19 18:22:17 INFO  : 
Transferred:   	  805.324M / 1.039 GBytes, 76%, 6.566 MBytes/s, ETA 39s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      2m2.6s

Today I have contacted both Google support and my ISP and both of them have said that there are no limits for speed or usage placed on their service and it isn’t them causing the slowdown. I have tried updating to the latest version of rclone, tried a different computer and reset my router using the reset button but none of this has helped. I’ve tried at different times of the day thinking that maybe the slowdown is at peak time, but there doesn’t seem to be any relation between the time of day and the speed.

Does anyone have any advice on what steps I could take next to try to find a solution to the issue? Thanks in advance.

What does ‘rclone version’ show?

Did you make your own key / client ID?

What’s the command you are running? Can you run the same command and capture the logs with -vv on the command line?
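For reference, a minimal sketch of such an invocation (the remote name and paths here are placeholders, not the poster's actual setup):

```shell
# Hypothetical example -- substitute your own remote, paths, and flags.
# -vv enables debug-level logging; --log-file writes it to a file you can share.
rclone copy /path/to/source gdrive:backup -vv --log-file rclone-debug.log
```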

What’s your actual internet rated at? 100MB doesn’t make sense, as that’s like 800 Mb/s.

Apologies, my internet speed is 100Mb (megabit). When I run a speed check I get results I would expect to see:

I have run the same command using -vv and this is the output. Before today I was using rclone 1.44, but while trying to find a solution I updated to 1.46:

Are you using your own client ID?

I don’t think that I am. Do you know how I could check?

Scroll up a bit and hit the link I posted on making your own client ID :slight_smile:

Thanks. I have just checked and I was using my own client ID. Just to be sure I made a new one following those instructions and I am still getting the same slow speeds as before. I am also now getting lines such as:

2019/02/20 15:39:13 DEBUG : pacer: Rate limited, sleeping for 8.083146174s (4 consecutive low level retries)
2019/02/20 15:39:13 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2019/02/20 15:39:13 DEBUG : pacer: Rate limited, sleeping for 16.164273821s (5 consecutive low level retries)
2019/02/20 15:39:13 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)

Which I wasn’t getting before, so perhaps I have set it up incorrectly and it is using the non-custom client ID.

So if you log into the console, you can see hits going on?

Those errors usually mean it isn’t set up correctly, as you are seeing rate-limiting errors on a single transfer, leading me to believe you are using the default (oversubscribed) rclone key.

When you create the key, you need to follow the steps and run through rclone config to add it.
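For what it's worth, attaching the new key to a remote can be done interactively or directly; a sketch, assuming the remote is named "gdrive" (substitute your own name and values):

```shell
# Interactive: choose "Edit existing remote" and paste the new values
# when prompted for client_id and client_secret.
rclone config

# Non-interactive alternative (key/value pairs; values are placeholders):
rclone config update gdrive client_id "YOUR_CLIENT_ID" client_secret "YOUR_CLIENT_SECRET"
```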

Also, I think you seem to be using the cache backend for uploads? I’d just upload to your crypted drive and not use the cache backend, as that’s just added overhead (the slowness is from the 403s, though).

I’ve just run through the setup again and realised where I had gone wrong. When I was copying the new client ID & secret from the Google website, it had a blank character at the start, so when I pasted it into the console it set the ID and secret to nothing. I’ve now got it set up correctly using a new client ID & secret, although I was using a custom one before that too:


I’m fairly sure I’m not using a cache remote, it is just a crypt remote.

Oh, I misread the log looking back at it now.

My move command is pretty simple. You want to limit the transfers and checkers, as Google has some hard limits on transactions per second. I reduce mine to 3/3 since I run a mount as well and want to leave some headroom.

/usr/bin/rclone move /data/local/ gcrypt: -P --checkers 3 --log-file /opt/rclone/logs/upload.log -v --tpslimit 3 --transfers 3 --drive-chunk-size 32M --exclude-from /opt/rclone/scripts/excludes --delete-empty-src-dirs

The drive-chunk-size helps with upload, as from testing 32M was the sweet spot for me. Example upload, as I usually do a few files at a time:

Transferred:   	    5.829G / 5.829 GBytes, 100%, 53.764 MBytes/s, ETA 0s
Errors:                 0
Checks:                 4 / 4, 100%
Transferred:            4 / 4, 100%
Elapsed time:       1m51s
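If you want to find your own sweet spot, one rough approach is to time the same upload at a few chunk sizes; a sketch, assuming a remote named "gcrypt" and a test file of your own (note that memory use scales with chunk size times the number of transfers):

```shell
# Hypothetical benchmark -- substitute your own remote and test file.
for size in 8M 16M 32M 64M; do
  echo "chunk size: $size"
  time rclone copy /path/to/testfile gcrypt:chunktest --drive-chunk-size "$size"
done
```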

I have this same slowness issue and I AM using my own client_id. The max I can get uploading to my GSuite business account is 1500 kbps.

It would be interesting if you tried from a different server/location. Perhaps spin up a Google Compute micro instance and test for free. That would show whether it’s your account or whether it’s network or peering.

What command are you using? Big files or little files? Can you share the logs?


Hey folks, I have not tried it from another location, but I have tried over 2 different VPNs… one of them being my work’s, so I’m sure that Google doesn’t know that it is a “VPN provider.” I get the same speeds, sometimes even slower… like right now it’s running at a whopping 700k. I could share logs, I just need to know what you’d like to see. Until then, this is the command I’m using:

rclone -vvP --bwlimit=15.0m --transfers=4 --checkers=10 --tpslimit=10 --drive-chunk-size=64M copy /mnt/raid1/cisco crypt:cisco

The weird thing is that when I use the Google GSuite DFS application, it goes 5 times faster… but I can’t (easily) use the crypt feature that way. I’m assuming the DFS app uses some undocumented API and just leaves us out in the cold.

The whole log is the best.

-v? -vv? -vvP? Is there anything else I should turn on or off?

You can start with just -vv as that should be fine.

How long should I let it run to be of use? I’m guessing I should output to a log file instead of just copying it from the screen? I just restarted it and now I’m getting a screaming 200k :slight_smile:

Yes, that would be good. You can use --log-file /some/file.log

Here ya go, I appreciate any info you can give me. There’s lots of stuff in there; I thought about cutting out all of the “I checked it and it doesn’t need to upload any more” lines but left them in. I’m putting it on Dropbox because I’d like to be able to remove it later. TIA!