Raspberry Pi 2 Turtle Slow Speed

  1. Installed rclone 1.35 (ARM 32-bit) on Raspbian and Ubuntu MATE.

  2. speedtest-cli reports 90/70 Mbit/s down/up.

  3. Transfers from my Windows PC over Samba are close to 100 Mbit/s.

The only problem is rclone:

a) rclone copy from Google Drive: 10 Mbit/s max

b) rclone copy from Amazon Cloud Drive: 10 Mbit/s max

Any ideas how to fix this? I'm fairly certain the fault lies with rclone, as my CPU is only at about 20% utilisation during transfers.

And of course, I get 200 Mbit/s per file on my Windows PC.

Can you try downloading a file over SSL on the Raspberry Pi using wget or curl and see what speed you get?
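
For example, a quick sketch with curl (the URL is a placeholder; use any large file served over HTTPS):

```shell
# curl can report the average download speed (in bytes/second) via -w.
# The URL is a placeholder; substitute any large HTTPS-hosted file.
curl -sS -o /dev/null -w 'average: %{speed_download} bytes/s\n' \
    "https://example.com/100MB.bin"

# To convert bytes/s to Mbit/s: multiply by 8, divide by 1,000,000.
bytes_per_s=12500000                             # example value: 12.5 MB/s
echo "$(( bytes_per_s * 8 / 1000000 )) Mbit/s"   # prints "100 Mbit/s"
```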

I suspect it is the crypto speed…

How did you measure that? Note that rclone normally reports in MBytes/s, not Mbits/s.

Yes, it's 10 Mbit/s; I use a network monitor, vnstat --live, to watch the real-time throughput.

Also, when the file is done, rclone shows 1 MByte/s.

I have tested with and without crypt: same slow performance.

I've also tested with multiple parameters, such as chunk size.

What do I have to change in rclone to make it fast?
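
For reference, these are the knobs usually tried for throughput; the values below are purely illustrative, and the remote name and paths are placeholders (flag availability may also vary between rclone versions):

```shell
# Illustrative throughput tuning; all values are examples, not recommendations.
rclone copy gdrive:backups /mnt/data/backups \
    --transfers 4 \
    --checkers 8 \
    --drive-chunk-size 64M \
    --buffer-size 16M \
    -v
```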

I tested downloading via HTTPS links through the browser, and indeed there is a bottleneck with SSL.

Any fix for this that you know of? Is there a way to avoid SSL entirely in rclone?

Edit: Update: I downloaded directly from Google Drive using the Chromium web browser and got good speeds, 70-80 Mbit/s. And yes, I've checked that it's HTTPS, so that's not the problem.

So I guess there's a problem with the rclone ARM 32-bit build. Is there a beta I can try that fixes this?

Some pictures, same file:

  1. Rclone Slow Performance : http://imgur.com/a/z4KPA

  2. Direct Download Google Drive : http://imgur.com/a/iQFIW

I think this is the SSL performance problem described in this issue: https://github.com/ncw/rclone/issues/1061

One approach to fixing it would be to try integrating: https://godoc.org/github.com/spacemonkeygo/openssl

OMG, thanks!

This fixed it. I got it from the link you gave me.

Where do you stash your rclone betas, assuming this is a beta? Maybe there's a better, more up-to-date one you've released that I don't know about.

For now I'm satisfied:

  1. Google Drive: 60-80 Mbit/s

  2. Amazon Cloud Drive: 20-22 Mbit/s

Maybe Amazon has more security? I don't know.

Good, I’m glad that helped!

I'll be building the betas with Go 1.8 as soon as it is officially released (should be in a few days!).

That one was a special build I made as a test.

I've got a similar issue.

Raspberry Pi 3 Model B
Raspbian GNU/Linux 10 (buster) / Linux raspberrypi 4.19.75-v7+
rclone v1.50.2

I tried downloading a file from an SFTP remote: the download speed keeps dropping endlessly right from the start, and the actual download progress never gets past a few megabytes.

I tested the download speed with wget and a random file from the internet: it downloads at maximum speed (what I get from my ISP). I also tested the same remote with the default sftp utility: same thing, the download runs at maximum speed. So it seems to be an rclone issue.

What's peculiar is that I was using rclone on the same Raspberry Pi unit before, but with a different Linux distribution, LibreELEC, and I did not have this issue; downloads ran at maximum speed.

So I assume there must be something different about the Raspbian environment that causes this issue. Does anyone know what can be done about it?

Update 1:

Another thing I noticed: when I press Ctrl+C to cancel the transfer, it takes a really long time before the operation actually stops and I get my prompt back. It feels like it's stopping a lot of spawned threads or something like that.

Update 2:

Oh well, right after writing about threads, I went to the rclone documentation to look for a related option, just in case. And indeed there is one: setting --multi-thread-streams 1 resolved the issue, and downloads now go at maximum speed.
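
In other words, the working invocation looks something like this (the remote name and paths are placeholders):

```shell
# Disable multi-threaded downloads; -P shows live transfer progress.
rclone copy mysftp:path/to/file /home/pi/downloads \
    --multi-thread-streams 1 -P
```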

P.S. At the same time, here's what I see in htop (that's already with --multi-thread-streams 1):

What was the remote server? --multi-thread-streams should work with all servers. Can you try an experiment to see whether it ever completes? Maybe run it on a small file, setting --multi-thread-cutoff 10M, and see if it completes?
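
Concretely, the experiment might look like this (remote and file names are placeholders; the file just needs to be a little over 10M so the multi-thread path kicks in):

```shell
# Lower the multi-thread cutoff so even a smallish file (>10M) is
# downloaded with multiple streams, making the test quick to run.
rclone copy mysftp:path/to/20MB-file /tmp/mt-test \
    --multi-thread-cutoff 10M -P
```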

You are seeing rclone copying multiple files at once with --transfers and reading multiple directories at once with --checkers.

Go uses lightweight threads called goroutines, which don't map 1:1 onto OS threads - you'll typically get roughly the same number of OS threads as CPUs (there can be more if Go has to make lots of system calls, e.g. for local IO).
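
If you want to see the actual OS thread count rather than htop's per-thread rows, something like this works on Linux (assuming a single rclone process is running; the remote in the last line is a placeholder):

```shell
# NLWP = "number of lightweight processes", i.e. OS threads.
pid=$(pgrep -n rclone)      # PID of the newest rclone process
ps -o nlwp= -p "$pid"       # prints the thread count for that process

# The GOMAXPROCS environment variable caps how many OS threads run Go
# code simultaneously, if you want to experiment with it:
GOMAXPROCS=2 rclone copy mysftp:file /tmp -P
```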


It was an SFTP remote. But it's all good now; like I said, setting --multi-thread-streams to 1 fixed the issue in my case. Without that option, I still see the problem described above.

It's a bit odd, as I was only copying a single file at that moment (with --multi-thread-streams 1 set), which is why I was surprised to see that many threads in htop.

I was just wondering if you knew what the server is?

Indeed! Rclone will probably be hashing the file contents, doing the ssh encryption, and doing the file IO all in different threads. It still looks like too many threads actually doing work, though!

I've checked just now; the server reports the following: SSH-2.0-mod_sftp/0.9.9. But when I was using rclone on LibreELEC, I was working with the same server and didn't have this issue back then, so I doubt it's related to the server.

If you'd like, and if you have some test servers available, I can test against those.


Multi-threaded downloads are a relatively new feature, introduced in v1.48.0 (2019-06-15), so maybe you were using an older rclone back then...
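
A quick way to check which side of that cutoff your install is on (sort -V does a version-aware comparison):

```shell
# Multi-threaded downloads were added in v1.48.0; compare the installed
# version against that. The first line of `rclone version` looks like
# "rclone v1.50.2", so field 2 is the version string.
installed=$(rclone version | head -n 1 | awk '{print $2}')
lowest=$(printf '%s\n' "$installed" "v1.48.0" | sort -V | head -n 1)
if [ "$lowest" = "v1.48.0" ]; then
    echo "multi-threaded downloads are available"
fi
```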

It isn't impossible that there's an incompatibility that makes multi-threaded downloads misbehave. I think mod_sftp is provided by ProFTPD.