File-copy stalls at ~600MB copied

Hi RCloners.

I have made a huge copy from my server to an external dedicated server.
I used the crypt feature and transferred the files via FTP to the server.
Everything worked perfectly.
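
For context, my setup looks roughly like this: a plain FTP remote wrapped by a crypt remote. This is just a sketch; the remote names, host, and path are placeholders, not my real values:

[ftpserver]
type = ftp
host = ftp.example.com
user = myuser
pass = <obscured password>

[xxx]
type = crypt
remote = ftpserver:backup
password = <obscured password>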

Now I want to copy all the data back to my server, and every time the transfer stops at about 600 MB of copied data.

I have tried to copy all the data to a local folder on the external dedicated server, but with the same results.

I am using the latest beta version, and have also tried the stable version.

Note:
Some of the data was also copied to Google Drive, and I had the same problems there with the stable version. Upgrading to the latest beta solved those problems on my Windows machine.

On my external dedicated server I used this:

rclone copy -P "source" "destination"


rclone copy -P xxx:iso/ /home/Xxx/
Transferred:   646.250M / 53.871 GBytes, 1%, 541.761 kBytes/s, ETA 1d4h37m25s
Transferred:          0 / 46, 0%
Elapsed time:     20m21.4s
Transferring:
 *                  xxx.zip:  0% /592.502M, 0/s, -
 *                  xxx.iso:  0% /794.062M, 0/s, -
 *                 xxxx.iso: 50% /646.250M, 0/s, 0s
 *                 xxxx.iso: 50% /646.250M, 0/s, 0s

Any idea how to fix this issue?

I have tried searching the forum, but couldn't find anything useful, as I don't know what to look for.

Why does every file stop at 50% of the download?

What version are you running?

Can you share a debug log?

Basically, the info from the question template, filled out.
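
For example, one way to capture a debug log is with -vv and --log-file (the source and destination here are placeholders):

rclone copy -P "source" "destination" -vv --log-file=rclone.log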

I found "concurrency = 5" in my config file for the FTP access (see the snippet below)...

I have tried deleting this line, and for now it seems to work.
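
For reference, the FTP section of my config looked roughly like this (host and user are placeholders); the last line is the one I removed:

[ftpserver]
type = ftp
host = ftp.example.com
user = myuser
pass = <obscured password>
concurrency = 5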

I'll post a status later. :blush:

Removing "concurrency = 5" from the config file solved the problem. :slightly_smiling_face:

Sorry for the "noise" in this forum.

Thank you so much for this wonderful software! It has solved all my "encrypt-before-sending-to-cloud" issues.

Happy new year to all of you!

BR Jacob

Removing that line sets the concurrency back to unlimited.

I think if you were to run with -vv you'd see that rclone is doing multi-thread downloads, and that is what causes the problem - it will use 4 connections for each file, which quickly exceeds a concurrency limit of 5.
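
If you'd rather keep a concurrency limit in the config, an alternative (just a sketch; I believe --multi-thread-streams 0 disables multi-thread downloads in recent versions) would be to turn off multi-thread downloads for the copy instead:

rclone copy -P xxx:iso/ /home/Xxx/ --multi-thread-streams 0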