Rclone transfers keep stalling

I'm trying to get a basic rclone setup working but running into issues. When transferring files via rclone on a gigabit connection, the speed climbs to ~40 MB/s, then drops to 0, and the cycle repeats. I've tried sending to both Dropbox and Google Drive with the same results. Here's a graph (logarithmic scale):

[bandwidth graph]

Specs: EPYC 7351 (16 cores), 160 GB RAM, SSD mirror OS drive, ZFS RAID-Z storage, Ubuntu 18.04, rclone 1.47

I'm currently using no cache or crypt remote; I'm just trying to copy a 5 GB file I created.

Current mount settings are below; I've somewhat blindly increased various options to see if any of them make an impact:

"/usr/bin/rclone" "mount" "dbox01:/rclone/" "/tank/rclone/unencrypted" "--allow-other" "--tpslimit" "6" "--tpslimit-burst" "12" "--dir-cache-time" "96h" "--drive-chunk-size" "32M" "--log-level" "DEBUG" "--dropbox-chunk-size" "150M" "--log-file" "/tank/rclone/rclone-unencrypted.log" "--timeout" "1h" "--umask" "002" "--checkers" "50" "--rc"

Things I've tried:

  • copying from the SSD drive instead of the ZFS pool, to rule the pool out
  • iperf shows 960 Mbps with no issues
  • increasing the Dropbox chunk size makes each burst last longer, but the stall then lasts proportionally longer, so the average speed stays the same
  • increased checkers and the tpslimit values with no effect

Edit: here's a debug log


You have a lot of flags mixed in there. From what I can tell you're using Dropbox. Are you able to rclone copy a file up without an issue?
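To take the mount out of the picture, a direct copy along these lines would show whether the stalls come from the FUSE layer or the backend itself (the test file name and log path are placeholders; the remote name and chunk size are taken from the mount command above):

```shell
# Direct copy test, bypassing the mount entirely.
rclone copy /tank/testfile-5G dbox01:/rclone/ \
  --dropbox-chunk-size 150M \
  --progress \
  --log-level DEBUG --log-file /tmp/rclone-copy-test.log
```

If the same burst-then-stall pattern shows up here, the mount settings aren't the cause.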

The log does look like something is stalling on the chunked upload. I've tested rclone copies to Dropbox and pushed a few hundred gigabytes up, but that used almost 35% of the monthly API quota. Depending on what you're trying to do, you may hit that limit quickly as well.


I think this is Dropbox's fault. You can only upload one chunk of a given file at a time, and you have to wait until Dropbox has processed it before uploading the next.

Other people have reported similar behaviour with Dropbox in the past.
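The constraint comes from Dropbox's upload-session API: each append call carries a cursor offset that must equal the bytes the server has already received, so the chunks of one file are inherently serialized. A rough curl sketch of that flow ($TOKEN, the chunk files, the session ID, and the destination path are all placeholders):

```shell
# Start an upload session with the first chunk; returns {"session_id": "..."}.
curl -s https://content.dropboxapi.com/2/files/upload_session/start \
  -H "Authorization: Bearer $TOKEN" \
  -H "Dropbox-API-Arg: {\"close\": false}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @chunk-00

# Append the next chunk. "offset" must equal the bytes already uploaded,
# which is why chunk N+1 cannot start before chunk N has been accepted.
curl -s https://content.dropboxapi.com/2/files/upload_session/append_v2 \
  -H "Authorization: Bearer $TOKEN" \
  -H "Dropbox-API-Arg: {\"cursor\": {\"session_id\": \"SESSION_ID\", \"offset\": 157286400}, \"close\": false}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @chunk-01

# Finish the session, committing the file to its final path.
curl -s https://content.dropboxapi.com/2/files/upload_session/finish \
  -H "Authorization: Bearer $TOKEN" \
  -H "Dropbox-API-Arg: {\"cursor\": {\"session_id\": \"SESSION_ID\", \"offset\": 314572800}, \"commit\": {\"path\": \"/rclone/bigfile.bin\", \"mode\": \"add\"}}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary ""
```

The gaps in the bandwidth graph line up with the server-side processing time between one append being accepted and the next one starting.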


No way to upload multiple chunks at one time?

Are bounties allowed here? I'll pay a $100 bounty to anyone who can resolve this.

It's something Dropbox would need to change, alas. The only way around it is to upload multiple files at once.
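With many files, parallelism across files (rather than within one file) comes from raising the transfers count; the paths and the value 8 here are illustrative:

```shell
# Each file's chunks still upload sequentially, but 8 files move at once,
# so the per-file stalls overlap and the aggregate throughput evens out.
rclone copy /tank/staging dbox01:/rclone/ --transfers 8 --progress
```

This only helps when the workload is many files; a single large file stays bound by the one-chunk-at-a-time limit.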


I have unlimited API calls with Dropbox; is there any way to upload multiple chunks at once?

No, the Dropbox API doesn't work like that.

What would be the optimal use of the Dropbox API in this case? Or is there nothing we can do?

Thank you for reopening the thread

Is it possible to do chunking locally and then upload?

Is there a way to tell rclone to encrypt the files at a maximum size of 150 MB, so Dropbox sees 150 MB files instead of the 4 GB+ files they really are?

rclone doesn't do anything in terms of breaking files apart and putting them back together, so the answer to both questions is no.
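Since rclone won't split files itself, chunking can be done by hand with coreutils split before upload; note that rclone won't reassemble the pieces either, so a download needs a matching cat step. A minimal sketch with illustrative file names and sizes:

```shell
# Not an rclone feature, but a manual workaround sketch: split the file into
# fixed-size numbered pieces locally, upload the pieces, and rejoin them
# after download.
head -c $((16 * 1024 * 1024)) /dev/urandom > bigfile.bin   # demo 16 MiB file
split -b 4M -d bigfile.bin bigfile.bin.part-               # -> part-00 .. part-03

# After downloading the pieces, reassemble in order and verify:
cat bigfile.bin.part-* > bigfile.restored.bin
cmp bigfile.bin bigfile.restored.bin && echo "round-trip OK"
```

In real use you'd split at 150 MB (-b 150M) to match the Dropbox chunk size, and you lose the ability to read single files directly through the mount.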

This isn't the solution you're looking for, but maybe you should change providers. Dropbox is the bottleneck here, and if you're willing to pay, other providers might suit your needs better.

What are some other providers that don't have this limit?