Rclone stops uploading temporarily

What is the problem you are having with rclone?

I'm trying to sync a Cryptomator-encrypted directory to the OneDrive for Business account my college gave me. I have a 100 Mbps connection and usually get around 60 Mbps upload, but while uploading with rclone the speed averages around 2-5 MB/s. I have also noticed an issue where rclone suddenly stops uploading and only resumes later; the same thing happened when I tried to upload a directory with rclone's own encryption (crypt).

What is your rclone version (output from rclone version)

rclone v1.53.3

  • os/arch: linux/amd64
  • go version: go1.15.5

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 20.04, 64 bit

Which cloud storage system are you using? (eg Google Drive)

OneDrive for Business

The command you were trying to run (eg rclone copy /tmp remote:tmp)

This is the command I used to sync the Cryptomator-encrypted directory.

rclone sync --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "Vault Encrypted" college:/AVaultEncrypted

The following is the command I used to upload a normal directory with many .jpg files and some .mp4 files.

rclone copy --update --verbose --transfers 5 --order-by size --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s Screenshots/ collegeEncrypted:/DCIM/Screenshots

Also, please note that it was almost impossible to upload this directory with encryption without the --order-by size flag; before I added it, the upload kept getting stuck.

The rclone config contents with secrets removed.

[college]
type = onedrive
token = {"access_token":"|removed|","token_type":"Bearer","refresh_token":"|removed|","expiry":"|removed|"}
drive_id = |removed|
drive_type = business

[collegeEncrypted]
type = crypt
remote = college:/ENCR
filename_encryption = standard
directory_name_encryption = true
password = |removed|
password2 = |removed|


A log from the command with the -vv flag

I'm happy to post the logs. However, I am not sure how to capture them without exposing any secret details, so please tell me which commands I should use and how long I should let the logs run. The upload works well sometimes, and sometimes it doesn't.

You can post the logs any way you want; without them, there isn't much to troubleshoot.

You'd want to review the log and make sure you aren't concerned about sharing any information in there. Some folks don't want to share file names, and some logging, when turned up, might contain sensitive information.

Generally, a log with -vv "should" be ok, but look through it and make sure you feel comfortable sharing it as well.
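For example, something along these lines (reusing the paths from your own sync command; adjust as needed) would write the full debug output to a file you can review and redact before posting:

rclone sync --update -vv --log-file=rclone.log "Vault Encrypted" college:/AVaultEncrypted

Let it run until you see a stall happen, then stop it and look through rclone.log.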

Here's a part of the log that I found to be relevant:

2020/12/15 00:22:27 DEBUG : Too many requests. Trying again in 98 seconds.
2020/12/15 00:22:27 DEBUG : pacer: low level retry 1/10 (error activityLimitReached: throttledRequest: The request has been throttled)
2020/12/15 00:22:27 DEBUG : pacer: Rate limited, increasing sleep to 1m38s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 1m13.5s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 55.125s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 41.34375s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 31.0078125s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 23.255859375s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 17.441894531s
2020/12/15 00:22:27 DEBUG : pacer: Reducing sleep to 13.081420898s

As I understand it, the error is (error activityLimitReached: throttledRequest: The request has been throttled). But the directory I am trying to transfer is only 80 GB, though it does contain a lot of individual files. This article in the Microsoft docs says the throttling issue mostly happens for large transfers on the order of 1 TB, and the transfer I am making is nowhere close to that.

As @Harry will tell you, MS throttles quite a bit and there isn't much we can do from our side on that.


Is there any setting I can use to reduce the throttling? Also, could I contact my college's network administrators to get more permissions that would reduce it? Throttling an 80 GB upload doesn't seem right to me.

You can limit transfers.
You can limit checkers.
You can limit transactions per second with --tpslimit, as in the example below.
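As a sketch (the numbers here are just starting points you'd have to tune, again reusing your sync command's paths):

rclone sync --update --transfers 4 --checkers 4 --tpslimit 10 "Vault Encrypted" college:/AVaultEncrypted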

The error is from MS, so there's not much to do as you aren't going to change MS 🙂

Thanks for your reply. This may be a very basic question: what counts as a transaction here? Is it one whole file or a unit of data?

A transfer is related to how many files are copied at the same time.

Checkers are how many processes are checking for files on the other side.

Those all tie back to API hits, which depending on what is happening might be a lot per second or very few.

I'm not sure the term transaction applies with MS, as they are API hits. I think other providers like B2 track a transaction per API hit, depending on what the API call is.

An example would be moving, say, a 1 GB file: it depends on how it's chunked up, and at 32 MB chunks that would be roughly 1 GB / 32 MB API hits, or something along those lines.

It also has to get a directory listing to put a file on the other side, so that's an API hit.
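As a rough back-of-the-envelope sketch (the 32 MB chunk size is just the illustration above, not necessarily what the OneDrive backend actually uses):

1 GB file / 32 MB per chunk = 32 chunk-upload requests
+ 1 directory listing = roughly 33 API hits for that one file

So a directory with lots of small files can generate far more API hits than its total size suggests, which would fit an 80 GB transfer made up of many individual files.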


As advised in this discussion on this forum, would it help if I got my own client ID and key? Should I try it?

I don't use MS so I really don't know. I've seen other comments that say it doesn't matter.

May as well try and see what happens would be my take.
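If you do try it, the general shape (assuming the standard onedrive backend options; the Azure placeholders are yours to fill in from the app registration) would be adding the app credentials to the remote:

[college]
type = onedrive
client_id = <your Azure application (client) ID>
client_secret = <your Azure client secret>
token = {...}
drive_id = |removed|
drive_type = business

Then re-authenticate with rclone config reconnect college: so the new token is issued against your own app.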

Thank you. I will try getting my own client ID and key.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.