Sync GDrive to Dropbox > fatal error: runtime: out of memory


I am trying to run the following cron job with rclone 1.42 to sync a GDrive (only about 50 files) to a Dropbox crypt target:

`rclone sync gdrive: dropbox-crypt:/data --backup-dir dropbox-crypt:/backup --tpslimit 10 --low-level-retries 10 -v --log-file=sync.log`

But only a few files are copied before the out-of-memory error happens:

Is there any hint in the stack trace why this happens?

Are you running on a low memory machine?

Note that you might want to decrease --dropbox-chunk-size to save memory, decrease the number of --transfers (which is 4 by default), or decrease --buffer-size (you can set --buffer-size to 0 if you want).

So as you have it at the moment, rclone will use 4 (--transfers) * (48 (--dropbox-chunk-size) + 16 (--buffer-size)) = 256 MB of RAM for buffering.
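A quick way to sanity-check that arithmetic (values in MB, matching the defaults quoted above):

```shell
# Approximate rclone buffering memory: transfers * (chunk size + buffer size)
transfers=4   # --transfers default
chunk=48      # --dropbox-chunk-size in MB
buffer=16     # --buffer-size default in MB
echo "$(( transfers * (chunk + buffer) )) MB"   # prints "256 MB"
```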

I did some further tests and found out that the command was running under softlimit, capped at 300 MB. I have reduced the number of --transfers, and now rclone is running smoothly :slight_smile:

Thanks for your great work!

Ah ha! That makes sense. Glad you got it going.

> Are you running on a low memory machine?

I run on a low-memory (1000 MB) machine, but it has a 1 Gbit connection. I'm always maxed out on memory.

Currently my setup is this:
`rclone copy -v --bwlimit 8M --checkers 2 --transfers 2 --buffer-size 0 --http-url :http:`

I'm wondering if I could strike a better balance between speed and memory to optimize this transfer. My 8M bwlimit is there to stay under GDrive's 750 GB/day limit (and to leave headroom for uploading other things myself, apart from the script).
I only set the buffer size to 0 and brought checkers/transfers down from 3 to 2 after reading this post.
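As a rough sanity check on that claim (treating `--bwlimit 8M` as 8 MB/s for simplicity; rclone actually interprets M as MiB, which works out to roughly 725 GB/day and still fits under the cap):

```shell
# Daily upload volume at a sustained 8 MB/s
rate=8                     # MB per second
seconds_per_day=$(( 60 * 60 * 24 ))
echo "$(( rate * seconds_per_day )) MB/day"   # prints "691200 MB/day", i.e. ~691 GB < 750 GB
```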

Also, I'm mirroring another repository that contains some large files (double-digit GB). How will this setup handle those?

Should I have opened a separate thread for this instead of posting here?


Please open a new thread, as this one is 10 months old.