Too many open files

What is the problem you are having with rclone?

Multiple errors of:

Failed to read config file - using previous config: open /root/.config/rclone/rclone.conf: too many open files

What is your rclone version (output from rclone version)

rclone v1.55.1

  • os/type: linux
  • os/arch: amd64
  • go/version: go1.16.3
  • go/linking: static
  • go/tags: none

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04

Which cloud storage system are you using? (eg Google Drive)


The command you were trying to run (eg rclone copy /tmp remote:tmp)

I'm using mounts; an example of the flags is:

--buffer-size 2G \
--allow-other \
--dir-cache-time 12h \
--log-file /var/log/rclone.log \
--log-level INFO \
--umask 002 \
--rc \
--rc-addr :5583 \
--rc-no-auth \
--cache-dir=/rclone/cache \
--vfs-cache-mode minimal
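Assembled into a full command, the flags above would look something like the sketch below (the remote name `remote:` and mount point `/mnt/media` are placeholders, not from the original post):

```shell
rclone mount remote: /mnt/media \
  --buffer-size 2G \
  --allow-other \
  --dir-cache-time 12h \
  --log-file /var/log/rclone.log \
  --log-level INFO \
  --umask 002 \
  --rc \
  --rc-addr :5583 \
  --rc-no-auth \
  --cache-dir=/rclone/cache \
  --vfs-cache-mode minimal
```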

The rclone config contents with secrets removed.

The config is standard, using crypt

A log from the command with the -vv flag

ERROR : Failed to read config file - using previous config: open /root/.config/rclone/rclone.conf: too many open files

That's just a system issue and not related to rclone.

If you google "increase open files on Linux", you get some results:
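As a sketch of what those results typically cover, the commands below check the current limits and show (commented out) how they are usually raised; the specific values are examples, not settings recommended in this thread:

```shell
#!/bin/sh
# Check the current open-file limits on Linux.

ulimit -n                    # per-process soft limit for open files
cat /proc/sys/fs/file-max    # system-wide maximum number of open files

# Typical ways to raise them (example values, adjust to taste):
# sysctl -w fs.file-max=2097152
# ...and per-user limits in /etc/security/limits.conf:
#   root  soft  nofile  200000
#   root  hard  nofile  200000
```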

Thanks @Animosity022. I did find that and updated it to 200k, but was still having the same issue. That link has a couple more settings than what I had, so I will give that a go.

You probably want to figure out what's hammering your system.

I was using the beta version and then reverted back to stable, which seemed to reduce the number of open rclone files, but this might all just be coincidence.

Rclone only has files open based on what the server is doing. You'd want to check out the server and see what's going on, as this doesn't seem to be an rclone issue imo.

I've implemented everything from that site, but I'm still getting the error. A restart of rclone clears it for a while. Any pointers on how to see what is keeping the files open? I've run:

lsof | grep "rclone" | wc -l

Output is: 34881

What are you running on the server? rclone is generally only going to have open files based on what is going on on the server.

Can you run it with -vv --log-file /tmp/rclone.log and share the full debug log file?

I'd just look outside of rclone to confirm if it's rclone or not.

Check out lsof to see how to get it to list files, for example, or go manual on it:

Get the process ID from ps aux | grep rclone | grep -v grep (second column). Assume it's 123.

Then to check how many files it has open, ls /proc/123/fd | wc -l
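The same check works against any PID; as a minimal sketch, this counts the file descriptors of the current process via /proc/self (substitute the rclone PID found with ps aux to inspect rclone instead):

```shell
#!/bin/sh
# Count open file descriptors for a process via /proc.
# /proc/self refers to the process reading it (here, this shell);
# for rclone you would use /proc/<rclone-pid>/fd instead.
ls /proc/self/fd | wc -l
```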

Thanks both. I'm running Plex in Docker. I've added debugging to both Rclone and Plex to see if I can narrow it down, just waiting for the error again.

I'm not really sure how to use lsof properly yet; I've just picked up snippets from this issue: [ACD][mount] - mount breaks on too many open connections · Issue #1111 · rclone/rclone · GitHub

This one shows:

lsof | grep "rclone" | wc -l

But when I run the commands that @SimpleBobster posted, it shows 10-20 on each mount.

How many mounts are you running?

Running 9 mounts on different RC ports and configs etc.

That doesn't seem like too many.

If you run lsof and just grep for the process, you get threads as well, which you really don't want.

You'd use lsof -p, and that will give you actual open files. If you only see 10-20 in the lsof output, it's likely not rclone, and something else is doing it.
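The thread-versus-file distinction can also be seen directly in /proc: /proc/PID/task lists threads, while /proc/PID/fd lists actual open file descriptors. A sketch using the current shell's own PID (substitute the rclone PID in practice):

```shell
#!/bin/sh
# Compare the thread count against the open-file count for one process.
pid=$$   # this shell's PID; use the rclone PID to inspect rclone
echo "threads:  $(ls /proc/$pid/task | wc -l)"
echo "open fds: $(ls /proc/$pid/fd | wc -l)"
```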


That's my system, which shows the difference: threads in the first one and actual open files in the second.

You'd want to see how many the user in question has open with something like:

root@gemini:~# lsof -u felix | wc -l
root@gemini:~# lsof -u root | wc -l

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.