Rclone crashes when adding a library inside Google Stream Drive

I'm using a Plex server installed on an Ubuntu Linux system. I use rclone to sync a remote library on my Google Stream Drive and link it inside Plex. Recently, every time I try to add the mounted drive to my Plex library, rclone crashes when Plex scans the library for files. It was working well before, and nothing should have changed since then. Here is my rclone.service configuration file:

[Service]
Type=simple
ExecStart=/usr/bin/rclone mount \
    --config=/root/.config/rclone/rclone.conf \
    --allow-other \
    --fast-list \
    --buffer-size 512M \
    --dir-cache-time 72h \
    --drive-chunk-size 1M \
    --vfs-read-chunk-size-limit off \
    --vfs-read-chunk-size 128M Plex: /mnt/Plexserv
ExecStop=/bin/fusermount -u /mnt/Plexserv
Restart=always
RestartSec=10

Does anybody else have the same problem and managed to correct it?

I'd guess the crash is out of memory: you have 512M configured, that buffer is used per open file, and a Plex scan probably blows it up.

What's the memory on your server?

Your buffer size is very large, and your memory usage is going to spike very hard because of it. That memory will be used for every transfer, and Plex by default has a nasty habit of opening a LOT of transfers when it does its scans. That could result in a memory overload leading to a crash. This seems to be by far the most common reason rclone users report crashes.
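As a back-of-the-envelope illustration (the 20 open files here is a purely hypothetical figure for a busy scan, not a measurement), worst-case read-buffer memory is roughly --buffer-size times the number of files Plex holds open at once:

```shell
#!/bin/sh
# Worst case: each open file can allocate a full read buffer.
buffer_mib=512   # the --buffer-size in the unit file above
open_files=20    # hypothetical number of files a Plex scan holds open
echo "$(( buffer_mib * open_files )) MiB"   # prints "10240 MiB"
```

That is the entire RAM of a 10GB box gone to read buffers alone.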

Course of action: Monitor or log your memory usage and see if you can witness either abnormally high memory usage or an actual crash as it happens.
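One way to do that (a minimal sketch; sampling /proc and the helper name mem_mib are my own suggestion, not something from the thread) is to read the mount process's resident memory:

```shell
#!/bin/sh
# Print a process's resident memory in MiB (Linux: reads /proc).
# For the mount you'd pass the rclone PID, e.g. mem_mib "$(pgrep -x rclone)".
mem_mib() {
    awk '/^VmRSS:/ { printf "%.1f\n", $2 / 1024 }' "/proc/$1/status"
}

# Sample this shell's own PID so the snippet runs anywhere:
mem_mib "$$"
```

Wrap it in `watch -n 10` (or a loop that appends to a log file) while Plex scans, and see whether the number climbs toward your total RAM.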

Probable solution: lower the buffer size (you really don't need that much anyway) and follow guides on how to disable or limit some of Plex's more aggressive file behaviours. I think Animosity has a big post about his recommended settings.

Unrelated note:
Your --drive-chunk-size of 1M is a very low value. This also uses memory (on uploads), but such a low value will make uploads quite inefficient. I'd recommend you see if you can afford the memory to use at least 64M.
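For scale, a rough sketch of the upload side (the 64M chunk and the default of 4 concurrent transfers are illustrative assumptions): rclone buffers roughly one chunk per concurrent upload, so the cost stays modest:

```shell
#!/bin/sh
# Upload-side memory is roughly chunk size times concurrent transfers
# (one chunk buffered per uploading file). Figures are illustrative.
chunk_mib=64    # the suggested --drive-chunk-size
transfers=4     # rclone's default --transfers
echo "$(( chunk_mib * transfers )) MiB"   # prints "256 MiB"
```

So even the larger chunk size is a small fraction of a 10GB system.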

The system has 10GB of RAM... It may be the cause; I never looked into it. I just used an example configuration.

Changed to this, is it okay?

ExecStart=/usr/bin/rclone mount \
    --config=/root/.config/rclone/rclone.conf \
    --allow-other \
    --fast-list \
    --buffer-size 256M \
    --dir-cache-time 72h \
    --drive-chunk-size 64M \
    --vfs-read-chunk-size-limit off \
    --vfs-read-chunk-size 128M Plex: /mnt/Plexserv

--fast-list does not do anything on a mount, so it can be removed.

256M is still pretty big per file, IMO, on a 10GB system. In general, you can really just run defaults unless you find you need something.

Pretty much my only exceptions are dir-cache-time, which can be a nice big number like you have, and allow-other, if you want other users to see the files.
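To make that concrete, here is a rough sketch of what a near-default mount line could look like (the paths and the Plex: remote name are taken from the unit file earlier in the thread; every other flag is simply left at rclone's default):

```ini
ExecStart=/usr/bin/rclone mount Plex: /mnt/Plexserv \
    --config=/root/.config/rclone/rclone.conf \
    --allow-other \
    --dir-cache-time 72h
ExecStop=/bin/fusermount -u /mnt/Plexserv
```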

So would you recommend a buffer of 128 instead?

I will remove fast-list

Should be fine. Just watch your memory as Plex does get a little happy opening files. If you see it have memory issues, just tone it down more.

Thanks a lot, Animosity. I've seen you around the forums for some time, and you're a great help to users. I'm now trying an upload and it's going smoothly at 15m/sec. I will try to add the remote library after it has finished.

The default buffer is 16MB, and that is usually sufficient for most uses.
If you want to use an 8x larger buffer, the question you need to ask yourself is "why do I think I need this?".
There may be edge cases where it could be appropriate, but don't make the mistake of thinking that setting resource values higher is necessarily better - or doing it just because you saw someone else use it.

To the best of my understanding, the buffer is just a general read/write buffer that smooths out any delays in ingesting or exporting data - for example, when hardware or software is busy for a second doing something else. Having a buffer then helps keep the bandwidth utilization from becoming suboptimal due to the whole transfer stalling. The main reason you'd want a bigger buffer is probably if you have really blazing fast network speed, like a gigabit connection.

I changed the buffer like you suggested, thestigma. I was wondering something: can I download torrents directly into the drive after this? Will it be possible? Right now I actually get an error when doing so with Transmission - it's in French, but it translates as "targeting not permitted".

Yes, it is perfectly possible.
However, there is a caveat you need to be aware of.

In general, rclone has trouble handling "temporary work files" - in other words, files that a program creates unfinished and keeps working on over time. This becomes a problem because rclone can't know what is temporary and what is not.

So you will need to either

  • Use a torrent program that has a "temporary folder" feature. This is a local work folder for incomplete files; when they are done, the client automatically moves them to the final destination. This is the best solution, as the torrent software can avoid the problem directly in a very clean way. qBittorrent, which I use, has this, as do many other torrent clients. Reading the data back for seeding is no problem.
    or
  • Use a more complicated setup for indirect uploading, using a union remote to make a local "temp upload" folder that mount files get "uploaded" to - and then you can have a script that does the uploading for you, with a lot more control (for example, only uploading files that have not been modified in the last 10 minutes, with --min-age 10m).
    Tell me if you want more info on this, because describing this is kind of a topic unto itself.
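For reference, the upload script in that second option can be quite short. A minimal sketch follows - the staging path /mnt/local-upload and the Plex:/downloads destination are hypothetical, with only the Plex: remote name coming from this thread. Note that rclone durations are case-sensitive: 10m is 10 minutes, while 10M would be 10 months.

```shell
#!/bin/sh
# Move finished files from the local staging folder to the remote.
# --min-age 10m skips anything modified in the last 10 minutes,
# so files a client is still writing are left alone.
rclone move /mnt/local-upload Plex:/downloads \
    --min-age 10m \
    --transfers 4 \
    --log-file /var/log/rclone-upload.log -v
```

Run it from cron every few minutes and the staging folder drains itself.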

There are probably other ways to solve the problem too, but these are what come first to my mind. The absolutely best solution is the "temp work folder" way. Check if any of your favorite torrent clients have this option, because that makes it easy :slight_smile:

I actually use Transmission because I can start a torrent from a remote computer on the same network. I will look into that feature, thanks a lot!

And I just saw that qBittorrent has a web interface - this is wonderful.

I am no super-expert on torrent clients (as there are so many good ones), but web-interface remote controls are fairly common in the well-developed ones, I think. A "temp upload" folder is also not that unusual, but it can be a little hard to find, as it's not exactly the sort of big main feature that gets advertised in a feature list :slight_smile:

You might also just suggest it gets implemented in your favorite client, because in technical terms it is not a very hard function to code. It's just a trigger that moves some files when the torrent is done. Probably more work to add it to the GUI than to code the logic :stuck_out_tongue:

It is very very likely that there exists a good client out there that has both "temp workfolder" feature + all the other features you feel you need. It is just difficult for me to make an exact recommendation :slight_smile:

I primarily use Qbit because its design is quite similar to uTorrent (but it's open source), which I used for a long time before it became infested with adware and junk. It's maybe not THE most advanced client, but since it does everything I currently need and more, I just haven't taken the time to go looking for alternatives.

But if you do end up finding something really nice (that can run on Windows too), then by all means drop me a hint about it so I can keep it in mind :wink:

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.