Rclone slow to move files to mounted remote

Hi all,

I've been facing issues lately in getting rclone to work properly with rTorrent.

After downloading a torrent, I tried to move the folder (8.8GB) to the gdrive remote mounted at /mnt/media. It was painfully slow, and I had to restart the machine because it ran into errors when I tried to play a movie from Plex.

media.log
2019/10/12 13:47:52 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/12 14:45:51 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG.txt: Copied (new)
2019/10/12 14:45:55 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG_DO_NOT_MIRROR.exe: Copied (new)
2019/10/12 14:46:01 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.idx: Copied (new)
2019/10/12 14:46:07 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.sub: Copied (new)
2019/10/12 14:46:26 INFO  : Cleaned the cache: objects 10 (was 10), total size 127.416M (was 0)
2019/10/12 14:47:24 INFO  : Cleaned the cache: objects 10 (was 10), total size 337.920M (was 127.416M)
2019/10/12 14:48:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 816.291M (was 337.920M)
2019/10/12 14:49:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 1.052G (was 816.291M)
2019/10/12 14:50:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 1.275G (was 1.052G)
2019/10/12 14:51:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 1.544G (was 1.275G)
2019/10/12 14:52:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 1.808G (was 1.544G)
2019/10/12 14:53:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 2.091G (was 1.808G)
2019/10/12 14:54:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 2.224G (was 2.091G)
2019/10/12 14:55:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 2.470G (was 2.224G)
2019/10/12 14:56:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 2.670G (was 2.470G)
2019/10/12 14:57:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 2.857G (was 2.670G)
2019/10/12 14:58:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.092G (was 2.857G)
2019/10/12 14:59:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.243G (was 3.092G)
2019/10/12 15:00:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.438G (was 3.243G)
2019/10/12 15:01:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.628G (was 3.438G)
2019/10/12 15:02:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.795G (was 3.628G)
2019/10/12 15:03:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.943G (was 3.795G)
2019/10/12 15:04:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.187G (was 3.943G)
2019/10/12 15:05:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.246G (was 4.187G)
2019/10/12 15:06:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.323G (was 4.246G)
2019/10/12 15:07:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.371G (was 4.323G)
2019/10/12 15:08:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.495G (was 4.371G)
2019/10/12 15:09:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.557G (was 4.495G)
2019/10/12 15:10:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.610G (was 4.557G)
2019/10/12 15:11:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.646G (was 4.610G)
2019/10/12 15:12:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.705G (was 4.646G)
2019/10/12 15:13:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.751G (was 4.705G)
2019/10/12 15:14:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.784G (was 4.751G)
2019/10/12 15:15:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.853G (was 4.784G)
2019/10/12 15:16:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.918G (was 4.853G)
2019/10/12 15:17:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 4.953G (was 4.918G)
2019/10/12 15:18:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.013G (was 4.953G)
2019/10/12 15:19:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.056G (was 5.013G)
2019/10/12 15:20:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.113G (was 5.056G)
2019/10/12 15:21:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.168G (was 5.113G)
2019/10/12 15:22:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.225G (was 5.168G)
2019/10/12 15:23:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.282G (was 5.225G)
2019/10/12 15:24:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.352G (was 5.282G)
2019/10/12 15:25:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.469G (was 5.352G)
2019/10/12 15:26:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.553G (was 5.469G)
2019/10/12 15:27:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.632G (was 5.553G)
2019/10/12 15:28:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.767G (was 5.632G)
2019/10/12 15:29:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.823G (was 5.767G)
2019/10/12 15:30:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.876G (was 5.823G)
2019/10/12 15:31:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.996G (was 5.876G)
2019/10/12 15:32:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.102G (was 5.996G)
2019/10/12 15:33:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.204G (was 6.102G)
2019/10/12 15:34:23 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.271G (was 6.204G)
2019/10/12 15:35:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.292G (was 6.271G)
2019/10/12 15:36:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.383G (was 6.292G)
2019/10/12 15:37:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.443G (was 6.383G)
2019/10/12 15:38:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.496G (was 6.443G)
2019/10/12 15:39:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.547G (was 6.496G)
2019/10/12 15:40:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.652G (was 6.547G)
2019/10/12 15:41:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.675G (was 6.652G)
2019/10/12 15:42:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.728G (was 6.675G)
2019/10/12 15:43:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.812G (was 6.728G)
2019/10/12 15:44:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.874G (was 6.812G)
2019/10/12 15:45:22 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.879G (was 6.874G)
2019/10/12 15:46:23 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG_DO_NOT_MIRROR.exe: Removed from cache
2019/10/12 15:46:23 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.idx: Removed from cache
2019/10/12 15:46:23 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.sub: Removed from cache
2019/10/12 15:46:23 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG.txt: Removed from cache
2019/10/12 15:46:23 INFO  : Cleaned the cache: objects 5 (was 10), total size 6.955G (was 6.879G)
2019/10/12 15:47:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.006G (was 6.955G)
2019/10/12 15:48:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.018G (was 7.006G)
2019/10/12 15:49:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.105G (was 7.018G)
2019/10/12 15:50:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.138G (was 7.105G)
2019/10/12 15:51:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.183G (was 7.138G)
2019/10/12 15:52:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.211G (was 7.183G)
2019/10/12 15:53:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.295G (was 7.211G)
2019/10/12 15:54:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.350G (was 7.295G)
2019/10/12 15:55:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.374G (was 7.350G)
2019/10/12 15:56:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.429G (was 7.374G)
2019/10/12 15:57:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.448G (was 7.429G)
2019/10/12 15:58:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.569G (was 7.448G)
2019/10/12 15:59:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.628G (was 7.569G)
2019/10/12 16:00:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.712G (was 7.628G)
2019/10/12 16:01:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.765G (was 7.712G)
2019/10/12 16:02:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.826G (was 7.765G)
2019/10/12 16:03:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.852G (was 7.826G)
2019/10/12 16:04:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 7.895G (was 7.852G)
2019/10/12 16:05:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.000G (was 7.895G)
2019/10/12 16:06:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.008G (was 8.000G)
2019/10/12 16:07:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.100G (was 8.008G)
2019/10/12 16:08:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.140G (was 8.100G)
2019/10/12 16:09:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.145G (was 8.140G)
2019/10/12 16:10:28 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.195G (was 8.145G)
2019/10/12 16:11:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.209G (was 8.195G)
2019/10/12 16:12:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.224G (was 8.209G)
2019/10/12 16:13:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.275G (was 8.224G)
2019/10/12 16:14:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.299G (was 8.275G)
2019/10/12 16:15:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.323G (was 8.299G)
2019/10/12 16:16:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.323G (was 8.323G)
2019/10/12 16:17:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.323G (was 8.323G)
2019/10/12 16:18:22 INFO  : Cleaned the cache: objects 5 (was 5), total size 8.323G (was 8.323G)
2019/10/12 16:18:52 Fatal error: failed to umount FUSE fs: exit status 1: fusermount: failed to unmount /mnt/media: Device or resource busy
2019/10/12 16:19:25 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/12 16:19:40 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.mkv: Removed from cache
2019/10/12 16:19:40 INFO  : Cleaned the cache: objects 0 (was 0), total size 8.323G (was 0)
2019/10/12 16:20:30 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG.txt: Removed from cache
2019/10/12 16:20:31 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG_DO_NOT_MIRROR.exe: Removed from cache
2019/10/12 16:20:32 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.idx: Removed from cache
2019/10/12 16:20:33 INFO  : Movies/Blu-Ray/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.sub: Removed from cache

It looks from the log like the big .mkv media file maybe didn't make it through before you shut it down. Is that correct? Is it still in your write-cache?

I can't really see much from this basic log except that it was slow.

  • What speed did you get in practice?
  • What speed can your bandwidth handle in theory?
  • Are you torrenting directly to the mounted storage, or downloading locally and then transferring? (There are special considerations for torrenting directly to a mount, if that's the case here.)

If you can share the contents of your rclone.conf file (be careful to redact any passwords and crypt keys), that would help us analyze the problem.

Showing us the mount command you use would also greatly help.

Ultimately we may need a log using -vv or --log-level DEBUG (same thing) to see more info, but let's start with the other questions first and see where that gets us :slight_smile:

No, I restarted the Pi and deleted the source folder to avoid any more problems.

Whatever it was, it was very slow.

100Mbps.

Download locally, then transfer. In this case, after the torrent finished downloading, I attempted to manually move the folder to the mount.

Now, I know that the files are downloaded onto the memory card of the Raspberry Pi, which is slow and has limited IOPS. I bought the best one I could.

When I attempted to move the files, am I correct in assuming that the data was being read from and written to the same disk at the same time? Does rclone only upload the file after it has been fully moved into the mount?

Is there a way for rclone to monitor a folder and automatically upload files when they appear in it, without copying them through the mount?

I downloaded the same torrent on my Digital Ocean VM and here's the log:

media.log
2019/10/11 11:23:04 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/13 06:54:50 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG.txt: Copied (new)
2019/10/13 06:54:52 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/RARBG_DO_NOT_MIRROR.exe: Copied (new)
2019/10/13 06:54:56 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.idx: Copied (new)
2019/10/13 06:55:00 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/Subs/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.sub: Copied (new)
2019/10/13 06:55:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 145.209M (was 0)
2019/10/13 06:56:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 1.843G (was 145.209M)
2019/10/13 06:57:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 3.526G (was 1.843G)
2019/10/13 06:58:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 5.221G (was 3.526G)
2019/10/13 06:59:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 6.775G (was 5.221G)
2019/10/13 07:00:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.468G (was 6.775G)
2019/10/13 07:01:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.468G)
2019/10/13 07:02:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:03:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:04:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:05:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:06:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:07:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:08:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:09:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:10:04 INFO  : Cleaned the cache: objects 10 (was 10), total size 8.756G (was 8.756G)
2019/10/13 07:10:37 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS.mkv: Copied (new)
2019/10/13 07:10:42 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/the.lion.king.2019.rerip.1080p.bluray.x264-sparks.jpg: Copied (new)
2019/10/13 07:10:45 INFO  : Movies/radarr/The.Lion.King.2019.RERiP.1080p.BluRay.x264-SPARKS[rarbg]/the.lion.king.2019.rerip.1080p.bluray.x264-sparks.nfo: Copied (new)
2019/10/13 07:11:04 INFO  : Cleaned the cache: objects 12 (was 12), total size 8.756G (was 8.756G)
2019/10/13 07:12:04 INFO  : Cleaned the cache: objects 12 (was 12), total size 8.756G (was 8.756G)
2019/10/13 07:13:04 INFO  : Cleaned the cache: objects 12 (was 12), total size 8.756G (was 8.756G)
2019/10/13 07:14:04 INFO  : Cleaned the cache: objects 12 (was 12), total size 8.756G (was 8.756G)
2019/10/13 07:15:04 INFO  : Cleaned the cache: objects 12 (was 12), total size 8.756G (was 8.756G)

Not sure why so many of the lines at the end are identical.

Please tell me which cloud provider you use (for example Google Drive), so I can suggest rclone optimizations for uploading.

Uh oh, yes, this might be a significant part of the problem, because if you use a write-cache on the mount then you are indeed doing read --> write --> read all on the same disk. On a limited-IOPS disk like an SD card this can cripple the speed. On a hard drive it would be far less of an issue.

Well, there is no automatic function for this, but we can very easily script it (I can help you with that, I imagine). We can just run a bash script (this is Linux-based, right?) on a recurring timer, in cron for example, to check a certain folder every X minutes and upload any files found there. Because we don't have to deal with the limitations of the mount this way, we can bypass the IOPS problem of the write-cache. I would consider this the likely best solution for your setup: upload stuff this way, and use the mount as a convenient way to read or stream files (as this is still fairly efficient in most cases).

Does this sound like a plan?

Somewhat unrelated note on torrenting directly to a cloud drive:
It is possible to do, but it pretty much requires a torrent client that has a "temporary folder" feature, where it downloads the files locally, and then, when the torrent is done, moves them out to the permanent location (and the permanent location can be the upload folder monitored by the script).
So this is technically still a download-locally-first approach, but it can be fully automated and thus very convenient :slight_smile:
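
As a concrete illustration for rTorrent (which you mentioned you use): this kind of move-on-completion is typically wired up in .rtorrent.rc with an event handler. This is only a rough sketch - the paths are assumptions and the exact method names differ between rTorrent versions, so verify against the rTorrent wiki before relying on it:

# Active downloads land in a local temp folder (assumed path)
directory.default.set = /mnt/hdd/torrents/incomplete

# When a torrent finishes, move its data to the folder the upload
# script watches, and tell rTorrent about the new location
method.set_key = event.download.finished, move_complete, "execute=mv,-u,$d.base_path=,/mnt/hdd/torrents/finished/ ; d.directory.set=/mnt/hdd/torrents/finished/"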

Be a little careful about regularly downloading tons of data to an SD card, though. They are decent these days, but they still don't have unlimited wear-ratings. I'd definitely research how much data your model is actually rated to write through its life before it finally fails. "Abusing" an SD card certainly can lead to it wearing out. Just be aware of this, and if you are using the system heavily, a small HDD or decent SSD would be a more appropriate temp-storage location. Or even some network location on another machine, perhaps.

I can't recommend uploading directly via the mount. It's best to use either the command line or, and this is what I do, RcloneBrowser to make things nice and easy.


For large transfers it is more efficient, yes (mostly from bypassing the need for the write-cache). Usually the mount is "fine" for just small stuff for the sake of convenience, but in this user's case, where it's literally on an SD card, we should definitely use an upload script for the job instead of the mount.

Google Drive, yes.

media.service
[Unit]
Description=GDrive Media Mount     
Wants=network-online.target
After=network-online.target

[Service]
Type=notify
ExecStart=/usr/bin/rclone mount \
        --config=/home/pi/.config/rclone/rclone.conf \
        --allow-other \
        --allow-non-empty \
        --dir-cache-time 48h \
        --vfs-cache-mode writes \
        --log-level INFO \
        --log-file /opt/media.log \
        --rc \
        media:/ /mnt/media
Restart=always
RestartSec=10
ExecStop=/bin/fusermount -u /mnt/media
User=pi
Group=pi

[Install]
WantedBy=multi-user.target

I do have spare hard drives connected that don't do anything because I've moved all the data to Google Drive.

Yes, Ubuntu 18.04. That would be great.

I've been using qBittorrent all this while; now I'm planning to move to rTorrent with Flood as the frontend. I've yet to figure out how to set a temp folder with rTorrent.

----------Fixing the write-cache bottleneck------------------
Ok, well then the first and easiest solution is to fix the bottleneck with the write-cache.
By default the cache directory will be under the user's $HOME, which is probably on your SD card. Let's first of all change that so it's on the much faster HDD, which also means we don't burn out your SD card with writes. Since you said you had a HDD on the system, we can just use that:

--cache-dir "select/a/path/to/a/harddrive"
(modify the quoted path, obviously)
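
Applied to your media.service, that could look something like this (the /mnt/hdd path is just an assumption - point it at wherever your HDD is mounted):

ExecStart=/usr/bin/rclone mount \
        --config=/home/pi/.config/rclone/rclone.conf \
        --cache-dir /mnt/hdd/rclone-cache \
        --allow-other \
        --allow-non-empty \
        --dir-cache-time 48h \
        --vfs-cache-mode writes \
        --log-level INFO \
        --log-file /opt/media.log \
        --rc \
        media:/ /mnt/media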

Now your writes to the mount should no longer be bottlenecked here, assuming your HDD is faster than your network, which it probably is. I also highly recommend you use your HDD for any torrent-downloading, for much the same reasons: higher speed and much higher endurance. HDDs basically never wear out from writes (instead they fail mechanically some day, which is not really related to writes).

------------Gdrive optimization settings-------------------
In your rclone.conf, set this line under your gdrive block (just put it last, it doesn't matter):
chunk_size = 64M

or alternatively...

Put this in your mount command:
--drive-chunk-size 64M

Please note these have slightly different formatting...
You don't need to use both.
This will increase your bandwidth utilization on upload specifically, by as much as 20-40% depending on other factors.
Please do note that this requires more memory. This much memory can be used by EACH active transfer, so with the default 4 transfers this would be 64M x 4 = 256M. If you have low memory on the system, please adjust it as needed. Do not overload the memory or rclone will just crash...
There is some slight benefit up to as much as 128M if you have loads of free memory, but the gains are marginal at that point.
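
For reference, the config-file variant would sit in a block something like this (the remote name media is taken from your mount command; the other values are placeholders standing in for whatever your existing config contains):

[media]
type = drive
client_id = <your client id>
client_secret = <your client secret>
token = <your token>
chunk_size = 64M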

------------------Automated Upload script-------------------
Animosity almost certainly has a script you can steal for Linux in his "recommended settings" thread because he shares all his scripts there and I know he uses an upload script.

But I will also provide a really simple example here as an alternative.
Hang on... working....

Something like this should work. Pretty basic, and I am not great at bash scripting yet, but it should work fine.

#!/bin/bash

# Lock so only one instance of this script can run at once
# (the "lockfile" utility comes with procmail).
lockfile -r 0 /tmp/uploadscript.lock || exit 1

# Release the lock when the script exits, even if it is interrupted.
trap 'rm -f /tmp/uploadscript.lock' EXIT

# Loop until we detect files in the upload folder, then perform the
# rclone upload command and exit.
while true; do
    if [ -z "$(ls -A /path/to/your/upload/folder)" ]; then
        echo "Empty" && sleep 900
    else
        echo "Not Empty" && PUT YOUR FULL RCLONE UPLOAD COMMAND HERE
        exit
    fi
done

You will need to edit these things:

  • /path/to/your/upload/folder <---- This needs to point to your upload folder (on the HDD, not the mount). We will be checking whether it contains files, then uploading the files from there.
  • sleep 900 <<<--- How often to recheck the folder for files to upload, in seconds. (900 = 15 minutes). Adjust as you wish.
  • PUT YOUR FULL RCLONE UPLOAD COMMAND HERE <<----- self-explanatory hopefully :slight_smile:

!!! Please note I have not tested this script extensively; it may contain simple bugs and syntax errors. Please do test before just assuming it works perfectly !!!

I assume you know what an upload command looks like, but here is a super basic example for the sake of completeness:
rclone move /path/to/harddrive/uploadfolder MyGdriveRemote:/Mystuff/
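
A slightly more defensive variant (the paths are assumptions again) adds rclone's standard --min-age filter, so files that are still being written into the upload folder don't get picked up mid-write:

rclone move /path/to/harddrive/uploadfolder MyGdriveRemote:/Mystuff/ --min-age 5m --log-file /opt/upload.log --log-level INFO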

------------Last step-----------------
Set up a cron job to run this again and again on a timer, say every 30 minutes or whatever you want.
Once it starts, it will sit and monitor until it triggers, and then exit. That's why we need cron to re-launch it again some time later.
I've put in a basic file-lock to prevent multiple upload scripts from accumulating. If a script is already waiting and monitoring, cron launching it again will just cause it to fail and exit, so you don't end up with dozens of them :slight_smile:
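
In practice that means making the script executable and adding one crontab line, something like this (the script path is an assumption):

chmod +x /home/pi/scripts/uploadscript.sh
crontab -e

# then inside the crontab, run it every 30 minutes:
*/30 * * * * /home/pi/scripts/uploadscript.sh >> /home/pi/upload-cron.log 2>&1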


Wow, that's a lot. I'm going through it.

Don't get overwhelmed by my verbosity.
Most of it is just extra info. The most important bits here (usually highlighted) are not actually a lot of work, or very complex.

Take your time and digest it - then ask for clarification as needed :slight_smile:

I haven't said this before, but torrents are mostly added automatically by Sonarr or Radarr, and they have their own categories (so they download into the temp folders and then get moved to their respective folders inside the mount).

For the time being, I will set those folders to the hard disk and have the torrent client move them to the mount... I'll monitor the logs for this.
Thanks for your time :slight_smile:
