Hello all, I've been messing around with rclone on my own, and in my research I have found many scripts and options for what people claim are the best ways of integrating rclone/Google Drive/Plex. I have even seen a few scripts stating that the rclone mount option is much slower than the rclone copy/move options.
With all that said, I figured I would come to the place where the application was made to try to solve this once and for all.
What is the best method for integrating rclone for both streaming with Plex and uploading downloads to Google Drive?
Here are the scripts I'm currently running.
crontab -e
#load googledrive
@reboot rclone mount gcrypt: ~/mnt/gdrive --allow-other --allow-non-empty --cache-db-purge --buffer-size 256M --dir-cache-time 72h --drive-chunk-size 32M --timeout 1h --vf$
#start radarr sonarr nzbget
@reboot bash /media/start.py
@reboot mono /opt/Radarr/Radarr.exe
@reboot mono /opt/NzbDrone/NzbDrone.exe
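One side note on the crontab above: `@reboot` jobs start in parallel, so Radarr/Sonarr can launch before the rclone mount is actually up and briefly see an empty library path. Here is a minimal sketch of a wrapper that polls for the mount first; the script name, paths, and timings are all assumptions based on the crontab above, not a definitive fix.

```shell
#!/bin/sh
# Hypothetical start wrapper (start-apps.sh): wait for the rclone mount
# to appear before launching anything that scans it.

# wait_for_mount PATH [TRIES] [DELAY] -> returns 0 once PATH is a mountpoint
wait_for_mount() {
    path=$1; tries=${2:-60}; delay=${3:-5}
    i=0
    while [ "$i" -lt "$tries" ]; do
        mountpoint -q "$path" && return 0
        i=$((i + 1))
        sleep "$delay"
    done
    return 1
}

# In the real script you would then do something like:
#   wait_for_mount "$HOME/mnt/gdrive" && {
#       mono /opt/Radarr/Radarr.exe &
#       mono /opt/NzbDrone/NzbDrone.exe &
#   }
```

The crontab would then run this one wrapper at `@reboot` instead of three independent jobs.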
I noticed my upload speed through the mount was not good (about 9-10 MB/s), so I went and found a script to run once Radarr and Sonarr were done renaming files, as shown below.
#!/bin/sh
echo "Uploading to GDrive"
rclone move /root/downloads/completed/TVShows gdrive:downloads/complete/TVShows/ --drive-chunk-size=128M --checkers 10 --transfers 8 --tpslimit 10 --max-transfer 750G --fast-list
exit
This gets me to about 70 MB/s upload.
I feel there must be a better way to do this, as Sonarr's triggering for post-processing is not as good as Radarr's, and these scripts do not always run as intended (which can happen when there is a very large library to look through). The other issue with doing it this way is that if I scan for missing content, it looks at the folder path I gave it for renamed files; since that content has already been moved to Google Drive, it gets pulled for download again.
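For what it's worth, one pattern that addresses the timing issue is to hook the upload into Sonarr's "Connect → Custom Script" feature, so each file is moved right after import instead of sweeping the whole completed folder later. A sketch only: the environment variable names below (`sonarr_eventtype`, `sonarr_episodefile_path`) are assumptions; verify them against your Sonarr version's custom-script documentation. The remote path and chunk-size flag are copied from the move script above.

```shell
#!/bin/sh
# Hypothetical per-import upload hook for Sonarr's Custom Script connection.
# Env var names are assumptions -- check your Sonarr version's docs.

# Decide whether this invocation should trigger an upload:
# $1 = event type reported by Sonarr, $2 = imported file path.
should_upload() {
    [ "$1" = "Download" ] && [ -n "$2" ] && [ -e "$2" ]
}

if should_upload "${sonarr_eventtype:-}" "${sonarr_episodefile_path:-}"; then
    # Remote and flags copied from the move script above.
    rclone move "$sonarr_episodefile_path" gdrive:downloads/complete/TVShows/ \
        --drive-chunk-size=128M
fi
```

This only uploads the one file Sonarr just imported, so a "missing content" scan never races a bulk move of the whole folder.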
Is there a better way of doing this?