Starr apps to GDrive

What is the problem you are having with rclone?

The problem isn't with rclone per se. I want to make sure my mount command is appropriate for this use case. A box running Synology (XPEnology) hosts the Starr apps in Docker. Files are downloaded to spinning disks and then uploaded to GDrive. I've been told it's not wise to write directly to the mount, and to instead use `rclone move` in conjunction with something like mergerfs. However, this mount command has been working fine. Is this mount command correct for simply uploading to the GDrive mount?

Run the command 'rclone version' and share the full output of the command.

rclone v1.62.2

  • os/version: unknown
  • os/kernel: 4.4.180+ (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.20.2
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

GDrive Enterprise

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone mount gdrive: /volume1/data/media/gdrive \
  --config=/var/services/homes/bulfinch/.config/rclone/rclone.conf \
  --cache-dir=/volume1/data/.cache \
  --log-file=/volume1/docker/rclone/logs/rclone.log \
  --allow-other \
  --log-level=NOTICE \
  --dir-cache-time=48h \
  --cache-info-age=48h \
  --buffer-size=128M \
  --poll-interval=10s \
  --umask=002 \
  --drive-pacer-min-sleep=10ms \
  --drive-pacer-burst=1000 \
  --vfs-cache-mode=full \
  --vfs-cache-max-age=6h \
  --vfs-cache-poll-interval=5m \
  --rc --rc-web-gui --rc-addr --rc-user admin --rc-pass admin --rc-web-gui-no-open-browser

The rclone config contents with secrets removed.

type = drive
client_id = 83630
client_secret = 6
scope = drive
token = {"access_token"
team_drive = 0A
root_folder_id = 
service_account_file = 

A log from the command with the -vv flag

Paste log here

It's fine to write to the mount, so that sounds like some odd advice.

To make sure I'm understanding correctly: I was pointed to this site, with the relevant recommendations being:

  • Don't download into your Gdrive - Download to a local disk and move the data later with Rclone
  • Don't import to your Gdrive - Set up a merged local cache disk and move the data later with Rclone
  • Do all large writes locally
  • Move to cloud on a schedule - Easily scriptable
  • Absolutely do not write (large files) directly to the rclone mount
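
The "download locally, move on a schedule" pattern above is easy to script. Below is a minimal sketch of what such a scheduled sweep could look like; the local path, remote folder name, and log path are assumptions based on the mount command earlier in this thread, not something from the original advice, so adjust them to your own layout.

```shell
#!/bin/sh
# Hypothetical scheduled upload sweep: Starr apps download/import to a
# local directory, and this script periodically moves finished files to
# GDrive with rclone. Paths and remote names below are illustrative.

rclone move /volume1/data/media/local gdrive:media \
  --config=/var/services/homes/bulfinch/.config/rclone/rclone.conf \
  --min-age 15m \
  --delete-empty-src-dirs \
  --log-file=/volume1/docker/rclone/logs/rclone-move.log \
  --log-level INFO
```

`--min-age 15m` skips files modified in the last 15 minutes, so partially-downloaded or still-importing files aren't uploaded mid-write. You could then run the script from cron, for example every 30 minutes: `*/30 * * * * /volume1/docker/rclone/move.sh`.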

I'll add an issue I'm having: the Starr apps lock the SQL database during their processing, so files get uploaded and are somehow lost to Sonarr/Radarr, leaving an error along the lines of "no files available to import". I then have to manually tell the app that the file is in fact on the remote. Hence the suggestion that my mount was not the best way to do this.

That all looks dated and wrong.

You’d want to recreate the issue and share a debug log.

Ok thanks. That's what I thought.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.