What is the problem you are having with rclone?
Most of my media files are not playing, and I am seeing the error below in the rclone log:

```
vfs cache: failed to download: vfs reader: failed to write to cache file: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
```
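A quick way to see how widespread this is, assuming the log path from the mount command below (`/mnt/secondary/logs/rclone.log`), is to count the `downloadQuotaExceeded` lines. The sample log written here is only so the snippet is self-contained:

```shell
# Write a tiny sample log so the snippet is self-contained; on the real
# system you would point grep at /mnt/secondary/logs/rclone.log instead.
cat > /tmp/rclone-sample.log <<'EOF'
2024/01/01 12:00:00 INFO  : show.mkv: vfs cache: downloading
2024/01/01 12:00:01 ERROR : show.mkv: vfs cache: failed to download: vfs reader: failed to write to cache file: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
EOF

# Count how many downloads failed on the quota error.
grep -c 'downloadQuotaExceeded' /tmp/rclone-sample.log
```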
Run the command 'rclone version' and share the full output of the command.
```
rclone v1.62.2
- os/version: ubuntu 22.04 (64 bit)
- os/kernel: 5.19.0-38-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.20.2
- go/linking: static
- go/tags: none
```
Are you on the latest version of rclone? You can validate by checking the version listed here: Rclone downloads
-->Yes
Which cloud storage system are you using? (eg Google Drive)
-->Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
```
[Unit]
Description=Rclone VFS Mount
After=network-online.target

[Service]
User=azra3l
Group=azra3l
Type=notify
ExecStartPre=/bin/sleep 10
ExecStart=/usr/bin/rclone mount \
  --user-agent='Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.131 Safari/537.36' \
  --config=/home/azra3l/.config/rclone/rclone.conf \
  --allow-other \
  --rc \
  --rc-no-auth \
  --rc-addr=localhost:5572 \
  --drive-skip-gdocs \
  --vfs-cache-mode=full \
  --vfs-read-chunk-size=128M \
  --vfs-cache-max-age=48h \
  --vfs-cache-max-size=64G \
  --vfs-read-chunk-size-limit=2048M \
  --buffer-size=64M \
  --poll-interval=15s \
  --dir-cache-time=1000h \
  --cache-dir=/mnt/secondary/cache \
  --timeout=10m \
  --drive-chunk-size=64M \
  --drive-pacer-min-sleep=10ms \
  --drive-pacer-burst=1000 \
  --umask=002 \
  --log-level INFO \
  --log-file=/mnt/secondary/logs/rclone.log \
  -v \
  google: /mnt/remote
ExecStartPost=/usr/bin/rclone rc vfs/refresh recursive=true --url http://localhost:5572 _async=true
ExecStop=/bin/fusermount -uz /mnt/remote
Restart=on-abort
RestartSec=5
StartLimitInterval=60s
StartLimitBurst=3
```
The rclone config contents with secrets removed.
```
[TV]
type = drive
scope = drive
service_account_file = /opt/sa/all/26.json
team_drive =

[Movies]
type = drive
scope = drive
service_account_file = /opt/sa/all/120.json
team_drive =

[Music]
type = drive
team_drive =
scope = drive
service_account_file = /opt/sa/all/220.json

[Movies4K]
type = drive
scope = drive
service_account_file = /opt/sa/all/27.json
team_drive =

[TV4K]
type = drive
service_account_file = /opt/sa/all/121.json
team_drive =
scope = drive

[Books]
type = drive
scope = drive
service_account_file = /opt/sa/all/221.json
team_drive =
```
A log from the command with the -vv flag

https://privatebin.tardisonline.in/?89fbfc64fff44b11#Eh8Smw6bwdJthC6JCGLj5y3K79bwFZnFXtwoFeuVopwj
I use SARotate to rotate between 300 service-account JSON files to avoid the 700 GB upload and download quota issue.
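For context, the rotation boils down to a round-robin over the SA files. Below is a minimal illustrative sketch, not SARotate's actual code; the paths are made up for the demo, and the commented `rclone rc` call at the end is only one possible way a remote could be repointed:

```shell
# Illustrative round-robin over service-account JSON files (demo paths only).
SA_DIR=/tmp/sa-demo
STATE="$SA_DIR/.index"
mkdir -p "$SA_DIR"
touch "$SA_DIR"/26.json "$SA_DIR"/120.json "$SA_DIR"/220.json

files=$(ls "$SA_DIR"/*.json | sort)   # stable ordering of the SA pool
count=$(echo "$files" | wc -l)
idx=$(cat "$STATE" 2>/dev/null || echo 0)
next=$(( (idx + 1) % count ))
echo "$next" > "$STATE"               # persist position for the next rotation

sa=$(echo "$files" | sed -n "$((next + 1))p")
echo "next service account: $sa"

# A live rotation would then repoint the remote, e.g. via the drive backend's
# "set" command over the rc API (parameters here are illustrative):
# rclone rc backend/command command=set fs=TV: -o service_account_file="$sa" --url http://localhost:5572
```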