What is the problem you are having with rclone?
Rclone is eating up my local SSD, even though I'm dragging and dropping files directly from my external HDD to the rclone Google Drive mount.
What is your rclone version (output from rclone version)
Rclone version 1.55.1
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Raspberry Pi OS Desktop
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
This is how my rclone.service looks for mounting Google Drive.
[Unit]
Description=RClone Service
Wants=network-online.target
After=network-online.target
[Service]
Type=notify
Environment=RCLONE_CONFIG=/opt/rclone/rclone.conf
KillMode=none
RestartSec=5
ExecStart=/usr/bin/rclone mount gdrive: /home/pi/GDD \
# This is for allowing users other than the user running rclone to access the mount
--allow-other \
# Google Drive is a polling remote so this value can be set very high
--dir-cache-time 1000h \
# To log to file and where as well
--log-file /home/pi/rclonelog.log \
# I reduce the poll interval down to 15 seconds as this makes changes appear faster
--poll-interval 15s \
# This is setting the file permission on the mount so user and group have the same access and other can read
--umask 002 \
# Please set this to your own value below
--user-agent randomappname102 \
# This sets up the remote control daemon so you can issue rc commands locally
--rc \
# This is the default port it runs on
--rc-addr :5572 \
# no-auth is used as no one else uses my server and it is not exposed to anyone else
--rc-no-auth \
# The local disk used for caching
--cache-dir=/cache \
# This is used for caching files to local disk for streaming
--vfs-cache-mode full \
# This limits the cache size to the value below
--vfs-cache-max-size 300G \
# This adds a little buffer for read ahead
--vfs-read-ahead 256M \
# This limits the age in the cache if the size is reached
--vfs-cache-max-age 1000h \
# This sets a per file bandwidth control and I limit this to a bit above my highest playback bitrate
--bwlimit-file 16M
ExecStop=/bin/fusermount -uz /GD
ExecStartPost=/usr/bin/rclone rc vfs/refresh recursive=true _async=true
Restart=on-failure
User=root
Group=root
[Install]
WantedBy=multi-user.target
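For what it's worth, with --vfs-cache-mode full every file written through the mount is first staged in --cache-dir on local disk, so measuring that directory shows whether the VFS cache is what is filling the SSD. A minimal sketch (the /cache path is taken from the unit above; cache_usage is just a hypothetical helper name):

```shell
#!/bin/sh
# cache_usage: print the total on-disk size of a directory, or "missing"
# if it does not exist. Point it at rclone's --cache-dir to see how much
# of the SSD the VFS cache is currently holding.
cache_usage() {
  dir="${1:-/cache}"
  if [ -d "$dir" ]; then
    du -sh "$dir" | awk '{print $1}'
  else
    echo "missing"
  fi
}

# Path from --cache-dir in the unit above
cache_usage /cache
```

If that number is anywhere near the 300G in --vfs-cache-max-size, the cache limit is simply larger than the free space on a 500 GB SSD.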
The rclone config contents with secrets removed.
[gdrive]
type = drive
client_id = xxxxxxxxxx.apps.googleusercontent.com
client_secret = xxxxxxxxx
scope = drive
token = {"access_token":"xxxxxxx","token_type":"Bearer","refresh_token":"xxxxxxx","expiry":"2021-02-01T01:35:09.192307807+02:00"}
team_drive = XXXXXXX
root_folder_id =
Comments
Noob alert: this is my VERY first time working with Linux, rclone, etc. I had an old laptop lying around with a 500 GB SSD and an i5 that I wanted to use for Plex Media Server. I installed a completely fresh copy of Raspberry Pi OS Desktop on it and installed rclone to mount my Google Drive locally, so I can use that for my Plex library. I found some commands online for mounting Google Drive and using it for Plex, from @animosity22 / homescripts.

After mounting Google Drive using the rclone.service above, I plugged in my external HDD and dragged and dropped everything from it to GDD (the Google Drive mount), around 1.8 TB of data. It started transferring, but after many hours I got an error (Input/output error), and I can see that my laptop's 500 GB SSD is completely filled up, with no space left on it. I double-checked and some of the files have been transferred to Google Drive, but for some hours now nothing has happened, and my SSD is still full. What am I doing wrong?
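As a point of comparison, a bulk upload like this is often done with rclone copy straight from the disk to the remote, which bypasses the mount and its VFS cache entirely. A sketch, assuming the external HDD is mounted at /media/pi/ExternalHDD (that path is a placeholder for wherever yours appears):

```shell
# Copy from the external HDD directly to the Google Drive remote.
# Nothing is staged in --cache-dir, so the SSD is not involved.
rclone copy /media/pi/ExternalHDD gdrive: \
  --config /opt/rclone/rclone.conf \
  --progress \
  --transfers 4
```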
Thanks in advance.