Rclone eating up local SSD storage

What is the problem you are having with rclone?

Rclone is eating up my local SSD, even though I'm dragging and dropping files directly from my external HDD to the rclone Drive mount.

What is your rclone version (output from rclone version)

Rclone version 1.55.1

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Raspberry Pi OS Desktop

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

This is how my rclone.service looks for mounting Google Drive.

Description=RClone Service

ExecStart=/usr/bin/rclone mount gdrive: /home/pi/GDD \
# This is for allowing users other than the user running rclone access to the mount
--allow-other \
# Google Drive is a polling remote so this value can be set very high and changes are picked up via polling
--dir-cache-time 1000h \
# To log to file and where as well
--log-file /home/pi/rclonelog.log \
# I reduce the poll interval down to 15 seconds as this makes changes appear faster
--poll-interval 15s \
# This is setting the file permission on the mount so user and group have the same access and other can read
--umask 002 \
# Please set this to your own value below
--user-agent randomappname102 \
# This sets up the remote control daemon so you can issue rc commands locally
--rc \
# This is the default port it runs on
--rc-addr :5572 \
# no-auth is used as no one else uses my server and it is not shared
--rc-no-auth \
# The local disk used for caching
--cache-dir=/cache \
# This is used for caching files to local disk for streaming
--vfs-cache-mode full \
# This limits the cache size to the value below
--vfs-cache-max-size 300G \
# This adds a little buffer for read ahead
--vfs-read-ahead 256M \
# This limits the age in the cache if the size is reached and removes the oldest files first
--vfs-cache-max-age 1000h \
# This sets a per file bandwidth control and I limit this to a little bigger than my largest bitrate
--bwlimit-file 16M
ExecStop=/bin/fusermount -uz /home/pi/GDD
ExecStartPost=/usr/bin/rclone rc vfs/refresh recursive=true _async=true


The rclone config contents with secrets removed.

type = drive
client_id = xxxxxxxxxx.apps.googleusercontent.com
client_secret = xxxxxxxxx
scope = drive
token = {"access_token":"xxxxxxx","token_type":"Bearer","refresh_token":"xxxxxxx","expiry":"2021-02-01T01:35:09.192307807+02:00"}
team_drive = XXXXXXX
root_folder_id = 


Noob alert: this is my VERY first time working with Linux, rclone, etc. I had an old laptop lying around with a 500 GB SSD and an i5 that I wanted to use for Plex Media Server. I installed a completely fresh copy of Raspberry Pi OS Desktop on it, then installed rclone to mount my Google Drive locally so I can use it for my Plex library. I found some commands online for mounting Google Drive and using it for Plex, from @animosity22 / homescripts.

After mounting Google Drive with the rclone.service, I plugged in my external HDD and dragged and dropped everything from it to GDD (the Google Drive mount), around 1.8 TB of data. It started transferring, but after many hours I got an error (Input/output error), and then I saw that my laptop's 500 GB SSD was completely filled up, with no space available. I double-checked and can see that some of the files have been transferred to Google Drive, but for some hours now nothing has happened, and my SSD is still full. What am I doing wrong?

Thanks in advance.

hello and welcome to the forum,

if you drag/drop a file into /home/pi/GDD, rclone will copy the file into /cache and then upload it to the cloud.

rclone copy will copy from local to cloud and will not fill up the ssd.
if you need a gui, then use rclone browser to upload to the cloud.
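For reference, such a direct copy might look like this; the source path here is only a guess based on the commands later in the thread, so adjust it to your own drive:

```shell
# Copy straight from the external HDD to the Google Drive remote.
# This uploads directly and never writes through the mount, so the
# VFS cache on the SSD is not used at all.
rclone copy /media/pi/HDD/AllMovies gdrive:AllMovies --progress --bwlimit 16M
```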


Thank you so much for your quick answer.

Is it safe to manually delete whatever is in /cache and then run the rclone copy command? Or how should I do it?

i would

  • kill the rclone mount
  • delete the files inside /cache
  • use rclone copy or use this
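Assuming the --cache-dir=/cache from the service file above, those steps can be sketched as:

```shell
# 1. Stop the mount so rclone is not mid-upload while the cache is removed.
sudo systemctl stop rclone.service

# 2. The VFS cache lives under <cache-dir>/vfs/<remote>/ (plus vfsMeta for
#    metadata); check its size, then delete the cached copies.
du -sh /cache/vfs /cache/vfsMeta
sudo rm -rf /cache/vfs/gdrive /cache/vfsMeta/gdrive

# 3. Then upload with rclone copy instead of through the mount.
```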

So sorry for noob questions but can I run these commands:

  1. Kill the rclone mount = systemctl stop rclone.service
  2. Delete files in /cache = use the file browser to just go into /cache and delete everything in there
  3. Copy = run the command rclone copy /media/pi/HDD:AllMovies /home/pi/GDD:AllMovies

Is this correct?

  1. yes
  2. yes
  3. rclone copy /media/pi/HDD:AllMovies GDD:AllMovies --dry-run --progress
    best to test with --dry-run and read the output. if it looks good, then remove --dry-run

Killed the rclone mount successfully and deleted all files in /cache. But when running the command rclone copy /media/pi/HDD/AllMovies gdrive --dry-run --progress, it shows progress, the process is done within seconds, and then it says "file xxxx: Skipped copy as --dry-run is set (size xx.xxM)". Am I doing something wrong?

take a few minutes and read the docs.
Documentation and this

and your command will copy files to a local folder named gdrive.
a remote needs a colon (:) character, otherwise rclone thinks you want to copy to a local folder
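That colon rule can be illustrated with a tiny shell sketch (is_remote is a hypothetical helper, not part of rclone; it only mimics the rule of thumb):

```shell
# Hypothetical helper mimicking rclone's rule of thumb: a "name:" prefix
# marks a configured remote; anything else is treated as a local path.
is_remote() {
  case "$1" in
    [A-Za-z0-9_]*:*) echo remote ;;
    *)               echo local  ;;
  esac
}

is_remote "gdrive:AllMovies"        # remote gdrive, path AllMovies
is_remote "gdrive"                  # a local folder named "gdrive"
is_remote "/media/pi/HDD/AllMovies" # a local path
```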

Just understood it more! Thank you so much for your help!

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.