Best way to compress files on remote

I'm using rclone for backup purposes. Due to a misconfiguration on my side, I uploaded many multi-gigabyte files without compressing them first (these are highly compressible - normally I gzip them before upload). As a result, I quickly exhausted my storage quota.

Basically I need to download, compress and re-upload these files since my remote doesn't offer a way to do this server-side. Downloading them all is not possible since I don't have the disk space, so I have to process these one by one.

Is there any way other than writing a script that loops over rclone lsf output and runs a download-compress-reupload cycle for each file, preferably one that uses no disk space? (That solution needs disk space for both the downloaded and the compressed file.)

rclone v1.45
- os/arch: linux/386 (CentOS 6)
- go version: go1.11.2
- remote: Google Drive


hello and welcome to the forum,

yes, you can write a script and use little to no disk space.

when you posted, you should have been asked some questions.
can you answer them?

I forgot to put my rclone information, sorry. It is on the first message now.

Meanwhile I found out about rclone cat and rcat: in an rclone lsf loop, rclone cat | gzip | rclone rcat uses no disk space, if I understand correctly.
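To convince myself the pipeline really never touches disk with an uncompressed copy, I tried the same shape locally. This is just an illustration with local stand-ins (plain cat and a redirect in place of rclone cat and rclone rcat), not the real rclone commands:

```shell
#!/bin/sh
# Demonstrates the streaming idea: data flows producer | gzip | consumer,
# so no uncompressed intermediate file is ever written locally.
set -e
tmp=$(mktemp -d)
# Create a highly compressible 1 MB "remote" file.
yes "some repetitive dump data" | head -c 1048576 > "$tmp/sample.dpdmp"
# Stream-compress: cat stands in for `rclone cat`, the redirect for `rclone rcat`.
cat "$tmp/sample.dpdmp" | gzip > "$tmp/sample.dpdmp.gz"
# Verify the round trip decompresses to identical data.
cat "$tmp/sample.dpdmp.gz" | gunzip | cmp - "$tmp/sample.dpdmp" && echo "round trip OK"
rm -rf "$tmp"
```

With rclone in place of the stand-ins, only gzip's small in-memory buffers sit between download and upload.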

Is there any other way or this is the way to go?

yes, that could work

That version is somewhat old so I'd update it.

You can mount it and run commands on the mount.

I tested the mount with some small files and it worked, but it might not work for larger files. The mount would be the easier route if it handles those larger files.

the most reliable option is to download one file at a time, compress it and upload it.
that would not use much disk space.
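a sketch of that one-file-at-a-time loop, using local directories as stand-ins for the remote so it runs anywhere (with rclone you would replace the cp calls with `rclone copy remote:$f .` and `rclone copy $f.gz remote:`):

```shell
#!/bin/sh
# One file at a time: only one file's worth of scratch space is ever in use.
# Local directories stand in for remote: here; the cp calls are placeholders
# for the corresponding rclone copy commands.
set -e
remote=$(mktemp -d)   # stand-in for remote:
scratch=$(mktemp -d)  # local scratch directory
printf 'dump one\n' > "$remote/a.dpdmp"
printf 'dump two\n' > "$remote/b.dpdmp"
for f in "$remote"/*.dpdmp; do
    name=$(basename "$f")
    cp "$f" "$scratch/$name"            # "download" one file
    gzip "$scratch/$name"               # compress in place; gzip removes the original
    cp "$scratch/$name.gz" "$remote/"   # "re-upload" the compressed copy
    rm -f "$scratch/$name.gz"           # free the scratch space before the next file
done
rm -rf "$remote" "$scratch"
```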

if you are really low on disk space,
another option, depending on the file size, is to use a ram disk.
i often do that.
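for example, assuming a linux host where /dev/shm is already mounted as tmpfs (the default on most distros), you can use it as RAM-backed scratch space without root; a dedicated ram disk would instead be created with `mount -t tmpfs -o size=4g tmpfs /mnt/ram` (needs root):

```shell
#!/bin/sh
# Use RAM-backed /dev/shm as scratch space, assuming it is mounted as tmpfs.
# Files written here live in memory, not on disk.
set -e
scratch=$(mktemp -d /dev/shm/rclone-scratch.XXXXXX)
# Simulate downloading a file into RAM, then compressing it there:
printf 'highly compressible dump data\n' > "$scratch/file.dpdmp"
gzip "$scratch/file.dpdmp"
gzip -t "$scratch/file.dpdmp.gz" && echo "compressed in RAM OK"
rm -rf "$scratch"
```

note the file must fit in RAM, so this only suits files smaller than available memory.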

i know that other users might create a google compute virtual machine and run rclone on that.
the vm costs little to nothing.

That will use the least disk space and is probably what I'd do.

None of the backends can compress a file in-place so you'll have to download, compress and re-upload.


Thanks for all the replies. I went with the cat | rcat route. I'll leave the script here in case anyone needs it in the future.

# gzips all *.dpdmp files on the remote
rclone lsf remote: --files-only --include "*.dpdmp" | while IFS= read -r line ; do
   echo "$line"
   rclone cat "remote:$line" | gzip | rclone rcat "remote:$line.gz" -P
   echo "===="
done
nice solution,

you can create a wiki post for it
or with your permission, i can create a wiki post for you.
you can comment on it and i can tweak it as needed.


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.