Not really a problem; I'm trying to automate a workflow and am seeking advice.
Run the command 'rclone version' and share the full output of the command.
rclone v1.57.0
os/version: ubuntu 20.04 (64 bit)
os/kernel: 5.11.0-1029-gcp (x86_64)
os/type: linux
os/arch: amd64
go/version: go1.17.2
go/linking: static
go/tags: none
Yes
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone mount --daemon remote: local
cp local/1.zip ..
cd ..
unzip 1.zip
mv 1/ toBeEncrypted/.
rclone -q copy toBeEncrypted/ remoteEncrypted:
The rclone config contents with secrets removed.
Paste config here
A log from the command with the -vv flag
Paste log here
First of all, thank you all for making Rclone available.
I am trying to encrypt some of my files, but I need to download and unzip them first. My computer has limited disk space, so the files can't all be downloaded at once. I wonder if anyone with Rclone experience can share a few tips to achieve the same goal as the workflow above.
Thank you.
Thank you for the quick reply.
To make sure my goal is clear, here is the loop I'm trying to achieve with my commands:
I'm trying to copy all the files from remote:personal (where the personal folder contains 1.zip, 2.zip, 3.zip, ... 1000.zip),
then unzip them,
then move all the unzipped folders to remoteEncrypted:.
The problem I'm facing: I can't download all the files locally, because my computer doesn't have enough hard disk space.
rclone mount --daemon remote: local
rclone mount --daemon remoteEncrypted: encrypted
cd encrypted
for a in ../local/*
do
unzip "$a"
done
(I've never tried two rclone mounts at the same time...)
You might want to ensure vfs caching is set low (or disabled) so it doesn't fill up your local disk; caching won't help much here, especially on reads.
mounting two points is ok.
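For example, something like this (just a sketch, not tested here; the exact numbers are up to you):

rclone mount --daemon remote: local --vfs-cache-mode off
rclone mount --daemon remoteEncrypted: encrypted --vfs-cache-mode writes --vfs-cache-max-size 1G --vfs-cache-max-age 1m

Note that --vfs-cache-max-size is a soft limit that is only enforced when the cache is polled, so keep an eye on disk usage the first time through.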
The code you shared here seems to require downloading all the files locally, then unzipping them one by one, then uploading to remoteEncrypted:.
But my hard disk can't hold all the files locally.
No it doesn't. It uses the "local" mount (so it reads directly from the remote: service) and writes to the encrypted (remoteEncrypted:) drive, with no local storage requirements (beyond whatever caching the mount does).
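To illustrate the data path (assuming both mounts from the earlier reply are active):

cd encrypted            # the current directory is now the remoteEncrypted: mount
unzip ../local/1.zip    # unzip reads the zip through the remote: mount
                        # and writes the extracted files into remoteEncrypted:,
                        # so only the mounts' caching touches the local disk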
my point is still valid, there are critical features missing when using --vfs-cache-mode=off.
that prevents me from trusting it. perhaps that is ok for you.
in addition to the three issues listed above, there is yet another critical feature missing:
if there is a problem with writing,
--- rclone might not notice the problem.
--- even if rclone notices the problem, it will not retry the upload.
so one way or another i would process one zip file at a time.
--- download the zip, unzip, rclone move the files to remoteEncrypted:, delete the zip (see the sketch below)
or
--- use the double mount,
for the dest mount, to minimize the size of the vfs file cache, use:
rclone mount remoteDisk:re local --daemon --vfs-cache-mode=writes --vfs-cache-max-age=10s --vfs-cache-poll-interval=10s
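here is a rough, untested sketch of the one-at-a-time approach. it assumes the zips sit at the top level of remote:personal, that there is enough local space for one zip plus its unzipped contents, and that your encrypted remote really is named remoteEncrypted:

mkdir -p work/unzipped
rclone lsf --files-only --include "*.zip" remote:personal | while read -r f; do
    rclone copy "remote:personal/$f" work/                               # download one zip
    unzip "work/$f" -d work/unzipped                                     # unzip it locally
    rclone move work/unzipped remoteEncrypted: --delete-empty-src-dirs   # upload, then delete the local copies
    rm -f "work/$f"                                                      # delete the local zip
done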
" so one way or another i would process one zip file at a time.
--- download the zip, unzip, rclone move files to remoteEncryted:, delete the zip"
That is my goal as well,
but I can't follow your logic/point with the code in your very first reply.
(I am very new to rclone and coding.)
If you have time, please share more complete code to illustrate your points. Thank you.
basically, i would use @sweh's code, but for the dest mount, use my command.
rclone mount remote: local --daemon
rclone mount remoteEncrypted: encrypted --daemon --vfs-cache-mode=writes --vfs-cache-max-age=10s --vfs-cache-poll-interval=10s
cd encrypted
for a in ../local/*
do
unzip "$a"
done
I understand we can't be cheap sometimes (the loop is going through a lot of zip files). Is there any solution in this particular situation? Thank you in advance.
as per the docs,
--- make sure you created your own gdrive client id (see the example config snippet below).
--- gdrive can be slow when there are a lot of small files.
so if that is the problem, there is no good solution.
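in rclone.conf, a gdrive remote using your own client id has these extra keys (the values below are placeholders; leave the rest of the section as rclone generated it):

[remote]
type = drive
client_id = your-client-id.apps.googleusercontent.com
client_secret = your-client-secret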
after a file has been unzipped into the local vfs file cache:
--- after 5 seconds, the upload will start.
--- once the upload has completed, the file will be purged after 10 seconds.
so if you stop processing the zip files, the cache should shrink in size quickly.
make sure that is the case.
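one quick way to check, assuming you have not changed --cache-dir from its default:

watch -n 5 du -sh ~/.cache/rclone/vfs    # the vfs cache size should drop back down once uploads finish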