I have been using rclone to back up Google Drive data to AWS S3 cloud storage. I have multiple Google Drive accounts whose backups go to AWS S3, and each of those accounts contains a large number of documents.
I want to compress those documents into a single zip file and then copy that zip to S3.
Is there any way to achieve this?
I have referred to the link below, but it doesn't have complete steps to accomplish the task.
Any suggestion would be appreciated.
sorry, i know nothing about the compress remotes,
it is experimental and imho, should not be used for backups.
one option is to rent a free/cheap virtual machine from aws/google/etc. and run rclone on that.
on that machine you can run 7zip or some other compression tool.
We are already running rclone on a virtual machine hosted on AWS. From that machine we are syncing Google Drive data to AWS cloud storage.
Could you share a link explaining how to run 7zip or another compression tool on the virtual machine that runs rclone?
there are a number of topics you can search about using
rclone rcat + tar
for example, https://forum.rclone.org/t/rclone-copy-entire-drive/25155/8
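The rcat + tar idea streams the archive straight to the remote without staging the whole file on disk first. A minimal python sketch of that pipeline, assuming `tar` and `rclone` are on the PATH (the directory and remote path `s3:my-bucket/backup.tar.gz` are just placeholders):

```python
import subprocess

def build_tar_rcat_pipeline(src_dir, remote_path):
    """Return the argv lists for `tar -czf - ... | rclone rcat <remote>`."""
    tar_cmd = ["tar", "-czf", "-", "-C", src_dir, "."]  # write a gzipped tar to stdout
    rcat_cmd = ["rclone", "rcat", remote_path]          # read stdin, write it to the remote
    return tar_cmd, rcat_cmd

def stream_tar_to_remote(src_dir, remote_path):
    tar_cmd, rcat_cmd = build_tar_rcat_pipeline(src_dir, remote_path)
    tar = subprocess.Popen(tar_cmd, stdout=subprocess.PIPE)
    try:
        subprocess.run(rcat_cmd, stdin=tar.stdout, check=True)
    finally:
        tar.stdout.close()
        tar.wait()
```

Note this produces a .tar.gz rather than a .zip, since tar streams well and zip generally wants to seek.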
i have a python script that will 7zip some local files and
rclone copy to cloud.
basically, run 7zip on a set of files and then
rclone copy that .7z to cloud.
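a rough outline of such a script, assuming `7z` and `rclone` are on the PATH (the source directory, archive name, and remote are placeholders, not the poster's actual script):

```python
import subprocess

def build_commands(src_dir, archive_path, remote_dest):
    """Return the argv lists for the 7zip step and the rclone copy step."""
    zip_cmd = ["7z", "a", archive_path, src_dir]          # add src_dir into the archive
    copy_cmd = ["rclone", "copy", archive_path, remote_dest]
    return zip_cmd, copy_cmd

def archive_and_upload(src_dir, archive_path, remote_dest):
    for cmd in build_commands(src_dir, archive_path, remote_dest):
        subprocess.run(cmd, check=True)
```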
I have full access to Google Drive using rclone. Can't I zip the files within Google Drive itself using some rclone command, and then sync the zip to the S3 bucket?
not that i know of, as the rclone compression remote is experimental status.
Even if the compression remote weren't experimental, it wouldn't work. Compression != zip. Zip is an archive container format that can optionally (but often) compress whereas the compression backend does file-by-file compression (I suspect you know this but for others' benefit)
@KingsPP, there are no rclone commands to zip the contents of a remote. Google Drive may offer it in their UI, but I don't know. And given how rclone works on a file, as opposed to block, basis, there is no easy way to build a zip file directly. I think your best bet is to do it locally and then upload.
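That local workflow, sketched under the same assumptions as above (pull to a staging directory, zip locally, push the single archive; remote names like `gdrive:Documents` and `s3:my-bucket` are placeholders):

```python
import subprocess

def plan_backup(drive_remote, staging_dir, archive_path, s3_remote):
    """Return the command sequence for one local-zip backup run."""
    return [
        ["rclone", "copy", drive_remote, staging_dir],    # 1. pull files from Google Drive
        ["7z", "a", "-tzip", archive_path, staging_dir],  # 2. zip locally (7z can write .zip)
        ["rclone", "copy", archive_path, s3_remote],      # 3. push the single archive to S3
    ]

def run_backup(drive_remote, staging_dir, archive_path, s3_remote):
    for cmd in plan_backup(drive_remote, staging_dir, archive_path, s3_remote):
        subprocess.run(cmd, check=True)
```

Repeat per Google Drive account, with one staging directory and archive name per account.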
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.