What is the problem you are having with rclone?
I need to upload around 5 TB of data from disk to Object Storage. To do this, I first compress the data into a tar file and then upload it with the rclone copy command.
Could you suggest whether rclone copy can do the same compression in flight, to avoid the doubled processing time and steps: one pass to tar the data and another for rclone copy of the tarred file? The data mostly consists of gz files already, so I'd like to stick to creating only a tar file; tar.gz gives no further compression benefit.
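As a side note (not part of the original question), one common way to skip the intermediate archive on disk is to pipe tar's output straight into rclone rcat, which uploads from stdin. This is only a sketch; the source path, remote name, and bucket below are placeholders standing in for the real values:

```shell
# Stream a tar archive directly to object storage without writing a local
# tar file first. </data>, <rclone-profile>, and <object-storage-bucket>
# are placeholders for the real path, remote, and bucket.
tar cf - /data | rclone rcat <rclone-profile>:<object-storage-bucket>/custom-test.tar
```

Note that rcat cannot know the final object size up front, so it buffers the stream in chunks and its retry and multipart behaviour differ from a plain rclone copy of a finished file; for a 5 TB stream it's worth testing this on a smaller dataset first.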
What is your rclone version (output from rclone version)?
# rclone version
rclone v1.53.1
- os/arch: linux/amd64
- go version: go1.15
Which OS are you using and how many bits (eg Windows 7, 64 bit)?
Oracle Linux 7.8 64 bit
Which cloud storage system are you using? (eg Google Drive)
Oracle Cloud Infrastructure Object Storage
The command you were trying to run (eg rclone copy /tmp remote:tmp)
rclone copy /tmp/custom-test.tar <rclone-profile>:<object-storage-bucket>
The rclone config contents with secrets removed.
[rclone-profile]
type = s3
provider = IBMCOS
access_key_id = <redacted>
secret_access_key = <redacted>
region = us-ashburn-1
endpoint = <redacted_namespace>.compat.objectstorage.us-ashburn-1.oraclecloud.com
location_constraint = us-ashburn-1
server_side_encryption = AES256
force_path_style = true
A log from the command with the -vv flag