I tried to run the exact same command as yours, only with a 2GB file.
It almost works, but it only uploads 2.3MB to the cloud, not 2GB.
Do you have an idea why that might be?
I also tried -tzip and it's the same.
if you can, copy/paste the text from the terminal; those tiny screen snippets are hard to read.
Sure:
Could it be that the rcat command is limited to a certain file size? If so, is there an alternative? I tried to use copy, but it didn't work because it needs both a source and a destination.
By the way, thank you very much for your help
PS C:\Users\T\Desktop> 7z a dummy -tgzip -so ./myfile.exe | rclone rcat Drive:zork/zork.gz -vv
2024/01/26 04:07:03 DEBUG : rclone: Version "v1.65.1" starting with parameters ["C:\\Users\\T\\Desktop\\rclone\\rclone-v1.65.1-windows-amd64\\rclone.exe" "rcat" "Drive:zork/zork.gz" "-vv"]
2024/01/26 04:07:03 DEBUG : Creating backend with remote "Drive:zork/"
2024/01/26 04:07:03 DEBUG : Using config file from "C:\\Users\\T\\AppData\\Roaming\\rclone\\rclone.conf"
2024/01/26 04:07:03 DEBUG : Google drive root 'zork': 'root_folder_id = 0AIB4tioEYEbDUk9PVA' - save this in the config to speed up startup
2024/01/26 04:07:04 DEBUG : fs cache: renaming cache item "Drive:zork/" to be canonical "Drive:zork"
2024/01/26 04:07:06 DEBUG : zork.gz: Sending chunk 0 length 2346942
2024/01/26 04:07:08 DEBUG : zork.gz: md5 = 6de16f61ddc5fce6eca4bff104574682 OK
2024/01/26 04:07:08 DEBUG : zork.gz: Size and md5 of src and dst objects identical
2024/01/26 04:07:08 DEBUG : 6 go routines active
PS C:\Users\T\Desktop> ls ./myfile.exe
Directory: C:\Users\T\Desktop
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 1/26/2024 1:21 AM 2000048576 myfile.exe
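(For context: rcat is the rclone command designed to read an object's contents from stdin, which is why copy, which needs a real source path, didn't fit here. A minimal sketch of the pattern, with "remote:" as a placeholder remote name:)

```shell
# Minimal rcat usage: the object body comes from stdin,
# so no local source file is needed.
echo "hello world" | rclone rcat remote:path/to/file.txt
```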
welcome
i just uploaded 2257770622 bytes
DEBUG : rclone: Version "v1.65.0" starting with parameters ["c:\\data\\rclone\\rclone.exe" "rcat" "wasabi01:zork/zork.gz" "--streaming-upload-cutoff=3G" "--multi-thread-streams=0" "--s3-chunk-size=256M" "--log-file=log.txt" "--log-level=DEBUG"]
DEBUG : Creating backend with remote "wasabi01:zork/"
DEBUG : Using config file from "c:\\data\\rclone\\rclone.conf"
DEBUG : wasabi01: detected overridden config - adding "{A6J6b}" suffix to name
DEBUG : Resolving service "s3" region "us-east-2"
DEBUG : fs cache: renaming cache item "wasabi01:zork/" to be canonical "wasabi01{A6J6b}:zork"
DEBUG : S3 bucket zork: File to upload is small (2257770622 bytes), uploading instead of streaming
INFO : S3 bucket zork: Bucket "zork" created with ACL ""
DEBUG : zork.gz: open chunk writer: started multipart upload: AnoDpk258ByNtHIbrRjegM-yBNIEkRqb0hmA0QmtKkWF1mka8-HhFpjfkdsFY8WWG3Sz2nI9pnsFN1Bn2cj1Ytu9ag523VpOx_rjWfJMbXtvfZECEFHzEA5IlMxHbFSy
DEBUG : zork.gz: multipart upload: starting chunk 0 size 256Mi offset 0/2.103Gi
DEBUG : zork.gz: multipart upload: starting chunk 1 size 256Mi offset 256Mi/2.103Gi
DEBUG : zork.gz: multipart upload: starting chunk 2 size 256Mi offset 512Mi/2.103Gi
DEBUG : zork.gz: multipart upload: starting chunk 3 size 256Mi offset 768Mi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 4 with 268435456 bytes and etag "a8bc62312455e797ac44c799d353b0b7"
DEBUG : zork.gz: multipart upload: starting chunk 4 size 256Mi offset 1Gi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 1 with 268435456 bytes and etag "fa40c4302e9a9bf00a0d779e398af599"
DEBUG : zork.gz: multipart upload: starting chunk 5 size 256Mi offset 1.250Gi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 3 with 268435456 bytes and etag "b489f98f07c0f93d2c7a8782dd3f0552"
DEBUG : zork.gz: multipart upload: starting chunk 6 size 256Mi offset 1.500Gi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 2 with 268435456 bytes and etag "8adeca1fd83db1799e3722396a76f633"
DEBUG : zork.gz: multipart upload: starting chunk 7 size 256Mi offset 1.750Gi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 6 with 268435456 bytes and etag "cc5cb099da90eb7de063ffe1c806ac20"
DEBUG : zork.gz: multipart upload: starting chunk 8 size 105.178Mi offset 2Gi/2.103Gi
DEBUG : zork.gz: multipart upload wrote chunk 5 with 268435456 bytes and etag "0b793b754da8b00fb0b88cc02a20b4bd"
DEBUG : zork.gz: multipart upload wrote chunk 7 with 268435456 bytes and etag "e97225e0a42ac8b31ffbd843e48b1225"
DEBUG : zork.gz: multipart upload wrote chunk 8 with 268435456 bytes and etag "acab48cf51a7e55ee8262e89e819d1e1"
DEBUG : zork.gz: multipart upload wrote chunk 9 with 110286974 bytes and etag "f0da0826bf494c03a053c33d22f6aff6"
DEBUG : zork.gz: multipart upload "AnoDpk258ByNtHIbrRjegM-yBNIEkRqb0hmA0QmtKkWF1mka8-HhFpjfkdsFY8WWG3Sz2nI9pnsFN1Bn2cj1Ytu9ag523VpOx_rjWfJMbXtvfZECEFHzEA5IlMxHbFSy" finished
DEBUG : zork.gz: Multipart upload Etag: a6ea7b9044572bede4902b0a7b7734f9-9 OK
DEBUG : zork.gz: md5 = 8f17466f02ee82ae93f5b216c8bd43b1 OK
INFO : zork.gz: Copied (new)
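Putting the two halves together, the working pipeline presumably looked something like this (the 7z side is inferred from the earlier attempt; the rclone flags are taken verbatim from the log above):

```shell
# Inferred full pipeline: 7z streams a gzip archive to stdout (-so),
# and rclone rcat streams it to the remote. The raised
# --streaming-upload-cutoff and larger --s3-chunk-size are what the
# debug log above shows being used for the 2GB stream.
7z a dummy -tgzip -so ./myfile.exe | rclone rcat wasabi01:zork/zork.gz --streaming-upload-cutoff=3G --multi-thread-streams=0 --s3-chunk-size=256M --log-file=log.txt --log-level=DEBUG
```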
Bro thanks a lot it works like a charm.
The problem was that, for the experiment, I used a dummy file created by fsutil.exe to test the script before running it for real. A file created by fsutil.exe compresses down to almost nothing even if it is 2GB (fsutil creates the file filled with zeros, which compress extremely well).
After I ran the command on real files everything worked fine.
Thank you very much!
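The behaviour is easy to reproduce: a file that is all zeros, which is what fsutil file createnew produces, compresses almost to nothing. A quick sketch with standard Unix tools:

```shell
# Create a 100 MB file of zeros, then gzip it; the result is tiny,
# which is exactly why the 2GB fsutil dummy uploaded as only ~2.3MB.
dd if=/dev/zero of=dummy.bin bs=1M count=100 2>/dev/null
gzip -kf dummy.bin
ls -l dummy.bin dummy.bin.gz
```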
I'm leaving this issue open so I can update it once I've confirmed the files uploaded to the remote properly, as expected.
Well done.
A little tip for anyone using this: run it from CMD; PowerShell doesn't work with it (likely because the Windows PowerShell pipeline treats piped data as text, which corrupts a binary stream).
EDIT 1: The only downside: you can't add a password to the archive formats that -so supports.
EDIT 2: Actually, I was able to use -p with -tzip, even though the 7z docs describe it as not possible:
7z a dummy -tzip -so "C:\Users\T\Desktop\files" -p123
I tested with a 31GB file, then compared it against the original file on my computer with SHA256. The hashes match:
C0250(0).MP4 - original file.
C0250.MP4 - file from the remote.
C:\Users\T>Certutil -hashfile C:\Users\T\Desktop\C0250(0).MP4 SHA256
SHA256 hash of C:\Users\T\Desktop\C0250(0).MP4:
d7bb3772ebb9b75248fe539da31d016fc7c0c5138d8d4dd681da5338035ea66d
CertUtil: -hashfile command completed successfully.
C:\Users\T>
C:\Users\T>Certutil -hashfile C:\Users\T\Desktop\C0250.MP4 SHA256
SHA256 hash of C:\Users\T\Desktop\C0250.MP4:
d7bb3772ebb9b75248fe539da31d016fc7c0c5138d8d4dd681da5338035ea66d
CertUtil: -hashfile command completed successfully.
Cheers, and thanks for the help.
Do you think there is a way I can use this method for updating files as well?
I tried something like
7z u ur1 "S:\zork\m.na.gz" -tzip -so "D:\try" -p123 | rclone rcat Gencrypt:zork/m.na.gz -P -vv
and also other way:
7z u ur1 -tzip -so "D:\try" -p123 | rclone rcat Gencrypt:zork/m.na.gz -P -vv
But it didn't work; it uploaded everything all over again.
If the remote file already exists, it will be overwritten.
to work around that, there are at least two ways.
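One possible workaround (a sketch, not necessarily one of the ways meant above): check whether the object already exists with rclone lsf, and only run the rcat pipeline when it doesn't. Remote and paths here are the ones from the thread:

```shell
# Skip the upload if the remote object is already there; lsf prints
# the object's name if it exists and nothing otherwise.
if ! rclone lsf Gencrypt:zork/m.na.gz | grep -q .; then
  7z a dummy -tzip -so "D:\try" -p123 | rclone rcat Gencrypt:zork/m.na.gz
fi
```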
I'm not sure I understand; does this mean I can add new files to the archive this way?
If the command I wrote won't work for an update as I presented it, can you give an example that will?
Thanks!
once the archive is in the cloud, you cannot update it in place.
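With this pipeline the remote object is just an opaque blob, so the only way to change it is to pull it down, update the archive locally, and push the whole thing back up. A sketch (the paths and the .zip archive name are illustrative; note that zip supports 7z's update mode while plain gzip does not):

```shell
# 1. Download the existing archive from the remote.
rclone copyto Gencrypt:zork/m.na.zip ./m.na.zip
# 2. Update it in place with 7z (u = add/replace changed files).
7z u ./m.na.zip "D:\try" -p123
# 3. Re-upload the full archive over the old one.
rclone copyto ./m.na.zip Gencrypt:zork/m.na.zip
```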
This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.