I tried to use SSE-C encryption with an iDrive e2 bucket, with no luck: when I pass the SSE-C key and algorithm in the config file I always get an AccessDenied (status code 403) error. If I don't pass them, the transfers fail with "md5 hash differ" errors.
Run the command 'rclone version' and share the full output of the command.
rclone v1.59.2
os/version: darwin 12.6 (64 bit)
os/kernel: 21.6.0 (arm64)
os/type: darwin
os/arch: arm64
go/version: go1.18.6
go/linking: dynamic
go/tags: cmount
Which cloud storage system are you using? (eg Google Drive)
iDrive e2 (S3 compatible)
The command you were trying to run (eg rclone copy /tmp remote:tmp)
2022/10/09 13:03:29 DEBUG : --max-age 99y to 1923-11-03 11:03:29.593123 -0700 MST m=-3122063999.985728707
2022/10/09 13:03:29 DEBUG : rclone: Version "v1.59.2" starting with parameters ["./rclone-arm64" "sync" ".." "e2-ba-despues:/ba-despues-2022/" "--config" "./rclone.config" "-P" "--fast-list" "--max-age" "99y" "--exclude-from" "./exclusiones.txt" "--checkers=24" "--stats=1s" "--log-file=/Users/rodrigo/Dropbox/trabajo/Brujazul/2022/Despues/99_DES_UTILERIAS_NUBE/logs/backup_completo-2022-10-09T130329.log" "-vv" "--ask-password=false"]
2022/10/09 13:03:29 DEBUG : Creating backend with remote ".."
2022/10/09 13:03:29 DEBUG : Using config file from "/Users/rodrigo/Dropbox/trabajo/Brujazul/2022/Despues/99_DES_UTILERIAS_NUBE/rclone.config"
2022/10/09 13:03:29 DEBUG : fs cache: renaming cache item ".." to be canonical "/Users/rodrigo/Dropbox/trabajo/Brujazul/2022/Despues"
2022/10/09 13:03:29 DEBUG : Creating backend with remote "e2-ba-despues:/ba-despues-2022/"
2022/10/09 13:03:29 DEBUG : fs cache: renaming cache item "e2-ba-despues:/ba-despues-2022/" to be canonical "e2-ba-despues:ba-despues-2022"
2022/10/09 13:03:29 DEBUG : .DS_Store: Excluded
2022/10/09 13:03:29 DEBUG : 99_DES_UTILERIAS_NUBE: Excluded
2022/10/09 13:03:29 DEBUG : 01_DES_MATERIAL/.DS_Store: Excluded
2022/10/09 13:03:29 DEBUG : 01_DES_MATERIAL/TEST/.DS_Store: Excluded
2022/10/09 13:03:29 DEBUG : S3 bucket ba-despues-2022: Waiting for checks to finish
2022/10/09 13:03:29 DEBUG : S3 bucket ba-despues-2022: Waiting for transfers to finish
2022/10/09 13:03:29 ERROR : 01_DES_MATERIAL/TEST/2022-09-11 03.30.07.mp4: Failed to copy: AccessDenied: Access Denied.
status code: 403, request id: 171C78A46F0BCE7D, host id:
2022/10/09 13:03:30 ERROR : 01_DES_MATERIAL/TEST/2022-09-16 15.07.13.mov: Failed to copy: AccessDenied: Access Denied.
status code: 403, request id: 171C78A478DB5ED4, host id:
2022/10/09 13:03:30 ERROR : 01_DES_MATERIAL/TEST/2022-09-19 13.19.03.mp4: Failed to copy: AccessDenied: Access Denied.
Using the config without sse_customer_algorithm and sse_customer_key (i.e., using the provider's defaults):
2022/10/09 13:07:27 DEBUG : --max-age 99y to 1923-11-03 11:07:27.559262 -0700 MST m=-3122063999.987137749
2022/10/09 13:07:27 DEBUG : rclone: Version "v1.59.2" starting with parameters ["./rclone-arm64" "sync" "../" "e2-ba-despues:/ba-despues-2022/" "--config" "./rclone.config" "-P" "--fast-list" "--max-age" "99y" "--exclude-from" "./exclusiones.txt" "--checkers=24" "--stats=1s" "--log-file=/Users/rodrigo/Dropbox/Trabajo/Brujazul/2022/Despues/99_DES_UTILERIAS_NUBE/logs/backup_completo-2022-10-09T130727.log" "-vv" "--ask-password=false"]
2022/10/09 13:07:27 DEBUG : Creating backend with remote "../"
2022/10/09 13:07:27 DEBUG : Using config file from "/Users/rodrigo/Dropbox/Trabajo/Brujazul/2022/Despues/99_DES_UTILERIAS_NUBE/rclone.config"
2022/10/09 13:07:27 DEBUG : fs cache: renaming cache item "../" to be canonical "/Users/rodrigo/Dropbox/Trabajo/Brujazul/2022/Despues"
2022/10/09 13:07:27 DEBUG : Creating backend with remote "e2-ba-despues:/ba-despues-2022/"
2022/10/09 13:07:27 DEBUG : fs cache: renaming cache item "e2-ba-despues:/ba-despues-2022/" to be canonical "e2-ba-despues:ba-despues-2022"
2022/10/09 13:07:27 DEBUG : .DS_Store: Excluded
2022/10/09 13:07:27 DEBUG : 99_DES_UTILERIAS_NUBE: Excluded
2022/10/09 13:07:27 DEBUG : 01_DES_MATERIAL/.DS_Store: Excluded
2022/10/09 13:07:27 DEBUG : 01_DES_MATERIAL/TEST/.DS_Store: Excluded
2022/10/09 13:07:27 DEBUG : S3 bucket ba-despues-2022: Waiting for checks to finish
2022/10/09 13:07:27 DEBUG : S3 bucket ba-despues-2022: Waiting for transfers to finish
2022/10/09 13:07:28 DEBUG : 01_DES_MATERIAL/TEST/2022-09-11 03.30.07.mp4: md5 = 9424a0f7d030adbde782de24f1060bcd (Local file system at /Users/rodrigo/Dropbox/Trabajo/Brujazul/2022/Despues)
2022/10/09 13:07:28 DEBUG : 01_DES_MATERIAL/TEST/2022-09-11 03.30.07.mp4: md5 = a11f8672924d577b8bc29c847aab0b5a (S3 bucket ba-despues-2022)
2022/10/09 13:07:28 ERROR : 01_DES_MATERIAL/TEST/2022-09-11 03.30.07.mp4: corrupted on transfer: md5 hash differ "9424a0f7d030adbde782de24f1060bcd" vs "a11f8672924d577b8bc29c847aab0b5a"
My guess is that the iDrive implementation in rclone doesn't work with SSE-C encryption, so that is my question: is it compatible, and if so, how should I use it to pass my own keys?
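For reference, the remote is defined in rclone.config roughly like this; the access keys and the endpoint are placeholders, and the sse_customer_key value is only illustrative:

[e2-ba-despues]
type = s3
provider = IDrive
access_key_id = XXXXXXXXXXXX
secret_access_key = XXXXXXXXXXXX
endpoint = XXXX.idrivee2-XX.com
sse_customer_algorithm = AES256
sse_customer_key = my-own-secret-key-text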
# upload local file to idrive with sse-c
+ rclone copy ./source/file.ext idrive_with_ssec:bucket -vv
DEBUG : rclone: Version "v1.59.2" starting with parameters ["rclone" "copy" "./source/file.ext" "idrive_with_ssec:bucket" "-vv"]
DEBUG : file.ext: Need to transfer - File not found at Destination
DEBUG : file.ext: md5 = c7f5af9b93f5aa17934c84ad53fd2cea OK
INFO : file.ext: Copied (new)
# download from idrive with sse-c
+ rclone copy idrive_with_ssec:bucket ./dest -vv
DEBUG : rclone: Version "v1.59.2" starting with parameters ["rclone" "copy" "idrive_with_ssec:bucket" "./dest" "-vv"]
DEBUG : file.ext: md5 = c7f5af9b93f5aa17934c84ad53fd2cea OK
INFO : file.ext: Copied (new)
# fail to download from idrive without sse-c
+ rclone copy idrive_without_ssec:bucket ./dest -vv
DEBUG : rclone: Version "v1.59.2" starting with parameters ["rclone" "copy" "idrive_without_ssec:bucket" "./dest" "-vv" "--retries=1"]
ERROR : file.ext: Failed to copy: failed to open source object: InvalidRequest: The object was stored using a form of Server Side Encryption. The correct parameters must be provided to retrieve the object.
status code: 400, request id: 171C81029EA0F835, host id:
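fwiw, the only difference between the two remotes in my config is that idrive_with_ssec also has these two lines; everything else (access keys, endpoint) is the same in both, so i am not pasting it here:

sse_customer_algorithm = AES256
sse_customer_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX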
I haven't been able to make it work. I have created new buckets and new access keys, but I always get the same symptoms as originally posted. I'm not really sure what I am doing differently for this to fail... I'll run some more tests tomorrow.
Question: did you enable the "default encryption" toggle in the iDrive interface when creating the new bucket? I have tried both ways, and also changed that setting afterwards, with no difference, but I thought I'd ask.
no idea what that is.
i have a free idrive account that i use for testing only.
it has been a long time since i logged into the web interface.
but since you asked, just now i logged in and enabled default encryption for my idrive bucket.
i re-ran the same exact script and got the same exact output.
so that does not seem to be the issue.
What I don't know is whether the key should be exactly the text I used, or something derived from the text I set. That's the only thing that looks unexpected to me, but since this is the first time I've used SSE-C anywhere, I don't know what to expect.
Edit: another difference is the UNSIGNED-PAYLOAD in the header. In the headers sent by @asdffdsa there is the hash of the content; here there isn't. These are big files (and they will be bigger in the real-life scenario)... I don't know if that could have something to do with it? I will test with smaller files and see what happens...
Edit 2: I tried uploading a small text file and I still get the same UNSIGNED-PAYLOAD header and the same error. Removing the key and the algorithm from the rclone config works: the file is uploaded, but that header still reads UNSIGNED-PAYLOAD.
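In case it helps, the header I am looking at is X-Amz-Content-Sha256 (with SigV4 it carries either the SHA-256 of the payload or the literal string UNSIGNED-PAYLOAD). It can be seen by re-running the copy of any test file with header dumping, something like this (the file name is just an example):

rclone copy ./smallfile.txt e2-ba-despues:ba-despues-2022 -vv --dump headers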
The problem was in fact the key: the previous ones were not 32 characters long (I don't remember if they were shorter or longer), but using a 32-character string made the problem go away.
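In case it helps someone else, one simple way to get a printable 32-character string for sse_customer_key is to hex-encode 16 random bytes:

openssl rand -hex 16
# -> 3f9c2b7a1d4e8f60a5b2c9d7e1f04a6b  (32 characters, value shown only as an example)

Paste the resulting string into sse_customer_key; sse_customer_key_md5 can be left blank, since rclone calculates it from the key.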
I don't need the base64 encoding for the key right now, but it's good to know that it is possible.
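If the base64 form is ever needed, a full 32-byte key can be generated already encoded:

openssl rand -base64 32
# -> prints a 44-character base64 string

As far as I understand, newer rclone versions accept that value via an sse_customer_key_base64 option instead of sse_customer_key, but I have not tried it.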