InvalidObjectName: Object name contains unsupported characters

What is the problem you are having with rclone?

I have iDrive e2 S3 storage configured behind a crypt remote. When I try to sync local data to the remote (sync command below), some files fail with: Failed to copy: InvalidObjectName: Object name contains unsupported characters. The failing file paths are around 200 characters long, all English letters plus symbols.

I tried checking the length limit on the underlying remote:
rclone --crypt-filename-encoding=base64 test info --check-length idrive:rclone-length-test

// idrive
maxFileLength = 998 // for 1 byte unicode characters
maxFileLength = 499 // for 2 byte unicode characters
maxFileLength = 332 // for 3 byte unicode characters
maxFileLength = 249 // for 4 byte unicode characters

But when I run the same test against idrive_crypt:

// idrive_crypt{SDK2N}
maxFileLength = 175 // for 1 byte unicode characters
maxFileLength = 87 // for 2 byte unicode characters
maxFileLength = 58 // for 3 byte unicode characters
maxFileLength = 43 // for 4 byte unicode characters

Why does the crypt remote have a different limit? Isn't it all the same from the S3 point of view? It shouldn't matter whether I store encrypted or plain data.

Run the command 'rclone version' and share the full output of the command.

rclone v1.67.0
- os/version: ubuntu 24.04 (64 bit)
- os/kernel: 6.8.0-35-generic (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.22.4
- go/linking: static
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

iDrive e2 s3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone --crypt-filename-encoding=base64 sync --checksum --server-side-across-configs storage/data/ idrive_crypt:

An unencrypted file name such as "test", once encrypted and then encoded using base64, becomes something like 7HJvG30EtDYzozH3Ryw4sQ.

Hence encryption plus encoding inflates every file name, which lowers the effective maxFileLength of a crypt remote compared with the backend it wraps.
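The arithmetic can be sketched roughly like this, assuming (as rclone's crypt docs describe) that each name is padded PKCS#7-style to whole 16-byte blocks before encryption, and that the base64 option uses unpadded base64url (ceil(4n/3) characters for n ciphertext bytes). Note how all four crypt limits reported above collapse to the same ciphertext size:

```python
import math

def encrypted_name_len(plain_bytes: int, block: int = 16) -> int:
    """Approximate encoded length of a crypt-encrypted file name.

    The plaintext name is padded to a whole number of 16-byte blocks
    (PKCS#7 adds at least one byte, so an exact multiple still grows
    by a full block), then the ciphertext is encoded with unpadded
    base64url, giving ceil(4n/3) characters.
    """
    padded = (plain_bytes // block + 1) * block   # >= 1 byte of padding
    return math.ceil(padded * 4 / 3)

# The four crypt limits from the test above all hit the same
# 176-byte padded ciphertext, i.e. the same encoded length:
for chars, bytes_per_char in [(175, 1), (87, 2), (58, 3), (43, 4)]:
    print(chars, "chars ->", encrypted_name_len(chars * bytes_per_char), "encoded chars")
```

So a 175-character ASCII name, an 87-character 2-byte name, and so on all encrypt to the same encoded length, which is what actually gets tested against the backend's per-name limit.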

