#### What is the problem you are having with rclone?
The current source location has deeply nested hierarchies and, in some cases, long file names. When copying files to an encrypted OneDrive remote, rclone fails with `Failed to copy: invalidRequest: pathIsTooLong: Path exceeds maximum length`.
Flattening the hierarchy or renaming files isn't an option given the sheer volume of files and the operational knowledge tied to the existing structure.
Is there a way to truncate or trim file names so that they stay under the ceiling, e.g. 143 characters?
If not, what other native options are available in rclone?
#### What is your rclone version (output from `rclone version`)
rclone v1.53.2
#### Which OS you are using and how many bits (eg Windows 7, 64 bit)
Microsoft Windows 10 Professional version 1909 Build 18636.1110
#### Which cloud storage system are you using? (eg Google Drive)
Microsoft OneDrive
#### The command you were trying to run (eg rclone copy /tmp remote:tmp)
```
How to encrypt the filenames.
Choose a number from below, or type in your own value
 1 / Don't encrypt the file names. Adds a ".bin" extension only.
   \ "off"
 2 / Encrypt the filenames see the docs for the details.
   \ "standard"
 3 / Very simple filename obfuscation.
   \ "obfuscate"
```
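For context on why the mode matters: with `off`, the stored name is just the cleartext plus `.bin`, while `standard` produces substantially longer encrypted names (padded and base32-encoded), which is what can push an already-deep path over OneDrive's limit. A sketch for inspecting what is actually stored; the remote names and the encrypted name are placeholders:

```shell
# list the names as they are actually stored on the underlying remote
rclone lsf onedrive:encrypted --max-depth 1
# map one stored (encrypted) name back to its cleartext
rclone cryptdecode secret: ENCRYPTEDNAME
```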
Thanks @ncw. Would you recommend changing the chunking configuration to check whether the path length remains an issue? If so, what settings do you suggest?
Thanks @ncw. If using `--chunker-name-format`, does rclone's configuration need to change? At the moment the setup is a OneDrive remote with an encrypted remote layered on top.
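A sketch of what that change would look like, assuming the existing crypt remote is named `secret:` (all remote names here are placeholders): chunker flags such as `--chunker-name-format` only take effect once a chunker remote actually exists in the config.

```shell
# create a chunker remote layered on top of the existing crypt remote
rclone config create chunked chunker remote secret:chunked chunk_size 100M
# then copy through it instead of through secret: directly
rclone copy "C:\data" chunked:
```

Note that chunker addresses file *size* limits by splitting large files; the chunk names gain suffixes, so it likely won't help with (and may worsen) path length.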
Righto @ncw. To clarify, do you mean that chunker flags have no effect unless a chunker backend has been configured?
Assuming there is no other option, including obscuring filenames: can rclone traverse or decompress files on a remote? For example, if a 7z or zip archive is copied to a remote, can rclone open the archive without extracting it? That way, users searching for files with long names inside an archive could do so without first downloading the archive and uncompressing it.
Alternatively, is it possible to assign shortened aliases to files with longer names?
i have large .7z files in the cloud, and i am able to extract a single file without downloading the entire .7z
i use a rclone mount so that the .7z appears as local file.
same as with any .7z file.
once mounted, the .7z looks and behaves like a local file.
my .7z are always encrypted.
so if i want to extract a single file from the mount to my local computer, i will be asked for the password.
no need to decrypt the entire .7z, no need to download the entire .7z
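The steps above can be sketched roughly as follows (Windows-style paths; the remote name, archive path, and file name inside the archive are all placeholders):

```shell
# mount the crypt remote as a read-only drive
rclone mount secret: X: --read-only
# in another terminal: extract one file from the archive in place;
# 7z reads only the parts of the archive it needs, so the whole .7z
# is not downloaded
7z e "X:\backups\archive.7z" -o"C:\restore" "docs\somefile.txt"
```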
If the 7z archive isn't encrypted but the remote is, would the extracted file remain encrypted? I'm assuming it would.
How are you maintaining a list of files and the archive they belong to? I'm attempting to find a balanced solution that an average user could follow with a few simple steps.
if you extract a file from that non-crypted .7z, which is inside a rclone crypted remote, to another location in that remote, then yes, the extracted file is crypted by rclone.
i create the .7z files locally, as backups of local files then upload them to rclone.
i can extract file(s) as needed.
Thanks @asdffdsa. Assuming no local copies are maintained (storage is an issue) and a 7z archive is copied to an encrypted remote, what would you recommend for keeping a catalog of the files in a given archive?
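One low-tech approach, as a sketch (archive and remote names are placeholders): generate a plain-text catalog at archive-creation time and upload it alongside the archive, so file names are searchable without opening the .7z at all.

```shell
# build the catalog from the archive's listing, then upload both
7z l "backup.7z" > "backup.catalog.txt"
rclone copy "backup.7z" secret:backups
rclone copy "backup.catalog.txt" secret:backups
# later, search the catalog on the remote without touching the archive
rclone cat secret:backups/backup.catalog.txt | findstr "long-file-name"
```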
You can mount the backend and then use something to look inside the zip - it shouldn't have to download the whole zip. I did make a backend which does this for zip files, but I haven't released it yet.
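A minimal sketch of that mount-and-look approach (remote name and archive path are placeholders): listing a zip only requires reading its central directory, so very little of the archive is fetched.

```shell
# mount the remote read-only, then list the archive's contents in place
rclone mount secret: X: --read-only
# in another terminal:
7z l "X:\backups\archive.zip"
```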