I'm trying to copy a full S3 bucket to my local hard disk, and rclone works perfectly fine. But some of my files have names longer than 255 characters, and I can't write them to my local ext4 disk because of its filename-length limit. I would like to use rclone's filter feature, but I can't work out how to exclude files whose names are longer than 255 characters.
What is your rclone version (output from rclone version)
rclone v1.50.2
os/arch: linux/amd64
go version: go1.13.6
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Linux Ubuntu 18.04 - 64 bit
Which cloud storage system are you using? (eg Google Drive)
Backblaze B2 or Amazon S3
The command you were trying to run (eg rclone copy /tmp remote:tmp)
What about files in subfolders?
I'm not sure that script will work, as it checks the length of the entire path of the file.
I think you need to make sure each segment of the full path is within the 255-character limit, including the file name itself.
Also, that script outputs paths/files that are longer than 255 characters, whereas you want the paths/files that fit within the limit.
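Putting those points together, here is a minimal sketch of one way to do it: list the remote recursively, keep only the paths whose every `/`-separated segment fits ext4's 255-byte-per-component limit, and feed that list to `rclone copy --files-from`. The remote name `remote:bucket` and the destination path are placeholders for your own setup. Note that awk's `length()` counts characters, which matches ext4's byte limit only for ASCII names; non-ASCII names may need a byte-accurate check.

```shell
# awk filter: print a path only if every /-separated segment
# is no longer than 255 characters (ext4's per-component limit)
keep_short_segments() {
  awk -F'/' '{
    ok = 1
    for (i = 1; i <= NF; i++)
      if (length($i) > 255) { ok = 0; break }
    if (ok) print
  }'
}

# List every object recursively, keep the copyable ones, then copy them
# (uncomment and adjust the remote and destination for your setup):
# rclone lsf -R remote:bucket | keep_short_segments > ok-files.txt
# rclone copy remote:bucket /mnt/local --files-from ok-files.txt
```

This checks each path segment individually, as suggested above, rather than the length of the whole path, and it emits the files you *can* copy instead of the ones you can't.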