Running rclone for Files with Names Less Than 255 Characters

What is the problem you are having with rclone?

I'm trying to copy a full S3 bucket to my local hard disk, and rclone works perfectly fine. However, some of my files have names longer than 255 characters, and I can't write them to my local ext4 disk due to its filename length limit. I would like to use rclone's filter feature, but I can't work out how to filter out the files with more than 255 characters in their names.

What is your rclone version (output from rclone version)

  • rclone v1.50.2
  • os/arch: linux/amd64
  • go version: go1.13.6

Which OS are you using and how many bits (eg Windows 7, 64 bit)

Linux Ubuntu 18.04 - 64 bit

Which cloud storage system are you using? (eg Google Drive)

Backblaze B2 or Amazon S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy "s3:bucket-name:'[^ ]{255,}'" . --filter

hello and welcome,

not sure there is such a filter, at least not a documented one.

one workaround is to get a list of files using
rclone lsf s3:bucket-name: --files-only -R > file.txt

and then write a script to process the files in file.txt

also, that version of rclone is years old, best to update first.
https://rclone.org/downloads/#script-download-and-install
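
the script install from that page boils down to a one-liner (double-check the page in case it has changed):

curl https://rclone.org/install.sh | sudo bash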

Thank you! What do you recommend I use to download from a list of files?

  1. create file.txt
  2. write a script to read each line of file.txt and, for each file you want to copy, write that line to shortlist.txt (see the sketch after this list)
  3. rclone copy s3:bucket-name: . --include-from=shortlist.txt -v --dry-run
  4. if the output of that command looks good, then remove --dry-run and run the command again to copy the files to local.
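
for example, a rough, untested sketch of steps 1-3, assuming your filenames don't contain glob characters such as * or [ (since --include-from treats each line as a pattern):

rclone lsf s3:bucket-name: --files-only -R > file.txt
awk 'length($0) <= 255' file.txt > shortlist.txt
rclone copy s3:bucket-name: . --include-from=shortlist.txt -v --dry-run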

i am not a linux expert, so i am sure there is a more efficient way to do this, but what i wrote should work.

Thank you! I did not know about the --include-from flag.

In addition to what you suggested in step 2, I've filtered them by length using awk:

cat file-list.csv | awk 'length($0) > 255'

It probably solves my problem :tada:

yeah, i found that on stackoverflow.com as well.

what about files in subfolders?
not sure that script will work, as length($0) measures the entire path of the file.
i think you need to make sure each segment of the full path is no longer than 255 characters, including the file name....

also, that script outputs the paths that are longer than 255 characters, whereas you want the ones that fit within the limit
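
something like this might work, splitting each path on / and keeping only the lines where every segment fits (an untested sketch, assuming a 255-character limit per segment):

awk -F/ '{ for (i = 1; i <= NF; i++) if (length($i) > 255) next; print }' file.txt > shortlist.txt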

You are right! I'd assumed a fixed length for each subfolder, but modifying the awk to count only the characters after the last slash would work perfectly fine.

can you post the command that worked?

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.