How to copy specific files

Summary
I have a use case where lsf is used to list all nodes on the remote, a subset of the resulting nodes is selected, and that subset is fed to the copy command using --include options. Note that I'm only referencing the nodes at their root level, i.e. directories should be copied recursively, e.g.

$ rclone lsf remote:server/dir
file_1
dir_1/
dir_on_remote/
file_on_remote

$ rclone copy remote:server/dir /local/destination/dir/ \
    --include '/dir_on_remote/**' --include '/file_on_remote'

It's working okay, but the downside is that the node names need to be sanitized beforehand in order to escape regex characters:

sed 's/[.\*^$()+?{}|]/\\&/g;s/[][]/\\&/g' <<< "$node_name_from_lsf_command"
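For example (with a made-up node name containing parentheses, a dot and brackets), that sed call produces:

$ node_name_from_lsf_command='dir (v1.2) [old]/'
$ sed 's/[.\*^$()+?{}|]/\\&/g;s/[][]/\\&/g' <<< "$node_name_from_lsf_command"
dir \(v1\.2\) \[old\]/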

Question
Is this a sane way of using rclone, or am I misusing --include flags for something they're not really meant to do?

rclone version

rclone v1.53.3-DEV
- os/arch: linux/amd64
- go version: go1.17.5

Storage system
FTP

hello and welcome to the forum,

  • that is an old version of rclone, best to update to v1.57.0

  • perhaps i do not fully understand your use case, but i would
    --- save the output of rclone lsf to a file.
    --- process the file.
    --- feed the file to rclone using --include-from (see the sketch below).
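
A minimal sketch of that workflow, using the names from the question; the file names and the grep selection step are only placeholders for illustration:

# save the output of rclone lsf to a file
rclone lsf remote:server/dir > all_nodes.txt

# process the file (this grep stands in for whatever selection you actually do)
grep -E '^(dir_on_remote/|file_on_remote)$' all_nodes.txt > selected_nodes.txt

# turn the selection into filter patterns: lsf marks directories with a
# trailing "/", so append "**" to them to copy them recursively, then
# prefix every line with "/"
sed 's|/$|/**|; s|^|/|' selected_nodes.txt > include_patterns.txt

# feed the patterns to rclone
rclone copy remote:server/dir /local/destination/dir/ --include-from include_patterns.txt

Note that the lines in include_patterns.txt are still rclone filter patterns, so whether the escaping step is still needed is worth checking, e.g. with the rclone ls test mentioned below.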

Ah nice, that could work! In that case, special/regex characters don't need to be escaped, right?
Any chance to accomplish the same without having to write to a file?

if rclone copy would accept the path, then so should --include-from.
should be simple to create a test case.
often the easiest way to test a filter is with rclone ls --include or rclone ls --include-from
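
For example, with the names from the question (and the hypothetical include_patterns.txt from the sketch above), a dry check before copying could be:

$ rclone ls remote:server/dir --include '/dir_on_remote/**' --include '/file_on_remote'
$ rclone ls remote:server/dir --include-from include_patterns.txt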

yes, but that would be a linux question about using pipes, and i am not a linux expert.
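
For what it's worth, one untested sketch that skips the intermediate file is bash/zsh process substitution; rclone just sees the <(...) part as an ordinary file path, and the grep is again only a placeholder for the selection step:

$ rclone copy remote:server/dir /local/destination/dir/ \
    --include-from <(rclone lsf remote:server/dir \
        | grep -E '^(dir_on_remote/|file_on_remote)$' \
        | sed 's|/$|/**|; s|^|/|')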
