What is the problem you are having with rclone?
I am trying to transfer files between Google Drive and S3 that match a certain file name pattern (I am using the --include flag). I only want the files to be transferred into the root folder of my S3 bucket, with the directory structure from Google Drive ignored.
Run the command 'rclone version' and share the full output of the command.
- os/kernel: 21.6.0 (arm64)
- os/type: darwin
- os/arch: arm64
- go/version: go1.19.3
- go/linking: dynamic
- go/tags: none
Are you on the latest version of rclone? You can validate by checking the version listed here:
Which cloud storage system are you using? (eg Google Drive)
Google Drive and AWS S3
The command you were trying to run (eg `rclone copy /tmp remote:tmp`)
rclone copy -P gdrive:RecTest s3:bucket --include "Copy*"
Sorry, that is not possible yet; it has been a wish for a while:
Perhaps you can copy the files to a local folder, flatten them as shown in
this post, and then copy to the target - or use one of the other ideas in the same thread.
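For the local flattening step, a minimal sketch using standard shell tools (the temp directories, sample files, and `Copy*` pattern are made up for illustration; the real source would be the folder pulled down from Google Drive):

```shell
# Hypothetical local flatten step: collect every file matching the
# pattern into a single flat folder, discarding the directory tree.
src=$(mktemp -d)    # stand-in for the folder copied down from gdrive
dest=$(mktemp -d)   # flat folder to then upload to the S3 bucket root
mkdir -p "$src/a/b"
echo one > "$src/a/CopyA.txt"
echo two > "$src/a/b/CopyB.txt"
# Find every matching file and copy it into the flat destination root.
find "$src" -type f -name 'Copy*' -exec cp {} "$dest" \;
ls "$dest"    # both files, no subfolders
```

One caveat: files with the same name in different source folders would overwrite each other in the flat destination - the same caveat applies to copying everything into a bucket root.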
this should work
rclone lsf $source --include="Copy*" --files-only --recursive --absolute | xargs -I file rclone copy "$source/file" "$dest" -v --dry-run
that will produce a list of matching source files and pass each one to xargs
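To make the expansion concrete, here is the same pipe with echo substituted for the copy command and a made-up two-line file list standing in for the lsf output (the remote names come from the example above):

```shell
# Each line from the listing becomes one "rclone copy <file> <dest>"
# invocation via xargs -I. echo stands in for rclone so the generated
# commands are printed instead of run; the paths are invented.
printf 'Sub/CopyA.txt\nCopyB.txt\n' |
  xargs -I file echo rclone copy "gdrive:RecTest/file" "s3:bucket"
# prints:
#   rclone copy gdrive:RecTest/Sub/CopyA.txt s3:bucket
#   rclone copy gdrive:RecTest/CopyB.txt s3:bucket
```

Because each match triggers a separate single-file copy into the bucket root, the Drive folder structure never reaches S3 - which is exactly the flattening the thread is after.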
@asdffdsa - is there a way to pass this list to an rclone copy command?
in your specific case, the answer is no, as rclone does not have a built-in way to flatten the directory structure during a copy
lol, can try
in general, the answer is yes, with something like this
rclone lsf $source.... > files.lst
rclone copy $source $dest --include-from=files.lst
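As a sketch of what that list file holds (the entries here are invented, assuming lsf was run with --files-only and --recursive): one path per line, relative to the source root. Note that --include-from treats each line as a filter pattern, while the related --files-from flag treats each line as a literal path.

```shell
# Hypothetical files.lst as produced by an rclone lsf listing:
# one path per line, relative to the source root.
cat > files.lst <<'EOF'
Sub/CopyA.txt
CopyB.txt
EOF
grep -c '' files.lst    # counts the entries: 2
```

Copying with a filter list this way recreates the source directory structure at the destination, which is why it answers the general question but not the flatten-to-root one.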
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.