For copy or sync, can the destination path be dynamically generated from the source path?

What is the problem you are having with rclone?

I have source folders on Azure Blob storage in the format <date>/<hour>/<event_type>. I would like to copy/sync these folders to S3, but have the destination folders follow the format <event_type>/<date>/<hour>: the folder structure is rearranged, but the folder names are taken from the source.
Is this possible? Thank you!

Run the command 'rclone version' and share the full output of the command.

rclone v1.64.2

  • os/version: ubuntu 20.04 (64 bit)
  • os/kernel: 5.15.133.1-microsoft-standard-WSL2 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.3
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Azure Blob Storage and S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

Don't have one. Wondering if what I want to do is possible.

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

n/a

A log from the command that you were trying to run with the -vv flag

n/a

Not with rclone by itself, but some very basic shell scripting will achieve it.

In simple terms:

Pass to your script $date, $hour and $event_type, and then:

rclone sync src:${date}/${hour}/${event_type} dst:${event_type}/${date}/${hour}

You can collect all the available variables' values from the source using rclone lsd and then loop over them.
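A minimal sketch of that loop, assuming hypothetical remote names and container/bucket paths (azsrc:container and s3dst:bucket -- substitute your own). rclone lsd prints one directory per line with the name as the last field, which is what the awk below picks out:

```shell
#!/bin/sh
# Rearrange a source path <date>/<hour>/<event_type>
# into <event_type>/<date>/<hour>.
swap_path() {
  d=$(echo "$1" | cut -d/ -f1)
  h=$(echo "$1" | cut -d/ -f2)
  e=$(echo "$1" | cut -d/ -f3)
  echo "$e/$d/$h"
}

# Enumerate each level of directories with rclone lsd (the directory name
# is the last field of each output line), then sync every leaf folder to
# its rearranged destination. "azsrc:container" / "s3dst:bucket" are
# placeholder remotes.
if command -v rclone >/dev/null 2>&1; then
  for date in $(rclone lsd azsrc:container | awk '{print $NF}'); do
    for hour in $(rclone lsd "azsrc:container/$date" | awk '{print $NF}'); do
      for event_type in $(rclone lsd "azsrc:container/$date/$hour" | awk '{print $NF}'); do
        src="$date/$hour/$event_type"
        rclone sync "azsrc:container/$src" "s3dst:bucket/$(swap_path "$src")"
      done
    done
  done
fi
```

Note this assumes directory names contain no spaces, which holds for the <date>/<hour>/<event_type> layout described above.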


you would need to script it.

one option is to use rclone mount on the source and dest.
then use whatever local tool/script to rename/move the dir/files to dest
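A sketch of that mount-based approach, assuming hypothetical mount points (e.g. rclone mount azsrc:container /mnt/src --daemon and rclone mount s3dst:bucket /mnt/dst --daemon). Once both remotes are mounted, the rearranging itself is plain shell:

```shell
#!/bin/sh
# Copy every <date>/<hour>/<event_type> directory under $1 to
# <event_type>/<date>/<hour> under $2, using only local shell tools.
rearrange() {
  src=$1; dst=$2
  for dir in "$src"/*/*/*/; do
    [ -d "$dir" ] || continue
    rel=${dir#"$src"/}; rel=${rel%/}   # e.g. 2023-11-01/14/click
    date=${rel%%/*}                    # 2023-11-01
    event_type=${rel##*/}              # click
    hour=${rel#*/}; hour=${hour%/*}    # 14
    mkdir -p "$dst/$event_type/$date"
    cp -r "${dir%/}" "$dst/$event_type/$date/$hour"
  done
}

# Usage, against the hypothetical mount points:
# rearrange /mnt/src /mnt/dst
```

Keep in mind this routes all data through the mounts on the local machine, so the first (rclone lsd + rclone sync) approach is usually more efficient for a one-shot copy.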


Thanks @kapitainsky! Yeah, I figured I would need some wrapper to do this, but was holding on to the hope that I could do it using just rclone. Thanks again for the response!


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.