How to copy "new" sync files/directories to a separate directory on remote drive to organize it with a bash script

Thanks Sergejs!

Good explanation, pretty sure I got it right now :slight_smile:

I will keep my answer fairly brief and assume some scripting experience, so please ask if anything is unclear.

I suggest you add `--log-level=INFO --log-file=archive_sync.log` to the sync to the archive; the log file will then, among other things, contain an INFO line for each file copied.
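Assuming your current sync looks something like the line below (the local source path and the remote folder names are just placeholders for your own), the full command could be:

```shell
# Placeholders: adjust source and destination to your own setup.
rclone sync /local/source GDrive:/FolderName/archive --log-level=INFO --log-file=archive_sync.log
```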

You will then be able to make a script that first extracts the new filenames (and their locations in the archive) using sed and a regex, something like this:
https://forum.rclone.org/t/how-to-extract-file-names-after-move-command-is-successfully-done/33725/3
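As a rough sketch (the exact spacing of rclone's INFO lines can vary between versions, so treat the regex as a starting point; the demo line below just shows the format):

```shell
# Demo line in the format rclone writes with --log-level=INFO;
# in real use archive_sync.log is produced by the sync itself.
printf '%s\n' '2024/01/31 10:15:00 INFO  : some/folder/filename.csv: Copied (new)' > archive_sync.log

# Pull out just the path of each newly copied file:
new_files=$(sed -n 's/^.* INFO *: \(.*\): Copied (new)$/\1/p' archive_sync.log)
echo "$new_files"
```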

and then iterates over the list of new files to determine where to (optionally) place an extra (server-side) copy, using the following rules:

Here you also need (some of) the actual content of the file. You can either do an rclone copy from GDrive to a local temp folder and then do the `wc -l`, or it may be possible (and faster) to do it directly on the output of rclone cat (perhaps using `--head` or `--count`).
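For example (the remote path is the placeholder used earlier; the local file here just stands in for a downloaded temp copy):

```shell
# On the remote, without a full download, something like this should work:
#   rclone cat --count 65536 GDrive:/FolderName/archive/some/folder/filename.csv | wc -l
# (--count limits how many bytes are read, so big files are not fetched whole.)

# Same idea on a local temp copy:
printf 'header\nrow1\nrow2\n' > filename.csv
lines=$(wc -l < filename.csv)
echo "$lines"
```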

Once you have decided what to do, you can do a server-side copy, something like this:

```
rclone copy GDrive:/FolderName/archive/some/folder/filename.csv GDrive:/Folder2/Request
```

When you have extracted the pattern, just do a server-side copy, something like this:

```
rclone copy GDrive:/FolderName/archive/some/folder/filename.csv GDrive:/Folder2/Pattern1/adinfo/DDMMYYYY/Pattern1_adinfo_DDMMYYYY.csv
```

Any non-existing higher-level directories will be created automatically.
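Putting the pieces together, the whole thing could be sketched like this. The log format, folder names, and the commented-out placement rule are all assumptions you would adapt; the demo log line only exists so the sketch runs standalone:

```shell
#!/usr/bin/env bash
set -eu

log=archive_sync.log

# Demo line so the sketch runs standalone; in real use the sync writes this log.
printf '%s\n' '2024/01/31 10:15:00 INFO  : some/folder/filename.csv: Copied (new)' > "$log"

# Paths of the newly copied files, one per line:
new_files=$(sed -n 's/^.* INFO *: \(.*\): Copied (new)$/\1/p' "$log")

# Decide, file by file, whether an extra server-side copy is wanted:
while IFS= read -r f; do
    [ -n "$f" ] || continue
    # Hypothetical rule (the real rules depend on the file's content):
    # lines=$(rclone cat --count 65536 "GDrive:/FolderName/archive/$f" | wc -l)
    # if [ "$lines" -eq 1 ]; then
    #     rclone copy "GDrive:/FolderName/archive/$f" GDrive:/Folder2/Request
    # fi
    echo "would inspect: $f"
done <<EOF
$new_files
EOF
```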

I am not sure I fully understand this part, but perhaps it is just something like:

```
rclone copy GDrive:/FolderName/PreambulaDDMMYYYY GDrive:/Folder2/Out_Folder/PreambulaDDMMYYYY
```

Hope I got it somewhat right and (most of) the above gives meaning to you :sweat_smile: