Google Cloud sync to sub-folder in bucket?

I’m trying to handle image uploads for about 1000 customers; rclone will be handling ~200k–1m image files per user and keeping the images in sync.

Question for you though. I’ve been trying to write my files to a GCS bucket.

I set up a remote using our GCS JSON key and service account. I can list buckets and files, no problem!

I’m trying to have a single bucket, let’s call it img.bucketname.com, with a subfolder for each customer where their inventory images are stored.

example:

rclone --config myconfig.conf sync C:\PhotoDir\ MUI:img.bucketname.com/$CustomerID/

but when I try to run it, each line shows me the following error:

googleapi: Error 400: Bucket names must be at least 3 characters in length, got 0: '', invalid

Is it possible to sync a single local folder to a subfolder in a Google Cloud Storage bucket? Any help would be appreciated.

I’m not sure if it’s the dots in the name or what, but this bucket is hosted and all files are world-accessible for web access. I have tried it both quoted and unquoted with no difference.

Never mind, I’m an idiot. I got it working just fine now.

Turns out I had a stray slash after MUI:
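
For anyone else who hits this, the leading slash makes rclone parse an empty bucket name, which matches the "got 0: ''" in the error above. Presumably the broken vs. fixed commands looked roughly like this (paths taken from the earlier post):

rclone --config myconfig.conf sync C:\PhotoDir\ MUI:/img.bucketname.com/$CustomerID/   (fails: bucket name parses as empty)
rclone --config myconfig.conf sync C:\PhotoDir\ MUI:img.bucketname.com/$CustomerID/    (works: bucket is img.bucketname.com)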

It’s working great!

Is it possible to read from a Samba share without mounting, by chance?

Good!

Not at the moment. There is an issue about it here: https://github.com/ncw/rclone/issues/2042

As part of a sync, rclone makes the necessary changes to the existing files/folders in the Google bucket. How can one get a list of the changes made to the files in the bucket, i.e. the operations applied to the existing bucket?

If you run with -v, rclone will tell you what it is doing.
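
For example, re-running the sync command from earlier in this thread with the verbose flag would log each operation as it happens; adding --log-file captures that output to a file you can keep (command shape assumed from the earlier post):

rclone --config myconfig.conf sync -v C:\PhotoDir\ MUI:img.bucketname.com/$CustomerID/
rclone --config myconfig.conf sync -v --log-file=sync.log C:\PhotoDir\ MUI:img.bucketname.com/$CustomerID/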