Failed to get bucket: googleapi: Error 403: xyz does not have storage.objects.list access to bucket., forbidden

Hi,

I am a first-time user of both Google Cloud Platform and rclone, and I seem to have gotten through most of the initialization procedures:

  1. Signed up for the free trial of Google Cloud Platform
  2. Installed rclone on a CentOS 6.9 system
  3. I seem to have created the remote fine following the necessary instructions. Since I only have remote access to the system, I had to visit the specified URL, explicitly give rclone access to my Gmail account, and then type in the verification code (a rough sketch of that flow is below).
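
In case it helps anyone later, the headless flow I followed was roughly the following. This is only a sketch from memory; the exact prompts vary between rclone versions.

rclone config
# choose "n) New remote", pick "google cloud storage", accept the defaults,
# and answer n when asked "Use auto config?" since this is a headless box.
# rclone then prints a URL; open it in a browser on another machine, log in,
# grant rclone access, and paste the verification code back into the prompt.
# On a machine that does have a browser, running
#   rclone authorize "google cloud storage"
# produces a token that can be pasted into the config instead.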

I have the following output towards the end of the configuration:

[remote]
type = google cloud storage
client_id =
client_secret =
project_number = project1
service_account_file =
object_acl =
bucket_acl =
location =
storage_class =
token = {…snip…}

y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d>y

Current remotes:

Name Type
==== ====
remote google cloud storage

rclone lsd remote:
The above works fine with no output, since there are no buckets yet.

But trying to create a bucket fails

rclone mkdir remote:bucket
2018/06/06 17:49:51 ERROR : Attempt 1/3 failed with 1 errors and: failed to get bucket: googleapi: Error 403: xyz does not have storage.objects.list access to bucket., forbidden
2018/06/06 17:49:51 ERROR : Attempt 2/3 failed with 1 errors and: failed to get bucket: googleapi: Error 403: xyz does not have storage.objects.list access to bucket., forbidden
2018/06/06 17:49:51 ERROR : Attempt 3/3 failed with 1 errors and: failed to get bucket: googleapi: Error 403: xyz does not have storage.objects.list access to bucket., forbidden
2018/06/06 17:49:51 Failed to mkdir: failed to get bucket: googleapi: Error 403: xyz does not have storage.objects.list access to bucket., forbidden

I have checked the IAM permissions; xyz is the owner of the project.
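
(For anyone debugging a similar 403: the same permissions can also be sanity-checked outside rclone with the Google Cloud SDK tools. Note that project1 below is just the project name taken from the config above and may not match the real project ID.)

gsutil ls -p project1
gcloud projects get-iam-policy project1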

Any help with this is greatly appreciated.

Thank you.

Bucket names are globally unique across all of Google Cloud Storage, so you'll have to choose your own unique name instead of bucket. I think that is probably the problem.

Yes, I should have tried that first. Works like a charm. Thanks a lot.

rclone mkdir remote:bucketvtl1
rclone sync /tmp/yum.log remote:bucketvtl1
rclone ls remote:bucketvtl1
315 yum.log

I should probably change the docs to make sure users know they need to choose a globally unique name… This is true for s3 and gcs, but not for swift or azureblob (I think)…