Google Cloud Storage: could not find default credentials

What is the problem you are having with rclone?

When trying to list folders in Google Cloud Storage I am getting a could not find default credentials error. Details below.

What is your rclone version (output from rclone version)

rclone v1.55.1
- os/type: linux
- os/arch: amd64
- go/version: go1.16.3
- go/linking: static
- go/tags: none

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Linux, Ubuntu 20.04, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Cloud Storage

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone lsd gcs-auth:seldon-models

The rclone config contents with secrets removed.

[gcs-auth]
type = google cloud storage
env_auth = false
client_id = <redacted client id>
client_secret = <redacted client secret>

A log from the command with the -vv flag

$ rclone lsd gcs-auth:seldon-models -vv
<7>DEBUG : Using config file from "/home/rskolasinski/.config/rclone/rclone.conf"
<7>DEBUG : rclone: Version "v1.55.1" starting with parameters ["/home/rskolasinski/.asdf/installs/rclone/1.55.1/bin/rclone" "lsd" "gcs-auth:seldon-models" "-vv"]
<7>DEBUG : rclone: systemd logging support activated
<7>DEBUG : Creating backend with remote "gcs-auth:seldon-models"
Failed to create file system for "gcs-auth:seldon-models": failed to configure Google Cloud Storage: google: could not find default credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.

Extra information

  1. Listing the same folder works if I use anonymous = true, as it is a public one
[gs]
type = google cloud storage
anonymous = true

and

$ rclone lsd gs:seldon-models -vvv
<7>DEBUG : Using config file from "/home/rskolasinski/.config/rclone/rclone.conf"
<7>DEBUG : rclone: Version "v1.55.1" starting with parameters ["/home/rskolasinski/.asdf/installs/rclone/1.55.1/bin/rclone" "lsd" "gs:seldon-models" "-vvv"]
<7>DEBUG : rclone: systemd logging support activated
<7>DEBUG : Creating backend with remote "gs:seldon-models"
           0 2021-05-13 14:46:59        -1 alibi
           0 2021-05-13 14:46:59        -1 alibi-detect
           0 2021-05-13 14:46:59        -1 custom
           0 2021-05-13 14:46:59        -1 keras
           0 2021-05-13 14:46:59        -1 mab
           0 2021-05-13 14:46:59        -1 mlflow
           0 2021-05-13 14:46:59        -1 mlserver
           0 2021-05-13 14:46:59        -1 odcd
           0 2021-05-13 14:46:59        -1 openvino
           0 2021-05-13 14:46:59        -1 pytorch
           0 2021-05-13 14:46:59        -1 sklearn
           0 2021-05-13 14:46:59        -1 spacy
           0 2021-05-13 14:46:59        -1 tempo
           0 2021-05-13 14:46:59        -1 tensorrt
           0 2021-05-13 14:46:59        -1 test
           0 2021-05-13 14:46:59        -1 tfserving
           0 2021-05-13 14:46:59        -1 triton
           0 2021-05-13 14:46:59        -1 trtis
           0 2021-05-13 14:46:59        -1 xgboost
<7>DEBUG : 3 go routines active

  2. Using the client id and secret works with the mc tool
$ mc config host add gcs https://storage.googleapis.com <redacted client id> <redacted client secret>
$ mc ls gcs/seldon-models/
[2021-05-13 14:46:07 BST]      0B alibi-detect/
[2021-05-13 14:46:07 BST]      0B alibi/
[2021-05-13 14:46:07 BST]      0B custom/
[2021-05-13 14:46:07 BST]      0B keras/
[2021-05-13 14:46:07 BST]      0B mab/
[2021-05-13 14:46:07 BST]      0B mlflow/
[2021-05-13 14:46:07 BST]      0B mlserver/
[2021-05-13 14:46:07 BST]      0B odcd/
[2021-05-13 14:46:07 BST]      0B openvino/
[2021-05-13 14:46:07 BST]      0B pytorch/
[2021-05-13 14:46:07 BST]      0B sklearn/
[2021-05-13 14:46:07 BST]      0B spacy/
[2021-05-13 14:46:07 BST]      0B tempo/
[2021-05-13 14:46:07 BST]      0B tensorrt/
[2021-05-13 14:46:07 BST]      0B test/
[2021-05-13 14:46:07 BST]      0B tfserving/
[2021-05-13 14:46:07 BST]      0B triton/
[2021-05-13 14:46:07 BST]      0B trtis/
[2021-05-13 14:46:07 BST]      0B xgboost/

Are you running this on a machine in Google Cloud? As far as I can see, that is what the error is about.

You could try the steps here

No, I was running this from my dev workstation and that was the problem.

After doing

gcloud auth application-default login

it now works fine :). Thanks!
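For reference, in case anyone else hits this, the whole sequence was roughly as below. The bucket name is the one from my config above, and the credentials path is just the usual default location gcloud uses on Linux.

$ gcloud auth application-default login
# opens a browser for the OAuth consent flow; the credentials normally end up in
#   ~/.config/gcloud/application_default_credentials.json
$ rclone lsd gcs-auth:seldon-models -vv
# rclone now falls back to those Application Default Credentials and the listing succeeds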

I just wonder why providing the client id and secret was enough for mc.

Great!

I can't say I fully understand Google Auth - it's complicated! Though not quite as complicated as Amazon's, which isn't quite as complicated as Microsoft's...

Oh yeah, each provider has its own quirks, which give me headaches quite often :sweat_smile:

Hopefully it shouldn't cause problems when invoked from inside the k8s cluster running on GKE, and locally I've now got it working fine as well. I just need to check whether it will also work in a local kind cluster, as gcloud auth application-default login requires an interactive login and so may not be best for headless machines...
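For the headless / kind case I will probably try pointing rclone at a service account key file instead of relying on the interactive ADC flow. I haven't tested it yet, but from the docs it should look roughly like this (the remote name and path are just placeholders):

[gcs-sa]
type = google cloud storage
service_account_file = /path/to/service-account-key.json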

We're currently adopting rclone as our go-to solution for storage initializers (which download ML models in init containers before the actual inference servers start up) and it is amazing how much it simplifies everything.
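As a rough idea of what such a storage initializer does, the init container essentially just runs a single copy command like the one below (the target directory is illustrative; the bucket path comes from the listing above):

$ rclone copy gs:seldon-models/sklearn /mnt/models -vv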


Hmm... one thing is still not clear to me. The rclone docs read here:

Application Default Credentials
If no other source of credentials is provided, rclone will fall back to Application Default Credentials. This is useful both when you already have configured authentication for your developer account, or in production when running on a google compute host. Note that if running in docker, you may need to run additional commands on your google compute machine - see this page.

I would have thought that providing

[gcs-auth]
type = google cloud storage
env_auth = false
client_id = <redacted client id>
client_secret = <redacted client secret>

counts as providing a source of credentials?
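(For context, my current understanding - which may well be wrong, hence the question - is that for the google cloud storage backend the client_id/client_secret are only the OAuth application credentials, and rclone would normally also obtain a user token via the interactive rclone config flow, so a working OAuth remote would end up looking roughly like:

[gcs-oauth]
type = google cloud storage
client_id = <redacted client id>
client_secret = <redacted client secret>
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}

whereas my remote above has no token, so there is nothing for rclone to use and it falls back to ADC.)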

Ok, I think this was already discussed here on the forum and there is even a GitHub issue.

I'd love some help with that if you'd like to have a go?

I'd love to but cannot promise when I will be able to find time...
I will follow up on the GitHub issue if I manage to get somewhere with it.

We do hope that this is not blocking and that, if a production case arises for it, it will work smoothly as per the workaround you provided here:

Note that you can put the entire JSON credentials in --gcs-service-account-credentials '{json blob here}'
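In case it helps someone later, I'd expect that to look something like this on the command line (the key file path is just a placeholder):

$ rclone lsd gcs-auth:seldon-models --gcs-service-account-credentials "$(cat /path/to/service-account-key.json)"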

