--drive-service-account-file applying to both src and dest remotes

What is the problem you are having with rclone?

When I use --drive-service-account-file as part of my rclone copy command, that service account is used for both the source remote and the destination remote.

What I want is to use the rclone conf settings for the source remote, but use the --drive-service-account-file parameter for the destination remote (so that I can programmatically change the service account).

Is there a way to specify different accounts for the source and destination from the command line? Can I have TWO --drive-service-account-file parameters in the command, one for the source and one for the destination? Or is there a way to ensure that the --drive-service-account-file parameter only applies to the destination remote, leaving the source remote to use the credentials in the config file?

What is your rclone version (output from rclone version)

1.55.1

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Win10 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy src_remote(obfuscated):src_path(obfuscated) dst_remote(obfuscated):dst_path(obfuscated) --drive-stop-on-upload-limit=true --multi-thread-streams=16 --progress --progress-terminal-title --retries 99 --transfers=16 --checkers=16 --create-empty-src-dirs --fast-list --drive-service-account-file SAs\SA_file(obfuscated).json

The rclone config contents with secrets removed.

[src_remote]
type = drive
client_id = <obfuscated>
client_secret = <obfuscated>
scope = drive
token = {"access_token":"<obfuscated>","token_type":"Bearer","refresh_token":"<obfuscated>","expiry":"2021-06-25T15:02:23.4857404+10:00"}
team_drive = <obfuscated>
root_folder_id = 

[dest_remote]
type = drive
scope = drive
service_account_file = <obfuscated>.json
team_drive = <obfuscated>
root_folder_id = 


A log from the command with the -vv flag

When I specify the SA file in the command, I get "source directory not found" (this is because it's trying to use the SA file for both the source and destination remotes, not just the destination remote).
When I don't specify the SA file in the command, the command works perfectly (because it grabs the correct credentials for EACH remote from the config file).

To answer my own question:

Yes, it IS possible to specify a parameter for only one of the remotes on the command line. The format is remote,parameter=value:path, described in the rclone docs under "Connection Strings".

In my case, the format that worked is:
rclone copy src_remote(obfuscated):src_path(obfuscated) "dst_remote(obfuscated),service_account_file=SAs\SA_file(obfuscated).json:dst_path(obfuscated)" --drive-stop-on-upload-limit=true --multi-thread-streams=16 --progress --progress-terminal-title --retries 99 --transfers=16 --checkers=16 --create-empty-src-dirs --fast-list

FYI - the advantage of this approach is that you don't need heaps of remotes in your config file, just one remote per Team Drive. You then specify whatever Service Account you want on the command line itself, and you can change it programmatically, cycling through your SAs each time you hit the 750GB daily upload limit (see the sketch below).
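To illustrate the cycling idea, here is a minimal sketch in Python, assuming one .json key per service account in an SAs folder; the remote names, paths and the exit-code handling are placeholders/assumptions, not my actual setup. It simply moves on to the next SA whenever rclone exits non-zero after stopping on the upload limit:

import subprocess
from pathlib import Path

# Hypothetical layout: one .json key per service account in the SAs folder
sa_files = sorted(Path("SAs").glob("*.json"))

src = "src_remote:src_path"   # placeholder source remote and path
dst_remote = "dst_remote"     # placeholder destination remote
dst_path = "dst_path"         # placeholder destination path

for sa in sa_files:
    # Pass the SA to the destination only, via a connection string
    dst = f"{dst_remote},service_account_file={sa}:{dst_path}"
    result = subprocess.run([
        "rclone", "copy", src, dst,
        "--drive-stop-on-upload-limit=true",
        "--transfers=16", "--checkers=16",
        "--create-empty-src-dirs", "--fast-list",
    ])
    # rclone exits non-zero when it aborts on the upload limit;
    # exit code 0 means the copy finished, so stop rotating SAs.
    if result.returncode == 0:
        break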

Update:
This solution doesn't seem to work with crypt remotes (my guess is that, because crypt doesn't have a service_account_file parameter, it drops the parameter rather than passing it through to the wrapped Google Drive remote).

Solution:
Update the config file on the fly with "rclone config update remote_name service_account_file path-to-SA-file", then call rclone copy as normal without connection strings, e.g. "rclone copy src_remote:path dst_remote:path".
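For example (the remote names and SA file path below are placeholders), each time you want to switch service accounts the sequence would look something like:

rclone config update dst_remote service_account_file SAs\SA_file2.json
rclone copy src_remote:src_path dst_crypt_remote:dst_path --drive-stop-on-upload-limit=true --transfers=16 --checkers=16 --create-empty-src-dirs --fast-list

where dst_remote is the underlying Google Drive remote and dst_crypt_remote is the crypt remote that wraps it.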
