[Newbie] Assign multiple users to single remote

What is the problem you are having with rclone?

I have a very large (14TB) project folder that I want to copy to a Google Team Drive. I have 5 user accounts in G Suite, each with its own 750GB daily upload limit, so it seems silly to have only one user run the copy process. I've been syncing to G Suite for the last month; for each project I've created a new Google Team Drive and assigned a unique client ID and secret in the rclone config, as per forum recommendations to avoid userRateLimitExceeded API errors. When creating the config file, will I need to assign multiple client IDs and secrets in addition to assigning multiple G Suite users?

What is your rclone version (output from rclone version)

v1.52.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

macOS Mojave 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive (specifically shared drives)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

command unknown

The rclone config contents with secrets removed.

Config not set up yet, although it typically looks like this:

[remote name]
type = drive
scope = drive
client_id = #######
client_secret = #######
token = {########}
team_drive = ########

Use a shared folder / teamdrive

I think that's what I'm doing? The config process asks me to assign a G Suite user when creating the remote in rclone, so in theory isn't it "locked" to that user until I change it in the config? I'm assuming that running another config while a copy process is going in a different shell would be no good. Do I just make 4 other remotes that all point to the same G Suite shared drive?

You create a shared drive, add your 4 users, and create an rclone config for each user, all pointing to the same shared drive.
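For example, a minimal sketch of two such configs (the remote names, client IDs, tokens, and shared drive ID below are all placeholders; substitute your own values):

[drive_user1]
type = drive
scope = drive
client_id = <user1's client ID>
client_secret = <user1's client secret>
token = {<token from authorizing as user1>}
team_drive = <ID of the shared drive, identical in both remotes>

[drive_user2]
type = drive
scope = drive
client_id = <user2's client ID>
client_secret = <user2's client secret>
token = {<token from authorizing as user2>}
team_drive = <ID of the shared drive, identical in both remotes>

Both remotes point at the same team_drive ID, but each authorizes as a different user, so each transfer counts against that user's own 750GB daily upload quota.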

Thanks, I figured I must've been overthinking it.

@random404 I came back to my system to discover that all my rclone instances had been forced to stop due to critically low system memory. On further investigation I found that instead of syncing to Google Drive, each process started after the first rclone sync was syncing to the local user hard drive (Macintosh HD/Users/[user name]). I cleared the space and restarted the sync processes one by one to confirm.

First process (working correctly):
rclone sync /volumes/202_FILMS 202_FILMS_Sam: --exclude "202 Raw Media/**" --bwlimit 8.65M --max-backlog=60000 -q -P

Second process (I have to stop it each time, as I see it syncing to the internal drive):
rclone sync "/volumes/202_FILMS/202 Raw Media" "202_FILMS_edit1/202 Raw Media": --bwlimit 8.65M --max-backlog=60000 -q -P

I've named the remotes to reflect the user account associated with each transfer (edit1 and Sam). In the config process each is assigned its own client ID and secret, as mentioned above, with type and scope set to "drive". Both point to the same Google Team Drive. Any ideas?

The folder on the remote should be specified like 202_FILMS_edit1:"202 Raw Media" (note the colon directly after the remote name rather than after the full path).
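Applied to your second command (same flags, just the corrected destination syntax), that would look something like:

rclone sync "/volumes/202_FILMS/202 Raw Media" 202_FILMS_edit1:"202 Raw Media" --bwlimit 8.65M --max-backlog=60000 -q -P

Without the colon immediately after the remote name, rclone doesn't recognize the destination as a remote and treats it as a local path, which is why the data ended up on your internal drive.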

Thanks, that did the trick. One of these days I'll get the syntax down!
