Syncing multiple users to respective counterparts on different domains/accounts

On Windows:
set RCLONE_CONFIG_DOMAINA_IMPERSONATE=user1@domaina.com

Also, for testing, you can use https://rclone.org/docs/#n-dry-run

Sorry, the commands I gave were for Linux.

With Windows you'll need to do this I think (not a Windows expert!)

set RCLONE_CONFIG_DOMAINA_IMPERSONATE=user1@domaina.com
set RCLONE_CONFIG_DOMAINB_IMPERSONATE=user1@domainb.com
rclone copy domaina: domainb:

I don't think you can do a one liner.
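
For what it's worth, the Linux equivalent (and a way to batch it for many users) could look something like this. It's just a sketch: the remote names domaina:/domainb: and the users.csv file of "source_user,dest_user" pairs are placeholders.

# assuming two drive remotes named domaina: and domainb:
export RCLONE_CONFIG_DOMAINA_IMPERSONATE=user1@domaina.com
export RCLONE_CONFIG_DOMAINB_IMPERSONATE=user1@domainb.com
rclone copy domaina: domainb: -v --dry-run   # drop --dry-run once the listing looks right

# or loop over a hypothetical users.csv of "source_user,dest_user" pairs
while IFS=, read -r src dst; do
  RCLONE_CONFIG_DOMAINA_IMPERSONATE="$src" \
  RCLONE_CONFIG_DOMAINB_IMPERSONATE="$dst" \
  rclone copy domaina: domainb: -v
done < users.csv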

Thank you very much to both of you,

I'll be testing it a little later and will report back.

In the meantime I tried using the following command to see if it works.

I added only one remote for testing first.

rclone -v --drive-impersonate admin@mydomain.com lsf remote:

But it returns an error saying 401 not authorized.

When I created the remote config, I added the client ID, left the secret blank, and provided a JSON path. After getting this error, I went ahead and deleted the ClientID line from the config, but I still get the same error.

Not sure what's wrong. Any suggestions?

I'm gonna try and create a new client ID and re-insert the ClientID in the config, but any input in the meantime would be appreciated.

Thank you very much for all the help you've provided so far.

If your objective were to change domains but keep all of the same users and setup, gsuite has the capacity to add domains to an existing account, to change the primary domain, and/or to delete the old primary domain.
https://support.google.com/a/answer/7009324?hl=en
https://support.google.com/a/answer/54819?hl=en
https://support.google.com/a/answer/1041297?hl=en
Just fyi. :wink:

I'm trying to merge domains onto a single new account, so I need to migrate the data from Drive one way or the other.

That is exactly what the linked process will do.

I realize we are in the rclone forum, so this will be my last non-rclone related advice on the subject unless there is a specific question... but the above methodology really is the right way to do what you have described so far.

Nobody has discussed the transfer limits yet, so I should mention those as well.

While the docs say that "Individual users can only upload 750 GB each day between My Drive and all shared drives," I assume this applies to service accounts as well (or else everyone would just use a service account to circumvent the 750 GB cap).
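
If you want rclone to stop at the cap rather than keep retrying failed uploads, something like this might help. A sketch only: --max-transfer is a long-standing global flag, and newer rclone versions also have --drive-stop-on-upload-limit, so check what your version supports.

rclone copy domaina: domainb: --max-transfer 750G --drive-stop-on-upload-limit -v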

Yes, the third link does define how to transfer data, and if you notice, for Google Drive the data transfer has to be done through the API or by downloading and re-uploading (the thing I want to avoid at all costs). Hence why I'm trying to use rclone to transfer the data from one domain to another.

so the "set" command for impersonate seems to have worked as I didn't get any errors when I executed it, simply went back to the command prompt.

When I execute "rclone lsf remote:" I keep getting an error (pasted below). I tried editing the remote and putting the client ID back in, same error.

C:\rclone>rclone lsf remote:
2020/01/24 14:11:57 Failed to create file system for "remote:": couldn't find root directory ID: Get https://www.googleapis.com/drive/v3/files/root?alt=json&fields=id&prettyPrint=false&supportsAllDrives=true: oauth2: cannot fetch token: 401 Unauthorized
Response: {
"error": "unauthorized_client",
"error_description": "Client is unauthorized to retrieve access tokens using this method, or client not authorized for any of the scopes requested."
}

Don't know what to try at this point.

Khar00f,

There are at least a dozen ways to tackle what you are trying to do. Which is a good thing. :wink:

Couple of questions:

  • How many users are you trying to migrate?

  • Do you have access to directly auth each of their accounts if you wanted to? (as opposed to --drive-impersonate)

  • Separately, are you familiar with Shared Drives and how to create/use them?

  • You have Super Admin access for both the old and new account, I think you already said?

Asking because the answers can influence which solutions are most effective for you.

Yes, the third link does define how to transfer data, and if you notice, for Google Drive the data transfer has to be done through the API or by downloading and re-uploading (the thing I want to avoid at all costs). Hence why I'm trying to use rclone to transfer the data from one domain to another.

Just to be clear -- I believe you are conflating accounts and domains. Moving from joe@old-domain.com to joe@new-domain.com is migrating a domain. Moving from joe@school.edu to joe@gmail.com after Joe graduates is moving an account. The first link explains how you can just turn joe@old-domain.com into joe@new-domain.com without needing to copy anything whatsoever. You do not need to do the steps in the third link to make the data follow the user when migrating a domain.

Anyway--with that out of the way, it appears your service account isn't properly configured. Check the Admin Console and make sure it has access to the contents of the Drives. You will get unauthorized_client no matter what if your scope is wrong, even though that particular error sort of implies it's a token issue... so that's kind of counterintuitive.

Also make sure you aren't mixing regular user credentials and SA credentials in your rclone.conf.

Posting your rclone.conf would help (with redacted passwords/personal info). Based on the debug output you posted, this appears to be a gSuite configuration issue to me.

Replicated the error messages you were receiving just to be sure it was what I was thinking.

Basically, you haven't granted the service account you're using access to impersonate via the API.

Impersonation is a multi-step process, mainly covered in the guide, but it might not be as concise...

  • Use current SA or make a new one
    Project owner is not needed
    DOMAIN-WIDE delegation IS needed

  • In order to add domain-wide delegation, you now have to make the service account first, and then re-open it for editing

  1. Up top, click Edit

  2. In the middle of the screen, between "Service account status" and "Keys", you'll see "Show domain-wide delegation"

  3. Make sure that this is enabled! This gives your SA a client-id
    {This is not the same as an OAUTH client-id}

  4. This "Client-id" is what you will then enter Here
    [navigation steps below]

Go to Security Settings in the admin panel. The third tab/card/whatever from the bottom should be "Advanced Settings" - when you click that you'll see a link to Manage API client access.
Same as before, this is not the same as "APIs and Services".

  1. In the first field, enter the client-id you got once domain-wide delegation was enabled.
    Column label is "Authorized API clients"
    Box label is "Client Name"
    [disregard the "Example: www.example.com" part]

  2. In the second field, you have to enter the scopes you want to allow. IDK why I use the ones I list, but it works so eh...
    Box label is "One or More API Scopes"
    PAY ATTENTION TO "(comma-delimited)"

  3. Scopes that I use most of the time (single line)

https://www.googleapis.com/auth/admin.directory.domain,https://www.googleapis.com/auth/calendar,https://www.googleapis.com/auth/drive

These will then reflect as:
https://www.googleapis.com/auth/admin.directory.domain
Calendar (Read-Write) https://www.googleapis.com/auth/calendar
https://www.googleapis.com/auth/drive
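
For reference, a minimal rclone.conf entry for a service-account remote could look something like the sketch below (remote name and key path are placeholders). With service_account_file set you shouldn't need client_id or client_secret at all.

[domaina]
type = drive
scope = drive
service_account_file = /path/to/sa-key.json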


I know what you're referring to and there's a reason why I can't change the domains; I really have no choice but to move them to a brand new account. I can't use any existing account (without going into the details), hence why the data has to be migrated. But that's outside the scope of this thread.

I'm not sure I understand what you mean by the service account having access to the contents of the drives.

I don't have a separate account for the service account; it's the super admin account I'm using.

As for the scope, there's only 1 address I added in there based on the config instructions:

2. Allowing API access to example.com Google Drive
  • In the next field, “One or More API Scopes”, enter https://www.googleapis.com/auth/drive to grant access to Google Drive specifically.

Also make sure you aren't mixing regular user credentials and SA credentials in your rclone.conf.

I don't have any credentials in the rclone.conf. I don't have access to it now to post it; I'll do it on Monday or Tuesday. I thought the whole point of providing the JSON key file is that you don't have to authenticate a user, so credentials aren't needed.

What I do have in the conf under the [remote] is scope, type, and the JSON path; there may be one or two lines missing (going off memory).

That was enabled. I do remember there's an option for apptype (might have the field name wrong here) and I left it blank based on the config instructions, and I remember reading somewhere that someone said to set it to "other", so not sure if this has an impact.

That I'll have to double check when i go back to work on Monday or Tuesday.

I only put 1 scope as per the instructions (pasted above)

I'll have to double check some of the info you guys have provided me,

Thank you for your input, i'll report back Monday or Tuesday after I've done my verification as I don't have access to those details outside of work.

This is your issue. Super Admin cannot directly access users' files. You can list and audit, but you won't be able to pull the actual file. That is why you're getting unauthorized_client.

Check out this page for a starting point/summary of service accounts and what you need to delegate--you don't need the part about making API calls obviously:
https://developers.google.com/identity/protocols/OAuth2ServiceAccount

There is another alternative that I think you should consider--gsuite has a migration tool of its own. I strongly suggest you look at that as a backup to the rclone solution you're building, if this one seems to not work the way you need it to, as it will a) not use your bandwidth when copying and b) not be subject to the transfer caps. It involves a similar setup to what you're trying to do with rclone, but it will run completely server-side (for all I know they're even using some rclone code on their backend, I haven't read the license for gsuite lol). It's in your dashboard.

A lot of people use it for migrating non-gsuite to g-suite. But you can use it for g-suite to g-suite.

That does solve your "must be a new account" issue and, the main reason I think it's worth mentioning, it meets your initial requirement of using a CSV to specify source and destination, which I don't think anyone has offered a solution for as of yet. You can then use rclone check to make sure you got the results you wanted.
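
For that verification step, something along these lines may be enough (a sketch; --one-way only checks that everything in the source exists in the destination):

rclone check domaina: domainb: --one-way -v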

I'll look into that thank you.

That option doesn't work for my needs

I did some additional testing and checked the service account; here are my findings.

I created a new service account to make sure it was done properly.

So I did create the service account.
Domain-wide delegation is needed.

Doing it on Linux this time (using my home machine).
So when I try to impersonate, I get this error:

rclone -v --drive-impersonate admin@domaina.com lsf remote:
2020/01/26 13:05:54 Failed to create file system for "remote:": couldn't find root directory ID: Get https://www.googleapis.com/drive/v3/files/root?alt=json&fields=id&prettyPrint=false&supportsAllDrives=true: oauth2: cannot fetch token: 401 Unauthorized

When I paste that link in a browser, here's what I get:

{"error":{"errors":[{"domain":"usageLimits","reason":"dailyLimitExceededUnreg","message":"Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup.","extendedHelp":"https://code.google.com/apis/console"}],"code":403,"message":"Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup."}}

I think it's pretty clear my issue is with authenticating, but I can't find anything anywhere about authenticating.

Other than providing the JSON file, I don't know what else to do.

I don't know what I'm missing at this point

Here's my rclone.conf along with pics of my configs.

Now I'm just spitballing, but make sure the Google Drive API is actually enabled. That's the one step I don't see in your screenshots.

And also, you blanked out the name of the remote, but you are using whatever text is between the brackets in your config file as the remote name, aren't you? Failed to create filesystem errors can occur if you specify a remote that doesn't really exist, that's why I ask. Although I've never seen them in connection with an authentication error. I'll see if I can replicate this and get back to you.

Yes, absolutely; in regards to the remote, I'm using the proper name.

The only reason I blacked it out is that it's an acronym, and I realized after I wrote it that it could be perceived negatively, so to avoid any drama or misunderstanding I figured I'd black it out.

For the drive API, it def is enabled.

Is this the only API scope I need to enable:
https://www.googleapis.com/auth/drive? Because it's always only been this one that I add.

EDIT:

OK, GOOD NEWS

After fiddling and creating multiple new remotes and testing, I figured out the issue, but I don't understand WHY it's happening.

Fix:

I've isolated the issue to selecting the type of scope in rclone.

When I use drive.readonly, it gives me the 401 token error.

When I use drive (full access), it works.

If you're able to tell me why it only works with drive full access, that would be great, but at least I figured out what the issue is.

Thank you very much for your assistance.

I think it’s because you have to match the scope to the access. See here:
https://developers.google.com/identity/protocols/googlescopes#drivev3

If you want to fiddle with it, try changing the scope to https://www.googleapis.com/auth/drive.readonly and see what happens. You should be able to use read only access then.
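
Put another way, the scope rclone requests has to be one of the scopes you delegated. A sketch of what the two sides could look like for read-only access (assuming the delegation entry and remote set up earlier in the thread):

# admin console, "One or More API Scopes" (comma-delimited):
https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/drive.readonly

# rclone.conf, in the remote's section:
scope = drive.readonly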

Sorry, I didn’t realize you were requesting read-only...

Also, I notice you switched from using the environment variable to the --drive-impersonate argument. Does your command still work if you go back to the environment variable, or does that still throw errors? (Curious for my own use.)

I tried using both to see the results. What I realized, though, is that I always have to go back and delete the root folder ID from the config every time I use the environment variable. Executing a new "set RCLONE_CONFIG_DOMAINA_IMPERSONATE..." wouldn't work unless the root folder ID in the config is blank.

You can use unset instead of set, and then run set again.
Or chain them into one line.
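
If you want to automate clearing that cached root folder ID on Linux, a sketch like this might do (assumes the default config location; back the file up first):

# remove any cached root_folder_id line before switching impersonated users
sed -i '/^root_folder_id/d' ~/.config/rclone/rclone.conf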

What I noticed, though, is that when I use the set command, the config doesn't get updated with the root folder ID.

It only updates the root folder ID when I run a command against the remote afterwards.

When I tried the unset command, it said that it's not recognized as an internal command.

I'm also having an issue with the move command, which is basically the second step of this thread:

Since there was a similar thread, I simply added to it.

If you think you can help, let me know.

Is using rclone mount an acceptable option?

you could:

rclone mount remote: ~/rclone -v
cd ~/rclone
mkdir testmove
mv * testmove

It will throw an error, but will move the files anyway.

NB:
Try this on a non-production something-or-other before diving in face first. And don't unmount until you see that rclone mount is done doing its thing. It is not instantaneous.
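
If you want the whole thing to run in a single terminal, a variation like this might be easier. A sketch for Linux: --daemon backgrounds the mount, --log-file keeps the mount's output somewhere you can watch, and fusermount -u unmounts once you're sure rclone has finished.

rclone mount remote: ~/rclone --daemon --log-file ~/rclone-mount.log -v
cd ~/rclone
mkdir testmove
mv * testmove                      # complains about moving testmove into itself; that's expected
cd ~ && fusermount -u ~/rclone     # only unmount after the log shows the moves are done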
