So I foolishly used two gsuites user accounts to upload 4TB of data to a Team Drive. Now I want to move it to the My Drive of the main user, but I've run into problems.


#1

So I got an error and googled it. It turns out the error is caused because 20 nested subdirectories is the maximum a Team Drive allows. I'm also worried that a Team Drive might not hold more than 250,000 files, and I have 2,000,000 files.
see:
https://support.google.com/a/answer/7338880

Anyway, I was able to instantly use the gsuites Google Drive website to move one entire folder from the Team Drive to My Drive. rclone lsd mydrive: shows the folder. However, rclone cannot detect anything at all inside the folder, even though the Google Drive website shows thousands of files.
The files are all encrypted by rclone, but I know I used the same encryption settings because the root folder name is displayed correctly decrypted.
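
For what it's worth, the way I'd double-check the crypt settings (remote names here are hypothetical; substitute your own from rclone config show) is to list the same folder through the crypt remote and through the raw remote it wraps:

```shell
# Hypothetical remote names; run `rclone config show` for your real ones.
# mycrypt: is assumed to be a crypt remote wrapping mydrive:backup.
rclone lsd mycrypt:         # decrypted folder names -> the password matches
rclone lsd mydrive:backup   # raw (encrypted) names as actually stored on Drive
```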

One thing I notice though is that all these files say:
Storage Used 0 bytes Owned by someone else

So, how do I migrate files from a team drive to a my drive and have rclone be able to interact with them after I’m done?
Has anyone seen this error before? I assume I could rclone move teamdrive: mydrive:, although in theory that would be slow, make many API requests, and eat into my 750 GB daily upload quota. It was foolish of me to think Team Drives would let me bypass those limits without any downsides, I suppose.
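
For the record, a throttled version of that move might look like this (a sketch only; the remote names are the ones from this post and the flags are standard rclone options):

```shell
# Sketch: move everything while staying gentle on the API and quota.
# --tpslimit caps API calls per second; --bwlimit caps bandwidth so the
# 750 GB/day upload quota isn't burned through in one sitting.
rclone move teamdrive: mydrive: \
  --transfers 4 \
  --checkers 8 \
  --tpslimit 5 \
  --bwlimit 8M \
  -v --dry-run   # drop --dry-run once the plan looks right
```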

I’m not sure if this is a bug or what; it’s odd for Google’s website to say all these files exist and are there, and for rclone to only be able to see the root directory (although IIRC I created that root directory before I started the Team Drive gimmick, moved it to the Team Drive, then filled it there). It’s like rclone doesn’t have the permission, or the ability, to see files that are in My Drive but that weren’t created by me.

One big clue I’ve discovered: if I move a file from this new “ghost” folder in mydrive: to a new folder in mydrive: using the gsuites website, it warns me that users in the Team Drive will lose access to the file, despite the fact that the file isn’t in the Team Drive anymore but in My Drive. Meaning the warning is pointless and other team members already had no way to access the file? Hmm.

Actually, rclone DOES still have access to these files, but only through teamdrive:, and only when accessing teamdrive: via user1, despite the fact that these files are supposed to be in user1’s mydrive:, not their teamdrive:.

edit: so, I’m going to rclone move teamdrive:thisfolder mydrive:alternatespot:thisfolder
However, this is using bandwidth. So it’s wastefully downloading the files and then re-uploading them, even though the Google Drive website can handle the entire task in one second, since it just changes metadata rather than moving any data at all :-/

Using the Google Drive website to individually copy any of these files, move the copies to mydrive:alternatespot:thisfolder, and rename them to remove the word “copy” instantly lets rclone see them and decrypt them. But there’s no “copy folder” function on gsuites, so I guess I’ll just waste today’s and tomorrow’s bandwidth quota on this task. In the future I won’t use the Team Drive again; I’ll just upload different folders to user2’s My Drive area.

edit2: I just noticed this is consuming almost all my CPU power. Duh: it’s decrypting and re-encrypting the data as it passes through the crypt remote. I should just move the data without decrypting or encrypting it. Restarted the rclone move against the underlying (non-crypt) remotes so the encrypted data is copied as-is; CPU usage is down, but it’s still downloading and re-uploading the data. Oh well, this is the best I can do.
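
To make the “move without decrypting” idea concrete, here’s a sketch with made-up remote names (check rclone config show for how your crypts are actually layered):

```shell
# Hypothetical layering; check `rclone config show` for the real one:
#   teamcrypt: = crypt remote wrapping gdrive-team:backup
#   mycrypt:   = crypt remote wrapping gdrive-user1:backup
#
# Moving at the crypt layer decrypts and re-encrypts every byte (CPU-heavy):
#   rclone move teamcrypt: mycrypt:
#
# Moving at the underlying layer copies the ciphertext untouched, no crypto
# work at all. Both crypts must share the same password/salt for the result
# to decrypt afterwards.
rclone move gdrive-team:backup gdrive-user1:backup -v
```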

edit3: relevant clue about how team drives claim ownership of all their own files
https://support.google.com/a/answer/7374057?hl=en
I read a tip on reddit or the rclone forum about making multiple user accounts to bypass the 750 GB/day quota limit, and it absolutely did bypass that limit, but these headaches… wish I’d been warned! I should never have used the Team Drive and just stuck to multiple My Drives.

edit4:
This appears to have been a huge mistake, as rclone can no longer find this drive in the incorrect location (however, I can find it where I wanted it in the first place). I should’ve used the website to move things back, then started the transfer. Or heck, maybe I should never have tried the manual transfer in the first place; maybe I just had to be patient and the Team Drive would’ve automatically disowned that directory. I can’t tell if what I’m seeing is just the small portion of data I moved before I hit my quota, or the entire directory/backup like I wanted in the first place, and I can’t rclone size mydrive: because I’m over my quota and locked out. Once I get this all sorted out, I’ll probably try using the Team Drive again, only this time, after my instantaneous move from Team Drive to My Drive, I’ll just go AFK for a day and see if that magically makes it work.

edit5:
TL;DR
Team Drives suck as a feature on gsuites for the purposes of data backups. In fact, I cannot really think of any reason for the feature to exist, given its countless limitations. (Actually inconclusive, see edit4.)

edit6: IMO, BEWARE. No one should ever use gsuites Team Drives as part of their backup solution. Yes, eventually rclone was able to spot all the files I moved out of the old Team Drive location into My Drive. That worked, after a long delay. HOWEVER, when trying to continue adding files to this location, I was getting the dreaded “The file limit for this Team Drive has been exceeded” error.
Meaning that if I create files in a Team Drive, then move them to My Drive, then let rclone interact with them via mydrive:, they’ll STILL be subject to the Team Drive’s draconian 250,000-file limit. I just pray that the entire My Drive hasn’t been polluted by this error (fingers crossed).
HOWEVER, it also seems that by filling my Team Drive, moving those files to My Drive via the website, waiting a while, and then moving those files with rclone move from mydrive: to mydrive:, rclone gets to do a “server-side” copy.
I’m not sure if this is a good thing or not; it might keep these server-side-moved files polluted with the Team-Drive-related ownership errors. At least by re-downloading and re-uploading the files I could be sure to free myself from that error.
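
For anyone trying to reproduce the server-side behaviour, this is roughly the setup, with hypothetical remote names:

```shell
# Hypothetical setup: both crypt remotes sit on the SAME My Drive and share
# the same password/salt, e.g.
#   crypt1: = crypt remote wrapping gdrive-user1:crypt1
#   crypt2: = crypt remote wrapping gdrive-user1:crypt2
#
# Because source and destination resolve to the same underlying Google Drive,
# rclone can ask Google to move the (still-encrypted) files server-side --
# no download, no re-upload, no local bandwidth used.
rclone move crypt1: crypt2: -v
```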

edit7: so at this point I’m planning never again to use the gsuites website to move a gigantic folder from a Team Drive to a My Drive. I am, however, still trying to save the backup from my earlier mistake. I successfully moved every file from the polluted mydrive:crypt1 to the clean new mydrive:crypt2, using server-side moves that didn’t consume my own bandwidth.
After the move finished, I was missing 10,000 out of 44,000 files. That was yesterday. Today, 5,000 new files appeared in the polluted crypt, which to me means Google is still migrating files from the Team Drive to the My Drive, despite having claimed the move was instant, and it keeps placing new files in the emptied folders.

As a result rclone has given me some messages like:

2018/02/11 15:50:27 DEBUG : directory name: Rmdir: contains file: filename
2018/02/11 15:50:27 DEBUG : directory name: Failed to Rmdir: directory not empty
2018/02/11 15:50:27 DEBUG : other directory name: Removing directory
2018/02/11 15:50:32 DEBUG : other directory name: Removing directory
2018/02/11 15:50:32 DEBUG : other directory name: Rmdir: contains trashed file: other filename

Now, does this mean that I successfully moved a file and afterwards the directory was deleted, as intended? Or does it mean that rclone ran rmdir on a directory that had a file added back into it right as rclone moved the previous last file out? And if gsuites added a file into an empty directory right before rclone deleted it, was the rmdir successfully cancelled?


What happens with rclone move --delete-empty-src-dirs when after the last file is deleted from a directory a magical third party adds a new file to that directory?
Is there any way to use rclone move and FORCE server-side moves to be disabled, so that I download the files and re-upload them?
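
In case it helps anyone later: newer rclone builds have a --disable flag that turns off named backend features, so something like the following should force a full download/re-upload instead of a server-side move (the paths are hypothetical; check your version’s docs for --disable):

```shell
# Sketch, assuming a recent rclone: --disable takes a comma-separated list
# of backend features. With server-side copy/move disabled, rclone falls
# back to downloading and re-uploading each file, which gives the new
# copies fresh ownership along the way.
rclone move mydrive:polluted mydrive:clean --disable copy,move -v
```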
#2

I noticed, while doing the initial Team Drive support, that you’d upload stuff and all the API calls would apparently be OK, but it didn’t actually appear in the listings for some time afterwards. I was only doing small tests with ~100 files, but it was noticeable.

I think rclone probably abandoned the rmdir because there was a file in it, so you’ll have to run the operation again to pick those up


#3

I’m definitely going to have to run it again, not just because of this file, but because of the 5000 other missing files that could appear tomorrow or even the day after that.

I’m glad rclone is well behaved, but I’m not 100% confident in this, because magical gsuites voodoo might try to add a file into a directory moments after it’s removed, hit an error, and give up. There’s absolutely no feedback or information anywhere I can find, online or in my Google Drive, about these magically appearing files (although they are files I want to appear).

If/when this is all finished, I’ll try to remember to post again about whether the data made it through all the hijinks unharmed.


#4

Hmm, it is all a bit strange! That could happen I guess…

Good luck!


#5

Every single file is now present and accounted for. I guess Team Drives do work. Although having Google’s servers run a multi-day task that rclone has no way to interface with, and that continually changes where rclone has to look for files… is… well, kinda not worth it.

Also just because I have the right number of files doesn’t mean this whole bollocks didn’t break something, but hey, at least it’s the right number of files.


#6

That is good!

Cross fingers!