The GD web interface returns an error if I select more than 100–200 files to move (the little blue count of selected files turns red, and then drag-and-drop stops working entirely; does anyone know a cure for this?). That would mean literally thousands of clicks to accomplish the task. In theory, a correctly formatted find command would do it automatically. I'd have to mount the drive first, though, and I'm not sure I can figure out both mount and find on my own right now without help.
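For anyone else stuck here, a minimal sketch of what that mount-plus-find combo could look like. The remote name `gdrive:` and the mount point are placeholders, not my actual config, and `sorted/` is just a hypothetical target folder:

```shell
# Assumes the remote was mounted first with rclone, e.g.:
#   rclone mount gdrive: "$HOME/gdrive" --daemon
# ("gdrive:" and the mount point are placeholder names.)
MOUNT="${MOUNT:-$HOME/gdrive}"
mkdir -p "$MOUNT/sorted"
# Move every loose file sitting in the mount root into sorted/,
# any number of files at once, no web UI involved:
find "$MOUNT" -maxdepth 1 -type f -exec mv -t "$MOUNT/sorted" {} +
```

The `-exec ... {} +` form batches many filenames into each `mv` invocation, so it scales to thousands of files where the web UI chokes at a few hundred.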
What you're missing is that there are no folders to drag and drop, just loose files. It's literally dumping all the files into the Google Drive root, even ones that were originally inside four or five subfolders, and on top of that it's sometimes making duplicates of files too.
I tried letting it move 1,000 files at a time, even though the counter turned red. Chrome itself froze, reported the page unresponsive, and I clicked "Wait" repeatedly to try to let it finish. If you've ever successfully moved 1,000 loose files at once with the Google Drive web interface, please write back and I'll try it again. This was one way the ACD web interface may actually have been better than GD's!
edit2: somehow my Google Drive's total space used has gone down by several hundred GB during this process.
I'll explain what I told the GD web interface to do: restore the already permanently deleted files to my secondary G Suite user account. It seems to be restoring them from nowhere into that secondary account, and the actions are being logged as caused by the primary account, but for some reason it's also vomiting all these files, loose, into the primary account's root directory. These files are in theory already backed up into primarygsuitesaccount:crypt:subfolders, but I wanted to test this undelete and run another check. I hope it's not dismantling my entire directory structure, which is the only explanation I can think of for an undelete to LOWER disk space usage!
edit3: Google Drive doesn't have any automatic deduplication, does it? I'm genuinely getting scared that I've somehow given Google Drive a command to undelete folder1 on user2, and it's dismantling the copy of folder1 on user1 into loose, structureless files at a rate of about 10,000–20,000 files a day. Why on earth are the web interfaces for all these services so unreliable compared to rclone, and why was I ever lazy enough to click a button on them?
edit4: when selecting these files, it turns out they have two locations: their loose vomit location, and the name of the real directory they should be in. But if I click that directory, I don't have access to it, presumably because it doesn't exist anymore. Similarly, user1's activity log lists a couple dozen files as having been moved to the trash by me. If I click the magnifying glass to jump to the folder a file is in, it tells me the location doesn't exist, yet it will still let me select the file from the recent-activity panel. Weirder still, logged in as user2, the same recent activity says user1 has moved something to the trash. This evening user1's disk space used has gone from 23.5 to 23.4 TB, so I'm worried that going to user2 and clicking "undelete all permanently deleted files" is somehow just deciding to go to user1 and delete the files there, presumably with the goal of making them appear in user2's trash folder, while they also somehow randomly appear in user1's root directory, loose and unsorted, dual-linked to the directory they should be in, only that link leads nowhere and the directory it points to no longer exists… I should note that before I deleted all 8 TB of files from user2, I used rclone to download and move (not server-side move) those files from user2:crypt to user1:crypt… so it seems insane and impossible that Google has identified those files and is working to undo the move, considering the files were all downloaded and re-uploaded separately.
I've now got 29,000 loose, unsorted files totaling 250 GB in user1's Google Drive root, because of course I do. Yet somehow 250 GB of new files appearing has made overall disk space used go DOWN by about 250 GB (I'd have assumed up). Then again, maybe that usage is counting against user2, not user1? (Keeping in mind the actual backup is only around 21 TB, and the other disk usage was likely in the trash from deleted partial transfers and such.) This is such a nightmare of paranoia that I never want to use Google Drive's website for anything ever again, and the worst part is, I didn't even need to perform the undelete. An rclone check command verified that 100% of everything was fine; I just wanted to see if the undelete would work. I had assumed it would only affect user2, and not user1 at all.
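For reference, that verification was just rclone check. A sketch with placeholder remote names (mine aren't literally `user1:` / `user2:`):

```shell
# Compare every file in the old remote against the known-good copy.
# "user2:crypt" and "user1:crypt" are placeholder remote names.
rclone check user2:crypt user1:crypt --one-way
# --one-way only requires that everything in the source also exists in
# the destination; drop it to additionally flag files present only in
# the destination.
```

On crypt remotes this compares sizes (and checksums where the backend can provide them), which is exactly the kind of independent confirmation that makes it safe to stop trusting the web UI's numbers.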
This is important to others only because it means rclone move, when it detects that a server-side move is possible, will create the same folder-linking nonsense I've encountered here. I'm not even sure rclone should allow a user to perform a Google Drive server-side move, so unreliable is Google Drive's server-side move command itself.
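If you share that worry, rclone can be told not to use a backend's server-side move at all. If I understand the `--disable` flag correctly, something like this forces a download, re-upload, and delete instead (remote names are placeholders):

```shell
# Disable the backend's server-side Move feature so rclone falls back
# to copy + delete rather than asking Drive to relink the files:
rclone move user2:crypt user1:crypt --disable move
# Add copy as well (--disable move,copy) if you also want to avoid
# Drive's server-side copy.
```

It's slower and burns bandwidth, but it sidesteps whatever Drive does internally when it "moves" a file by rewriting its parent links.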
edit5: in a randomly selected directory I confirmed 57,090 out of 57,090 files still exist, so that's ever so slightly reassuring. Maybe moving new files into the trash is just pushing old files (which happened to be larger) out of the trash, and I absolutely don't care about my trash at all, since, again, before I did anything silly on GD's website I confirmed with rclone check that user1 had a complete copy of all my files.
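Counting like that is much easier from rclone than from the web UI. rclone size reports the object count and total bytes for any path, and rclone lsf can count the loose root files (paths shown are placeholders):

```shell
# Reports "Total objects" and "Total size" for the given path:
rclone size user1:crypt/somefolder
# Count only the loose files sitting directly in the root,
# one level deep, ignoring folders:
rclone lsf user1: --files-only --max-depth 1 | wc -l
```

Re-running the lsf count every few hours is also a cheap way to watch whether the mystery process is still dumping files.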
edit6: over the past 20 hours the loose nonsense files have gone from 43,000 to 45,000 counted. My new theory is that in order to undelete things back into user2's trash bin, the files are being dumped loose into user1's root directory and then copied directly into user2's trash, but it's also doing a bugged move that actually just creates a folder link. Google Drive trash bins don't support a proper folder structure, however, so essentially Google Drive is creating hundreds of thousands of symbolic-link-like entries from unsorted loose files in one root into non-existent directories in another user's trash bin… IMO this is a horrific bug: it'll continue for weeks, if not months, and it'll probably never accomplish anything successful or useful. As far as I can tell, though, user1:crypt is totally undamaged by the massive number of pointless new files in user1's root directory. Also, of course, there's no progress bar and no way to cancel this operation. I could delete user2's existence entirely, which would save me $8, but I'm worried that would make this bug even worse, even though it might also fix it. The weirdest part is that user2's disk space usage isn't changing and never has (not when the files were deleted, and not now that they're being undeleted).
However, the fact that disk usage on user1's account has been going up and down supports my theories above. It's gone from 23.5 TB to 23.4 TB to 23.8 TB to 23.7 TB, which fits the idea that user1 is being given new files (not by me) and is then losing those new files (presumably toward user2). So, I think it's time to cover my eyes with my hands, ignore this entirely, leave Google Drive's website completely alone for a month or so, and pray.
edit7: so, my core crypt folder seems to be fine, unharmed, but these loose files are still appearing… and they're appearing VERY slowly now, 1,000–2,000 files a day, with a goal somewhere around 100,000–500,000. So yeah, I think I'm just going to try deleting user2 and praying that doesn't generate an infinite loop, because I don't want this pointless process running for months on end, forcing me to pay for two users when I really only need one (unless Google forces me to upgrade to five).
It's a weird estimate, though, because the first day or so it went through 10,000+ files in this fashion. Now that it's slowed down I could probably move them manually, but I've begun to suspect that moving them manually was breaking whatever process it thought it was doing. I guess I'll wait a couple of days and then try deleting the user2 account; I'm definitely afraid doing so could turn this process into an infinite loop, though.