When I originally started with rclone I had one GSuite user and uploaded all of my media through a crypt to that drive; staying under the 750GB/day limit, this took some time. I have since added users 2-5 to my GSuite account. Is it possible to move my TVShows directory from User 1 to, say, User 2, and modify my MergerFS mount to combine my temp local directory, User 1's Gdrive media, and User 2's Gdrive TVShows? Would that move be subject to the 750GB/day limit? The folder is ~50TB, so it would take forever if so. The goal is to have 750GB/day of upload for all other media and another 750GB/day for TVShows.
Moving files with the API counts against the daily limits.
If you can move the files through the Drive web page instead, it would not. Perhaps you could add the folder to the other drive and copy it, or just use a Team Drive.
I use a crypt mount, so using the web page is probably out of the question as I won't know what I'm moving. Maybe I misunderstand Team Drives? I thought that meant using the drives of other users?
In GSuite, you can make a Team Drive and give users access to it. The users can all be in the same GSuite account, external, or whoever you want.
If the goal is to move it all, you can use the WebUI and just move it.
If you want to move only certain folders, you'd have to use the show-mapping option in crypt to backtrack the encrypted names and move them that way. If the goal is to avoid the 750GB/day limit on the move, the WebUI should work, I believe, or just go with a Team Drive. A Team Drive does take some time to populate when moving large amounts of data; I was testing it out myself, so a bit of patience is key.
So can I make my existing drive a Team Drive? I wouldn't need to move anything then. Then, for my daily uploads from temp to Gdrive, just have a script execute as one user and move ~700GB, then the next user, and so on through all 5 users on my GSuite?
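That rotation could be sketched as a small shell script, assuming each GSuite user is configured as its own rclone remote. The remote names (gdrive1: through gdrive5:), the staging path, and the target folder below are all hypothetical placeholders; --max-transfer and --drive-stop-on-upload-limit are real rclone flags:

```shell
#!/bin/sh
# Hedged sketch: rotate the nightly upload across five GSuite users, each
# configured as a separate rclone remote (gdrive1: .. gdrive5: are assumed
# names -- adjust to your rclone.conf). --max-transfer 700G stops each user
# just under the 750GB/day quota, and --drive-stop-on-upload-limit aborts
# cleanly if Google returns the rate-limit error anyway.
SRC="/mnt/local/temp"   # hypothetical local staging directory

plan_uploads() {
    for i in 1 2 3 4 5; do
        echo "rclone move $SRC gdrive$i:media" \
             "--max-transfer 700G --drive-stop-on-upload-limit"
    done
}

# Print the planned commands first; pipe through `sh` once they look right.
plan_uploads
```

Printing the commands before running them makes it easy to sanity-check the remote names and flags against your actual config.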
Sorry, the naming has changed a bit; it's called a "Shared Drive" now.
Once you have a Shared Drive, you can basically copy or cut and paste stuff into it. Like I said, it does take some time to populate depending on size.
I'd surmise you can upload 750GB per user per day to the Shared Drive. I'm unsure if there is a total size limit on a Shared Drive, but there is a limit on the number of files per Shared Drive.
Is there a way to list the total number of files and folders within a directory so I can see how close I am to the 400,000 limit? I saw an old thread that stated rclone size only counts files, not folders.
On the mount in Linux, the first command counts directories and the second counts files.
felix@gemini:/GD$ find . -type d | wc -l
felix@gemini:/GD$ find . -type f | wc -l
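The two counts can also be combined into one total to compare against the 400,000-object limit. count_objects below is a hypothetical helper wrapping the same find commands, not an rclone feature:

```shell
# Sketch: total directories + files under a mount, to compare against the
# 400,000-object Shared Drive limit (remember rclone size only counts files).
count_objects() {
    dirs=$(find "$1" -type d | wc -l | tr -d ' ')
    files=$(find "$1" -type f | wc -l | tr -d ' ')
    echo "dirs=$dirs files=$files total=$((dirs + files))"
}

count_objects .   # run against the mount point, e.g. /GD
```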
I'm just over 160,000, so a Shared Drive will be usable. Now I just need to decide if it's worth the trouble of moving everything to a Shared Drive to remedy what is hopefully a temporary problem. Before I moved to the Gdrive, I had strict file size limits in Sonarr, which resulted in some not-so-great releases being grabbed. I put the size limits back to the defaults and re-monitored a bunch of shows/seasons, and now it's upgrading hundreds of episodes per day without my searching for them, and I can't get the upload to catch up at 750GB/day.
Thank you for the help.
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.