If I choose to encrypt some folders on my Google Drive, can I do this retroactively? I won't have to re-upload 20TB?
It took me 2-3 weeks to fully scan my Google Drive on Plex. Will I have to redo this after encryption?
rclone does the encryption client-side, so it has to download the data and re-upload it to get this done.
Assuming you keep the same path names and don't change that, you would not need to rescan in Plex, as the files themselves haven't changed.
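(A minimal sketch, assuming you serve the library to Plex through an rclone mount: you'd point the mount at the new crypt remote instead of the plain Drive remote and reuse the same mount point, so Plex sees identical paths. The remote name gcrypt: and the path /mnt/gdrive below are only examples.)

    rclone mount gcrypt: /mnt/gdrive --allow-other --read-only --daemon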
I just found out it's impossible anyway.
There is no way for me to download and reupload 20TB.
Why would it be impossible? You got it up there in the first place. You can always get a Google Compute micro node and just let it fly, as it would be pennies since all the traffic is free.
Wait, what? Can you please step me through that?
You have to re-upload one way or another, since Google Drive has no computing power to do the encryption for you.
However, as animosity points out, you could use Google Compute to do this for you. This is basically a small virtual machine located within the Google network, and therefore it is very well suited to something like this. Even that will need to re-upload the data to Drive, technically speaking, but within the Google network you have effectively unlimited bandwidth, so it becomes a lot more feasible for such a large amount of data.
I don't know all the details, but it's either free or very cheap depending on your needs (and your needs here are pretty tiny in terms of computing horsepower). If you understand what a virtual machine is, then you probably get the gist of how this would function; you only have to research how to set it up.
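To make that concrete, here's a rough sketch of what the rclone side could look like on such a VM. The remote names (gdrive, gcrypt) and the Media/encrypted paths below are just placeholders, and the secrets are normally filled in interactively by running rclone config:

    [gdrive]
    type = drive
    scope = drive
    token = <created by rclone config>

    [gcrypt]
    type = crypt
    remote = gdrive:encrypted
    filename_encryption = standard
    password = <set by rclone config>
    password2 = <set by rclone config>

With that in place, running the copy on the VM pulls each file down, encrypts it, and pushes it back up into the encrypted folder:

    rclone copy gdrive:Media gcrypt:Media --progress

Because the crypt remote just wraps the Drive remote, the encrypted copies land in the same Google Drive account; the only bandwidth involved is between the VM and Google, which is why the compute instance makes this practical.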
Yep, I would concur with the others. Use Google Compute Engine. Network speeds are faster than you can even use...
Look up a guide on how to launch a free-tier GCE instance. That won't cost you anything, and it has enough power to do what you need.
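If it helps, spinning one up from the Cloud SDK looks roughly like this; the instance name and zone are placeholders, and you should double-check which machine type and regions the always-free tier currently covers before counting on it being free:

    gcloud compute instances create rclone-box \
        --machine-type=f1-micro \
        --zone=us-central1-a \
        --image-family=debian-11 \
        --image-project=debian-cloud

Then SSH in (gcloud compute ssh rclone-box), install rclone, and set up your remotes there.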
Just remember, as you copy everything, use --bwlimit 8.5M so that you don't blow through your Google Drive upload limit. (8.5 will put you very close to the 750GB/day cap, so adjust accordingly.)
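(For the arithmetic behind that number: 750GB per day is roughly 750,000 MB ÷ 86,400 s ≈ 8.7 MB/s, so capping at 8.5 leaves a little headroom while still nearly saturating the daily quota.)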
You could do the whole 20TB in under a month, for free, for just a little bit of (worthwhile) legwork.
I use my free GCE instance for moving several GB per day with rclone to GDrive, as well as using it as a Pi-hole I can access from any network (mobile and wifi).
Why does it matter how fast I upload if I hit the 750GB eventually anyway?
Are you saying that by limiting the upload speed, Google Drive somehow lets you upload more than 750?
It's a common misconception in this forum. You can totally upload more than 750GB per day.
All uploads that have already begun will finish. So if you start 1,000 uploads of 4GB files, you can upload 4TB easily. It's only limited by the number of connections your system allows and by system memory, as each file transfer uses around 16MB of memory.
It gets trickier if you have a lot of small files. Since small files finish quickly and you can only start about 2 files per second, 750GB effectively stays the limit.
If you have a mixture of small and big files, upload the small ones first using --max-size 10M --max-transfer=600G, and then do hundreds of the big files.
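As a rough sketch of that two-pass approach (remote names and paths are placeholders; the thresholds are just the ones mentioned above):

    # pass 1: small files only, stopping short of the daily quota
    rclone copy gdrive:Media gcrypt:Media --max-size 10M --max-transfer 600G

    # pass 2: the big files; a higher --transfers count gets more of them
    # started before the quota kicks in
    rclone copy gdrive:Media gcrypt:Media --min-size 10M --transfers 16

--min-size and --transfers are standard rclone flags; whether in-flight transfers really run to completion past the quota is per Kuerbisken's observation above.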
That much I know.
What I'm confused about is why I would limit my upload to 8.5 when it's the same thing, just slower.
It shouldn't be a big deal if you leave it uncapped. All that will happen is that you eventually run into the rate limit and get errors. Setting a limit is just the "cleaner" way of doing it, by avoiding the errors altogether.
It can also save you from having to restart the sync several times. If a sync keeps erroring on the same file for long enough, it will eventually just fail, and you'd have to cover any missed files with at least one extra sync at the end to clean up whatever didn't go through. On a very large set of data it may take a non-trivial amount of time to re-check all the files. You could set retries to something very high to mitigate this, though...
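(In rclone terms that would just be the --retries flag, which re-runs the whole sync pass on failure, possibly combined with --low-level-retries for flaky individual transfers. The value and the remote names/paths below are only examples:

    rclone sync gdrive:Media gcrypt:Media --bwlimit 8.5M --retries 50

)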
But no, you won't get any more data through either way. I'm not certain whether hitting the rate limit actually works as a periodic ban (like an API ban does) and could thus produce less throughput over a long period versus setting a limit that stays barely under the max across 24 hours. Exactly how these limits are implemented is not documented in detail, as Google does not tell us; you would just have to test if you are curious. As Kuerbisken says, any already-started transfers will complete after you hit the rate limit, but unless all your files are very large, that rarely matters much.
IMO using crypt is almost always a good idea. I would not be surprised in the slightest if Google runs hash checks from time to time to detect abnormally large collections of potentially copyrighted material and cleans them up in batches (which may or may not apply to your large media archive). It seems like they aren't very aggressive about pursuing this, but 20TB is substantial and probably enough to draw attention if they suspect misuse, and losing 20TB of data would majorly suck.
You'd rate limit to avoid hitting the upload cap each day, to let it seamlessly run over a number of days without you worrying about the uploads.
If you don't rate limit, it'll finish fairly fast and just error over and over again, and the files that error out won't be copied, so you'd have to re-copy / sync again to catch the files that were missed.
From what anyone can tell, the 750GB limit forces you to pause uploading for 24hrs. If you don't rate limit, you'll hit that limit, error out, and then have to start the process over. Rate limiting saves you time, no question.
You said "There is no way for me to download and reupload 20tbs"
We are giving you a viable solution in how you could feasibly do that. If you do it with the 8.5M rate limit, you could start the process and just let it go on its own. It'll be done within the month, or so. If you don't rate limit, that means you get to log in every so often (daily?) and restart the process after it has errored out.