I am looking at putting a reasonably sized collection on Google Drive, but I moved away from it in the past because Plex needed to rescan my folders and kept hitting API and download limits, since every file it touched counted as downloading the whole file.
I was wondering whether a distributed cache, using something like the free tier of the Firebase Realtime Database, would be useful for people running separate download and playback machines. With a central store of the entire file structure that never expires and is only updated on change, you would not hit API limits except when actually uploading or downloading files.
It would not suit everyone or every situation, but it should work well for people running encrypted remotes, since in that case only the encrypted remote itself can make changes, no matter which computer mounts it.
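To make the idea concrete, here is a toy sketch of what I mean (not real rclone or Firebase code; all names are made up, and a plain dict stands in for the realtime database): the machine that owns the drive does one full scan and then pushes individual changes, while playback machines serve directory listings entirely from the shared cache.

```python
# Hypothetical sketch of the shared metadata cache idea. The uploader
# scans once and pushes per-file changes; playback machines read the
# cached tree and never touch the drive's list/metadata APIs.

class MetadataCache:
    def __init__(self):
        self.tree = {}        # path -> metadata, mirrors the central store
        self.api_calls = 0    # count of real drive API hits

    def initial_scan(self, drive_listing):
        """One-time full scan, done only by the machine that owns the drive."""
        for path, meta in drive_listing.items():
            self.api_calls += 1
            self.tree[path] = meta

    def apply_change(self, path, meta=None):
        """Push from the uploader: update or delete a single entry."""
        if meta is None:
            self.tree.pop(path, None)
        else:
            self.tree[path] = meta

    def list_dir(self, prefix):
        """Playback machines read from the cache: zero API calls."""
        return {p: m for p, m in self.tree.items() if p.startswith(prefix)}


cache = MetadataCache()
cache.initial_scan({"/media/a.mkv": {"size": 1}, "/media/b.mkv": {"size": 2}})
cache.apply_change("/media/c.mkv", {"size": 3})   # uploader adds a file
listing = cache.list_dir("/media/")               # a Plex rescan, no API hits
```

In the real version the dict would be the Firebase database and `apply_change` would be driven by its change listeners, so every mount sees updates without rescanning.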
I am not sure whether this is still an issue, but I assume it is, because the guides suggest turning off a lot of the Plex features that scan the filesystem for files, or you get 24-hour bans, which I assume come from exceeding the API rate limits.
I was wondering whether this is a good idea or not. I have been trying to work out how something like this could be plugged in, but I am not a Go developer, so I would not attempt it unless others thought it was worthwhile.