I have a commonly accessed, must-read file that is hitting the Google Drive rate limit (403). To mitigate this, I am hoping that by copying a portion of my remote directory structure to a local folder, I can reduce the load that searches and accesses place on the remote, since that portion of each lookup would be satisfied locally.
My setup is a union consisting of a remote and a local folder (i.e. gDrive: bin/Local). Copying the top-level directories and needed files down to a max depth of 2 or 3 levels would retain the same structure and (I hope!) reduce the number of drive.get calls being performed, since searches and accesses would only need to hit the remote for directories at depth 4 and below. A sketch of the setup I have in mind is below.
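For reference, here is a minimal sketch of that kind of union, assuming rclone's union backend with the local mirror listed first and the first-found search policy, so lookups stop at the first upstream that has the path. The remote name and local paths here are placeholders, not my actual config:

```
# rclone.conf (hypothetical names and paths)
[gunion]
type = union
# local mirror first, so shallow lookups resolve locally before hitting gDrive:
upstreams = /home/user/gdrive-mirror gDrive:
# ff = first found: stop searching at the first upstream containing the path
search_policy = ff
```

Pre-seeding the shallow part of the tree into the local mirror would then be something like:

```
# copy only the top 3 levels of the remote tree into the local mirror
rclone copy gDrive:bin /home/user/gdrive-mirror/bin --max-depth 3
```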
I would test this myself, but I am currently rate-limited and waiting for it to expire (and may be waiting it out by going to bed soon!). I searched the forums but could not find whether anything like this has been tried before. Would someone mind trying this out and reporting back their results?