Hi guys, just reporting that I was getting daily 403 errors from Google Drive without hitting the download quota, and I fixed them using a union remote with the eprand search policy!
My use case is simple: I have a 1 Gbps line and I know the average bitrate of all my files. I want to open as many files as possible while making sure no file drops below X speed.
We assume the rclone backend can sustain these speeds at all times.
The key insight: you can hit 403 errors without exhausting the download quota if you open the same file too many times. Using a union remote with multiple upstreams and a random search policy (eprand) spreads those opens across remotes, so no single remote sees repeated opens of the same file.
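For reference, here's a minimal rclone.conf sketch of that setup. The remote names (gdrive1, gdrive2, union) are just placeholders for your own remotes; each upstream should point at the same Drive content:

```ini
# Two (or more) Google Drive remotes pointing at the same files
[gdrive1]
type = drive
# ... your auth settings ...

[gdrive2]
type = drive
# ... your auth settings ...

# Union remote that picks a random upstream for each open
[union]
type = union
upstreams = gdrive1: gdrive2:
# eprand = "existing path, random": among upstreams that have
# the file, choose one at random for each read
search_policy = eprand
```

Then mount or read from `union:` instead of a single Drive remote, and repeated opens of the same file get distributed across the upstreams.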
Anyone who runs into the same problem should be able to work out the rest from the information above.