Hi guys, just reporting that I was getting daily 403 errors from Google Drive without ever hitting the download quota, and I fixed them using a union remote with the `eprand` search policy!
If anyone is hitting the same issue, you need to load-balance the reads across multiple remotes.
Rather than a Help and Support post, would you mind writing up your use case and the steps you used in a How To post?
My use case is simple: I have a 1 Gbps line and I know the average bitrate of all the files. I want to open as many files as possible while making sure no file drops below a speed of X.
We assume that our rclone backend will be able to sustain the speeds at all times.
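Under that assumption, the arithmetic is just the line capacity divided by the per-file floor. A quick sketch (the numbers here are illustrative, since the thread never states X):

```python
# Capacity check: how many files can stay open at once before some
# stream would have to drop below the minimum speed X?
line_mbps = 1000        # 1 Gbps line, as in the use case above
min_stream_mbps = 40    # X: assumed minimum acceptable speed per file

# Each open file needs at least X Mbps, so the line supports at most
# line_mbps / X concurrent streams.
max_open_files = line_mbps // min_stream_mbps
print(max_open_files)   # 25
```

Swap in your own value of X (ideally the average file bitrate, so every stream plays back in real time) to get your concurrency ceiling.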
403 errors can be hit without using up the download quota if you open the same file too many times through one remote. A union of multiple remotes with the `eprand` (random) search policy spreads those opens across remotes and avoids the limit.
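A minimal rclone.conf sketch of that setup. The upstream names (`gdrive1:` etc.) are placeholders for remotes you have already configured; in setups like this, each upstream typically points at the same Drive content but with its own credentials so the per-client limits are tracked separately:

```ini
# Hypothetical union remote load-balancing reads across three
# pre-existing Google Drive remotes: gdrive1:, gdrive2:, gdrive3:
[union]
type = union
upstreams = gdrive1: gdrive2: gdrive3:
# eprand: pick a random upstream (among those that have the path)
# for each search, which spreads file opens across remotes
search_policy = eprand
```

Then point your reads at `union:` instead of a single Drive remote.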
Anyone who gets to where I am now will know what to do with the information I've provided here.
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.