I’m copying my seedbox folder, which currently holds only about 740 GB, to Google Drive, and about 36 hours ago I hit the API ban with a 403 error. Initially there was a problem with my script that caused it to run multiple times, so I uploaded some very large files over and over; I’m not surprised at all that I triggered the error.
Now, as I understand it, the limit is 750 GB within a 24-hour window, but it has been 36 hours since I started getting 403 errors and they are still happening. Is it possible it’s not the 750 GB limit that’s affecting me? And is there a way of telling what the problem is?
This is the error I can see in my log:
Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
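For reference, rerunning a small copy with debug logging prints the full API response, and the reason string in it (e.g. userRateLimitExceeded vs. dailyLimitExceeded) should narrow down which quota is being hit. A minimal sketch; the paths and the gdrive: remote name are just examples:

```
# Retry a single small file with debug output and a log file so the
# full 403 response from the Drive API is captured.
rclone copy /home/user/testfile.txt gdrive:debug-test -vv --log-file=rclone-debug.log
```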
How are you scanning the drive? A mount? If you run ‘rclone’ with no options, it lists out all the defaults and such:
--fast-list Use recursive list if available. Uses more memory but fewer transactions.
--checkers int Number of checkers to run in parallel. (default 8)
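If the 403s are coming from the sheer number of API transactions rather than the daily upload quota, throttling helps. A rough sketch of the idea; the paths and flag values here are illustrative, not a recommendation:

```
# Fewer parallel checkers/transfers plus a hard cap on API
# transactions per second; adjust the numbers to taste.
rclone copy /seedbox/complete gdrive:media \
  --fast-list \
  --checkers 4 \
  --transfers 4 \
  --tpslimit 4 \
  -v
```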
No, it’s not a mount. Would you consider that a better option?
As it’s still considered experimental and less reliable, I hadn’t gone down that route yet, especially as I’m completely new to this and to Unix as a whole. I wanted to walk before I could run.
Apologies, I do have a mount, but not with Rclone. I’m using NetDrive3 on my home server, which Plex, CouchPotato and SickRage all use to access GDrive. I’m only using Rclone on the seedbox to copy my files to GDrive.
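If I do eventually swap NetDrive3 for an rclone mount on the home server, I gather it’s roughly this (a sketch only; the remote name and mount point are assumptions, and --allow-other needs user_allow_other enabled in /etc/fuse.conf):

```
# Read-only mount so Plex/CouchPotato/SickRage can read but never
# modify the remote; backgrounded with & since mount runs in the foreground.
mkdir -p /mnt/gdrive
rclone mount gdrive: /mnt/gdrive --read-only --allow-other &
```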