How can I tell the difference between Google API errors? Is this 403 I'm receiving the 750GB upload limit?

Afternoon all,

I'm uploading my seedbox folder, which currently only contains 740GB, to Google Drive, and about 36 hours ago I hit the API ban with a 403 error. Initially there was a problem with my script that caused it to run multiple times, resulting in me uploading very large files many times over, so I'm not at all surprised that I triggered the error.

Now, as I understand it, the limitation is 750GB within a 24-hour window; however, it has now been 36 hours since I started getting 403 errors and they are still there. Is it possible it's not the 750GB limit that is affecting me? And is there a way of telling what the problem is?

This is the error I can see in my log:

Failed to copy: googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded
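
I'm running with -v at the moment. I assume bumping the verbosity to -vv would show a little more of the API response, something along these lines (the paths here are placeholders; my actual script is further down):

rclone copy /path/to/source gdrive:SomeFolder -vv --log-file=/path/to/rclone-upload.log

but so far the line above is all I've got to go on.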

I don’t recall your exact upload command, but I personally limit my uploads pretty low to avoid any 403s or anything else.

/usr/bin/rclone move /data/local/Movies/ gcrypt:Movies --checkers 2 --fast-list --syslog -v --tpslimit 2 --transfers 2

I only do 2 files at a time and limit the TPS as well. It takes a little longer, but it never 403s on me for uploading too quickly.

Are you sure no one else is doing anything if it’s a shared environment?

Thanks @Animosity022

My current script is this:

#!/bin/bash
# Lock file named after this script so only one copy runs at a time
LOCKFILE="/var/lock/$(basename "$0")"

(
    # Take an exclusive, non-blocking lock on file descriptor 9
    flock -n 9 || {
        echo "$0 already running"
        exit 1
    }

    /media/dma/craftyclown/bin/rclone copy ~/private/rtorrent/data/ "gdrive:/The Skull/Feral" -v --min-age 1m --log-file=/media/dma/craftyclown/rclone-upload.log

) 9>"$LOCKFILE"

The transfer limit seems sensible and I will also set a TPS limit. Do you happen to know what the default is, if I don’t set it?
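
For what it's worth, I'm thinking of bolting your --transfers 2 --tpslimit 2 onto my existing copy line, roughly like this (untested yet, and I may tune the values):

/media/dma/craftyclown/bin/rclone copy ~/private/rtorrent/data/ "gdrive:/The Skull/Feral" -v --min-age 1m --transfers 2 --tpslimit 2 --log-file=/media/dma/craftyclown/rclone-upload.log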

I also have Couch Potato, SickRage and Plex scanning the Google Drive; however, I didn't expect them to affect the upload limits. Or am I mistaken?

Also, what do --checkers 2 and --fast-list do, if you don't mind me asking?

EDIT: Sorry, that's very lazy of me when I can just look them up!

How are they scanning the drive? A mount? If you run ‘rclone’ with no options, it lists out all the defaults and such.

--fast-list                           Use recursive list if available. Uses more memory but fewer transactions.
--checkers int                        Number of checkers to run in parallel. (default 8)
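
Or, if you just want a specific flag's default, grepping the help output works too, something like:

rclone help flags | grep -i -E 'tpslimit|transfers'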

No, it's not a mount. Would you consider that a better option?
As it's still considered experimental and less reliable, I hadn't yet gone down that route, especially as I'm completely new to this and to Unix as a whole. I wanted to walk before I could run :smiley:
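
If I did eventually go the mount route, I'm guessing it would be something roughly along these lines (the mount point is just a made-up example and I haven't tried this yet):

mkdir -p ~/mnt/gdrive
rclone mount gdrive: ~/mnt/gdrive --read-only -v &

but I'd want to read up on it properly first.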

What are you pointing Sonarr/etc to for your GD if it isn’t mounted?

Apologies, I do have a mount, but not with rclone. I'm using NetDrive3 on my home server, which Plex, Couch Potato and SickRage all use to access Google Drive. I'm only using rclone on the seedbox to copy my files to Google Drive.

Google are a bit secretive about rate limits etc., so that is all you get in the way of an error message :frowning: