Continued Drive Errors: 429 and "your computer or network may be sending automated queries"

Hi,

This issue has been discussed extensively here, but I've followed the latest tips and still can't get rclone to reliably upload a 3 GB file to Google Drive.

VERSION:
rclone v1.61.1

  • os/version: Microsoft Windows Server 2016 Datacenter 1607 (64 bit)
  • os/kernel: 10.0.14393.4886 (x86_64)
  • os/type: windows
  • os/arch: amd64
  • go/version: go1.19.4
  • go/linking: static
  • go/tags: cmount

COMMAND:
copy "C:\Data\Backups\database.zip" "Drive:3-RD\database.zip" --tpslimit 1 --drive-upload-cutoff 1000T

Notes:
- database.zip is 3 GB in size
- I'm using a service account file with domain-wide delegation and impersonation, and there's plenty of space left in Drive.

RESULT:
- I've used this method for years and it worked; it stopped working at the same time it did for everyone else, sometime late last year.

- Sometimes it works on the first try or on retries.

- But most of the time it fails with the 429 error that says "your computer or network may be sending automated queries", just like in the post above.

Does anyone know of a reliable way to get this working again?

Many many thanks to the community!

This likely won't help based on what you're using already but I thought I'd reply anyway.

Just before Christmas mine started failing again, even though adding --drive-upload-cutoff 1000T had fixed it previously.

I'm uploading multiple files and ended up needing to add a random sleep of 5-10 minutes between each file.
I still get the 429, but rclone's auto-retry gets the failed files up on the second try.
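For reference, rclone's retry behaviour can also be tuned directly; this is just a sketch, and the flag values here are illustrative, not recommendations:

```shell
# rclone retries the whole transfer up to --retries times (default 3),
# pausing --retries-sleep between attempts; --low-level-retries covers
# retries of individual HTTP operations (default 10).
rclone copy "C:\Data\Backups\database.zip" "Drive:3-RD\database.zip" \
  --retries 5 --retries-sleep 60s --low-level-retries 20
```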

This obviously isn't my entire script, but it's the useful part (Linux bash):

# Note: this simple loop breaks on filenames containing spaces
files=$(find "${source}" -type f)
errorrclone=0
for i in ${files}
do
	rclone -vv --drive-impersonate "${guserdrive}" copy "${i}" "${dest}${today}" --drive-upload-cutoff 1000T --log-file="rclone-${today}.log"
	x=${?}
	# Accumulate exit codes so a non-zero total flags that something failed
	errorrclone=$(( errorrclone + x ))
	# Sleep a random 5-10 minutes (300-600 seconds) between uploads
	rand=$(shuf -i 300-600 -n 1)
	sleep "${rand}"
done

This has been working reliably for almost a month.

Thanks, appreciate it. Do we have any idea of the longer-term implications? Is it realistic to think Google will fix this, or that we could come up with something in rclone?

Don't hope for it to be fixed. Google has somehow placed restrictions on Azure/AWS IPs, which is why those uploads fail. It's been almost two months already.

Just started getting this error, uploading worked perfectly until yesterday. I tried some of the workarounds on the other thread on a mount and it seems to be working fine again. Trying to upload (copy/move) directly still fails most of the time even with the workaround flags.

This is happening on all my boxes at Hetzner; rclone still works like normal on the servers I have with other hosts (smaller providers, not large ones like Hetzner/AWS/Azure).

My workaround is to use a non-blocked server, set up the blocked server as an SFTP remote on it, and then just run rclone copy blocked:/bak/ gdrive:bak/. It's the only reliable approach I could come up with; I'm hoping those don't get rate-limited too.
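For anyone wanting to replicate that relay setup, a rough sketch (the remote name "blocked", the hostname, user, and paths are placeholders, and this assumes key-based SSH auth is already in place):

```shell
# On the non-blocked server, define the blocked box as an SFTP remote.
rclone config create blocked sftp host=blocked.example.com user=backup key_file=~/.ssh/id_ed25519

# Then relay: rclone pulls from the blocked server over SFTP and pushes
# to Google Drive, so Google only ever sees the relay's IP.
rclone copy blocked:/bak/ gdrive:bak/ -v
```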

Wondering if Google is starting to roll out more limits. Maybe it only affects certain users so far? While searching, I only found the couple of threads here that mention the error.

The error links to this, might be a placeholder for now: "Unusual traffic from your computer network" - Google Search Help

They seem to block uploads above a certain file size; files larger than 2 or 3 GB fail. Google applied this restriction to AWS/Azure IPs two months ago, and it looks like they're now doing the same to Hetzner IPs. Good luck getting it fixed.

Just in case people don't see the other thread: @DeuX came up with the solution here. Simply change your default DNS server to something other than the Hetzner servers.
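If it helps, on a typical Hetzner Linux box that can look something like this (a sketch only; it assumes /etc/resolv.conf is not managed by systemd-resolved or cloud-init, and the public resolver addresses are just examples):

```shell
# Replace Hetzner's recursive resolvers with public ones (example addresses).
printf 'nameserver 1.1.1.1\nnameserver 8.8.8.8\n' | sudo tee /etc/resolv.conf
```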