Notes:
-database.zip is 3 GB in size
-I'm using a service account file with domain-wide delegation and impersonation, and there's plenty of space left in the Drive
RESULT:
-I've used this method for years and it worked fine; it stopped working at the same time it did for everyone else, sometime late last year.
-Sometimes it works on the first try or on retries.
-But most of the time it fails with the 429 error saying "your computer or network may be sending automated queries", just like in the post above.
Wondering if anyone knows if there's a reliable way to get this working again?
This likely won't help given what you're already using, but I thought I'd reply anyway.
Mine started failing again just before Christmas, after adding --drive-upload-cutoff 1000T had fixed it earlier.
I'm uploading multiple files and ended up needing to add a random sleep of 5-10 minutes between each file.
I still get the 429, but rclone's automatic retry gets the failed files up on the 2nd try.
This obviously isn't my entire script, but here's the useful part (Linux bash):
# Build the file list safely (handles spaces in filenames)
mapfile -d '' files < <(find "${source}" -type f -print0)
errorrclone=0
for i in "${files[@]}"
do
    rclone -vv --drive-impersonate "${guserdrive}" copy "${i}" "${dest}${today}" --drive-upload-cutoff 1000T --log-file="rclone-${today}.log"
    x=${?}
    errorrclone=$(( errorrclone + x ))
    # random pause between 5 and 10 minutes
    rand=$(shuf -i 300-600 -n 1)
    sleep "${rand}"
done
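For what it's worth, rclone also has built-in retry and pacing flags that can take over some of the manual sleep logic. This is only a sketch; the remote name, paths, and the specific flag values are assumptions to adapt to your own setup:

```shell
# Sketch: let rclone handle retries and request pacing itself.
# "gdrive:" and the paths are placeholders; adjust to your setup.
rclone copy /local/backup gdrive:backup \
    --drive-upload-cutoff 1000T \
    --retries 5 \
    --retries-sleep 60s \
    --tpslimit 2 \
    -vv --log-file rclone.log
```

--retries re-runs the whole operation on failure, --retries-sleep waits between those attempts, and --tpslimit caps transactions per second, which may help stay under whatever threshold triggers the 429.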
Thanks, appreciate it. Do we have any idea of the longer-term implications? Is it realistic to think Google will fix this, or that we could come up with something in rclone?
Just started getting this error; uploading worked perfectly until yesterday. I tried some of the workarounds from the other thread on a mount and it seems to be working fine again. Trying to upload (copy/move) directly still fails most of the time, even with the workaround flags.
This is happening on all my boxes at Hetzner; rclone still works like normal on the servers I have with other hosts (smaller providers, nothing large like Hetzner/AWS/Azure).
My workaround is to use a non-blocked server, set up my blocked server as an sftp remote, and then just run rclone copy blocked:/bak/ gdrive:bak/. It's the only reliable way I could come up with; hoping those don't get rate-limited too.
Wondering if Google is starting to roll out more limits. Maybe it only affects certain users so far? While searching I only found the couple of threads here that mention the error.
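For anyone wanting to replicate this relay setup, a minimal sketch follows. The hostname, user, key path, and remote names are all placeholders, and gdrive: is assumed to be an already-configured Google Drive remote:

```shell
# On the non-blocked server: define the blocked box as an sftp remote.
# Host, user, and key_file are placeholders; adjust to your setup.
rclone config create blocked sftp \
    host blocked.example.com \
    user backup \
    key_file /home/backup/.ssh/id_ed25519

# Then pull from the blocked server and push to Drive via the clean IP.
rclone copy blocked:/bak/ gdrive:bak/
```

The upload to Google originates from the non-blocked server's IP, which is the whole point of the workaround.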
They seem to block uploads above a certain file size; files larger than 2 or 3 GB fail. Google applied this restriction to AWS/Azure IPs two months ago, and it looks like they're now doing the same to Hetzner IPs, lol. Good luck getting it fixed.
Just in case people don't see the other thread: @DeuX came up with the solution here. Simply change your default DNS server to something other than the Hetzner servers.
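On a typical Hetzner box running systemd-resolved, switching away from the Hetzner recursors might look like this. A sketch only: the interface name and the choice of public resolvers are assumptions, not part of the original fix:

```shell
# Point systemd-resolved at public resolvers instead of Hetzner's.
# "eth0" and the resolver IPs are assumptions; adjust as needed.
sudo resolvectl dns eth0 8.8.8.8 1.1.1.1
sudo resolvectl flush-caches

# Or, on systems managing a plain /etc/resolv.conf directly:
# echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf
```

After the change, Drive API hostnames may resolve to different front-end IPs, which appears to be why this sidesteps the 429s.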