I'm getting a ton of "Error computing hash: failed to open object to read nonce: bad response: 403: 403 Forbidden" today

I’m trying to cryptcheck googledrive, and today around 5-10% of all checks are giving this error; it used to be more like 0.01%-0.1%. Is anyone else having this problem today? Is there any way to automatically retry on these errors? Could there be? I used to get at most 1 or 2 403 errors on a big cryptcheck; now I’m getting hundreds of 403s on a small cryptcheck (too many to easily re-run manually).
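
For reference, the command I’m running is roughly like this (the local path and the secret: remote name are placeholders, not my real ones):

    # placeholder paths; checking a local dir against a crypt remote
    rclone cryptcheck /local/data secret:data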

This could be related to other problems I’ve had with googledrive lately, although I hope it’s not. I can’t think of what could be causing this.

edit: hmmm, I used --checkers 2, and anecdotally my very next cryptcheck had 0 errors.
Maybe Google is busy today?
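
Concretely, the rerun that worked was along these lines (same placeholder paths as above):

    # lower the number of parallel checkers from the default of 8
    rclone cryptcheck /local/data secret:data --checkers 2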

I too have seen a lot of 403 Forbidden errors when running cryptcheck in the past couple of days. I was considering opening an issue, because the --retries flag seems to be ignored, or doesn’t do anything, on these errors. I did add a generic --dump="headers,bodies,requests,responses,auth,filters" to the command one time, and I think I recall seeing that the actual error was something to do with being rate limited. The log file was huge, though, and I didn’t get a chance to parse all the way through it, so I could be wrong. I have written a wrapper script to capture failed files and retry cryptcheck specifically on those files (a sketch of the idea is below), so that’s how I get around it. It would be nice for the --retries flag to handle these errors, though.
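
Something like this rough sketch, for anyone curious (not my exact script; the remote/path names are placeholders and the ERROR-line parsing is an assumption about the rclone log format, so check yours first):

    #!/usr/bin/env bash
    # Rough sketch: capture cryptcheck failures, then retry only those files.
    # Adjust SRC/DST and the log parsing to your own setup.
    SRC="/local/data"
    DST="secret:data"
    LOG="cryptcheck.log"
    RETRY_LIST="retry-files.txt"

    # First pass: log everything so failures can be harvested afterwards.
    rclone cryptcheck "$SRC" "$DST" --log-file "$LOG" --log-level INFO

    # Pull the failing paths out of the log (format assumed to look like
    # "<timestamp> ERROR : <path>: Error computing hash: ...").
    grep ' ERROR ' "$LOG" | sed -E 's/.* ERROR *: ([^:]+):.*/\1/' > "$RETRY_LIST"

    # Second pass: re-run cryptcheck only on the files that failed,
    # using --files-from so the filter limits it to those paths.
    if [ -s "$RETRY_LIST" ]; then
        rclone cryptcheck "$SRC" "$DST" --files-from "$RETRY_LIST"
    fi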

If you use the latest beta, hopefully you’ll get a slightly more informative message.

I’d guess you’ve run out of some kind of quota though, or maybe Google have introduced a new quota.

You might want to experiment with --tpslimit 1, say, which will slow things right down but may get it to work.
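
For example (placeholder paths, same caveat as above):

    # cap the whole run at 1 HTTP transaction per second
    rclone cryptcheck /local/data secret:data --tpslimit 1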

Hmm, the retries flag at the moment is a bit of a blunt tool - it retries the whole lot, which isn’t really what you want if you are rate limited.

Note the latest beta will pick the error message up properly (I hope!).

Note also that rclone will be retrying each operation --low-level-retries times (10 by default).
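
A hedged way to combine those two suggestions (placeholder paths; 20 is just an illustrative value, not a recommendation):

    # slow down the request rate and allow more low-level retries per operation
    rclone cryptcheck /local/data secret:data --tpslimit 1 --low-level-retries 20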

It would be cool if, in the distant future, cryptcheck worked like copy and retried all errors at the end (at least if you used a retry flag to tell it to).

But yeah, --tpslimit 1 would probably work; --checkers 2 or --checkers 1 solved my problem. I absolutely think it must be a quota, but it’s one of those per-minute quotas, because I didn’t have to wait for it to reset. I just had to ctrl-c and repeat the command with --checkers 2 or --checkers 1 (which are similar to, but different from, how --tpslimit 1 works).
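
To spell out that difference (placeholder paths again):

    # --checkers caps concurrency: at most this many checks in flight at once
    rclone cryptcheck /local/data secret:data --checkers 1

    # --tpslimit caps request rate: at most this many HTTP transactions per second
    rclone cryptcheck /local/data secret:data --tpslimit 1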

Hmm, the retries flag at the moment is a bit of a blunt tool - it retries the whole lot, which isn’t really what you want if you are rate limited.

Yeah, it would be great if the program didn’t retry the whole lot and instead low-level retried these:
403 Forbidden, User Rate Limit, and “The download quota for this file has been exceeded”.