2020/10/30 19:20:59 DEBUG : rclone: Version "v1.51.0-163-ge2bf9145-beta" starting with parameters ["rclone" "rc" "vfs/refresh" "recursive=true" "--rc-addr" "127.0.0.1:5573" "--log-file" "/media/data/logs/scan_gcrypt.log" "-vv"]
2020/10/30 19:25:59 DEBUG : 3 go routines active
2020/10/30 19:25:59 Failed to rc: connection failed: Post "http://127.0.0.1:5573/vfs/refresh": net/http: timeout awaiting response headers
I just realized that it's not working anymore. It was working before, and I haven't changed my setup in quite a while.
Is it possible that I just need to raise --retries and/or --retries-sleep, since I have far more files in my setup now than in the beginning?
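A minimal sketch of what raising those flags on the refresh call would look like; the values are illustrative only, and the address/log file just mirror the ones from the log above:

```shell
# Illustrative values only: bump the retry count and the pause between
# retries for the refresh call. --retries and --retries-sleep are global
# rclone flags.
rclone rc vfs/refresh recursive=true \
  --rc-addr 127.0.0.1:5573 \
  --retries 10 \
  --retries-sleep 30s \
  -vv
```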
I can't find any wrong setting in my setup, and I am using the beta version in order to use the "--cutoff-mode" flag.
Still not working. Should I switch to log level DEBUG? (I am on INFO right now.)
I switched to cache mode writes (from full) and actually got better performance with it. I had some problems on full with my Nextcloud encrypted storage on my Google Drive.
OK, damn, now I am getting "error googleapi: Error 403: User Rate Limit Exceeded." But shouldn't that reset within one hour?
I'll need to try again later then. I never exceeded the limit before; it's because of the tons of API calls I made while trying to fix the rc refresh.
Edit: Actually, I don't get it. I am still able to stream Plex on my television while getting the error on the scan.
I checked the API console, and it's the "queries per 100 seconds" quota (1,000) that I am hitting while doing the refresh.
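For reference, a hedged sketch of capping the request rate so a recursive refresh stays under that per-100-seconds quota. --tpslimit and --tpslimit-burst are real rclone flags, but the value of 8, the remote name gcrypt:, and the mount point are my assumptions; the flags would go on the mount/daemon, since that is the process actually making the Drive API calls:

```shell
# Sketch (assumed names/values, see lead-in): limit rclone to ~8 API
# transactions per second, i.e. ~800 per 100 seconds, below the 1,000 quota.
rclone mount gcrypt: /media/data/mount \
  --tpslimit 8 \
  --tpslimit-burst 8 \
  -vv
```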
Setting a high timeout value does the trick for me: after hitting the limit, rclone just sleeps for a while and then carries on instead of aborting the command.
The scan takes very long this way, but it no longer ends in an error.
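For anyone hitting the same "timeout awaiting response headers" failure: the gap between the start line and the failure in the log above is exactly five minutes, which matches rclone's default --timeout of 5m0s. Raising it on the rc client could look like this (the 30m value is just an example, not from the post):

```shell
# Example (value is illustrative): give the rc client more than the default
# 5m to wait for /vfs/refresh to finish before the header timeout fires.
rclone rc vfs/refresh recursive=true \
  --rc-addr 127.0.0.1:5573 \
  --timeout 30m \
  --log-file /media/data/logs/scan_gcrypt.log \
  -vv
```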