Trying to decrypt my Gdrive

What is the problem you are having with rclone?

Hitting the limit nearly instantly.

What is your rclone version (output from rclone version)

rclone v1.51.0-114-g472d4799-beta

  • os/arch: linux/amd64

  • go version: go1.14

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 18.04, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

```
if pidof -o %PPID -x "$0"; then
    exit 1
fi

LOGFILE="/home/hobbits/scripts/logs/rclone-move.log"
FROM="/home/hobbits/gdrivecrypt/"
TO="gdrive:/Plex"

# CHECK FOR FILES IN FROM FOLDER THAT ARE OLDER THAN 15 MINUTES
start=$(date +'%s')
echo "$(date "+%d.%m.%Y %T") Checking for files" | tee -a $LOGFILE
if find $FROM* -type f -mmin +15 | read
then
    start=$(date +'%s')
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD STARTED" | tee -a $LOGFILE
    # MOVE FILES OLDER THAN 15 MINUTES
    rclone move "$FROM" "$TO" --fast-list --bwlimit 5.375M --delete-after --min-age 15m --delete-empty-src-dirs --log-file=$LOGFILE -vv
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD FINISHED IN $(($(date +'%s') - $start)) SECONDS" | tee -a $LOGFILE
fi
exit
```
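The pidof check at the top exits early if another copy of the script is already running, which only matters if the script is triggered on a schedule. A hypothetical crontab entry for that (the script path and interval here are assumptions, not taken from the post) would be:

```
# run the move script every 30 minutes; path and interval are assumptions
*/30 * * * * /home/hobbits/scripts/rclone-move.sh
```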

A log from the command with the -vv flag (eg output from rclone -vv copy /tmp remote:tmp)

18.03.2020 20:32:52 Checking for files
18.03.2020 20:32:52 RCLONE UPLOAD STARTED
2020/03/18 20:32:52 DEBUG : --min-age 15m0s to 2020-03-18 20:17:52.436900107 +0100 CET m=-899.986132962
2020/03/18 20:32:52 DEBUG : rclone: Version "v1.51.0-114-g472d4799-beta" starting with parameters ["rclone" "-vv" "move" "/home/hobbits/gdrivecrypt/" "gdrive:/Plex" "--fast-list" "--delete-after" "--min-age" "15m" "--delete-empty-src-dirs" "--log-file=/home/hobbits/scripts/logs/rclone-move.log"]
2020/03/18 20:32:52 DEBUG : Using config file from "/home/hobbits/.config/rclone/rclone.conf"
2020/03/18 20:33:53 INFO  : 
Transferred:   	         0 / 0 Bytes, -, 0 Bytes/s, ETA -
Elapsed time:         0.0s

2020/03/18 20:34:24 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:24 DEBUG : pacer: Rate limited, increasing sleep to 1.034429042s
2020/03/18 20:34:24 DEBUG : pacer: Reducing sleep to 0s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 1.391821499s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 2.379802154s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 4.830019909s
2020/03/18 20:34:26 DEBUG : pacer: Reducing sleep to 0s

I am trying to decrypt all my Drive storage. For a long time it just ran, and every day it hit the 750 GB limit. I am aware of the limit, but recently it hits the 750 GB limit nearly instantly, as in when trying to upload the first file, and I don't upload anything else. My account is not doing anything else either. If someone could give me a little help here, it would be much appreciated.

Is that the whole log? Those are normal 403s, as you are just hitting the API too fast. You only get about 10 API transactions per second, so if anything else is using the same client ID and quota, you'd need to tone it down.

You can reduce the checkers to maybe 4 and see how that works. It's a bit of trial and error, as you want to push the API, but not so hard that you get too many retries.

This is just a log to represent the errors I get; the script ran for about 5 minutes. I have never used checkers before. How do I use them?

That's all listed under here:

https://rclone.org/flags/

      --checkers int                         Number of checkers to run in parallel. (default 8)

so just add

--checkers 4

to your command and see how that works.
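For reference, using the flags from the log above, the full command with the extra flag would look something like this:

```
rclone move "/home/hobbits/gdrivecrypt/" "gdrive:/Plex" \
  --fast-list --delete-after --min-age 15m --delete-empty-src-dirs \
  --checkers 4 \
  --log-file=/home/hobbits/scripts/logs/rclone-move.log -vv
```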

I will try to run the script with the checkers flag; if it doesn't work, I will come back with a log.

Still filled with errors.

Here's a pastebin: https://pastebin.com/tVQ4LV0m

Did you set up your own client ID?

My config looks like this:

```
[gdrive]
type = drive
client_id = alotofnumbersandletters.apps.googleusercontent.com
client_secret = SECRET
scope = drive
token = {"access_token":"Access token goes here"}
root_folder_id = 0ALmiA5a0IYFnUk9PVA

[gdrivecrypt]
type = crypt
remote = gdrive:/Private
filename_encryption = standard
directory_name_encryption = true
password = PASSWORD
password2 = PASSWORD2
```
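With this config, gdrivecrypt: is a crypt wrapper around gdrive:/Private, so reading through it gives decrypted names and contents. A minimal sketch of how that ties into the script above, assuming /home/hobbits/gdrivecrypt is a mount of the crypt remote (a mount is mentioned later in the thread, but the exact command is an assumption):

```
# list the decrypted contents of the crypt remote (remote names from the config above)
rclone lsd gdrivecrypt:

# mount the crypt remote at the path the move script reads from (assumed mount point)
rclone mount gdrivecrypt: /home/hobbits/gdrivecrypt --daemon
```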

Did you create your own client ID though or just use the standard rclone one?

I created the client ID through developers.google.com.

To give you a little insight into the fix: I just deleted my old project and created a new one with new OAuth2 credentials, and now it works.
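For anyone following along, after creating new OAuth2 credentials the remote needs the new client_id/client_secret and a fresh token; a rough sketch (remote name taken from the config above):

```
# edit the [gdrive] section of rclone.conf with the new client_id and client_secret,
# then re-authorize the remote to fetch a new token
rclone config reconnect gdrive:
```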

Excellent. It did seem like either you weren't using your own client ID or it was not working. Happy that seemed to solve the issue, and I'll remember that.

My next question was going to be to check the API console and see if you were seeing hits, but I guess that was a no.

Just to be clear, the mount worked and all that, and I did see hits on the old OAuth, so no idea what went wrong. Anyway, it works now.
