What is the problem you are having with rclone?
Hitting the 750 GB upload limit almost instantly.
What is your rclone version (output from `rclone version`)
```
rclone v1.51.0-114-g472d4799-beta
- os/arch: linux/amd64
- go version: go1.14
```
Which OS you are using and how many bits (eg Windows 7, 64 bit)
Ubuntu 18.04, 64 bit
Which cloud storage system are you using? (eg Google Drive)
Google Drive
The command you were trying to run (eg `rclone copy /tmp remote:tmp`)
```bash
#!/bin/bash
# Exit if another instance of this script is already running
if pidof -o %PPID -x "$0"; then
    exit 1
fi

LOGFILE="/home/hobbits/scripts/logs/rclone-move.log"
FROM="/home/hobbits/gdrivecrypt/"
TO="gdrive:/Plex"

# CHECK FOR FILES IN FROM FOLDER THAT ARE OLDER THAN 15 MINUTES
start=$(date +'%s')
echo "$(date "+%d.%m.%Y %T") Checking for files" | tee -a "$LOGFILE"
if find "$FROM"* -type f -mmin +15 | read; then
    start=$(date +'%s')
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD STARTED" | tee -a "$LOGFILE"
    # MOVE FILES OLDER THAN 15 MINUTES
    rclone -vv move "$FROM" "$TO" --fast-list --bwlimit 5.375M --delete-after --min-age 15m --delete-empty-src-dirs --log-file="$LOGFILE"
    echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD FINISHED IN $(($(date +'%s') - $start))" | tee -a "$LOGFILE"
fi
exit
```
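For reference, the age check the script relies on can be exercised on its own. This is a minimal sketch, not part of the original script; the throwaway `mktemp` directory and the file names are made up for illustration:

```shell
#!/bin/sh
# Sketch of the "files older than 15 minutes" check used above.
# A throwaway directory stands in for $FROM; file names are illustrative only.
FROM=$(mktemp -d)

# One file backdated 20 minutes, one brand new (GNU touch -d).
touch -d '20 minutes ago' "$FROM/old.mkv"
touch "$FROM/new.mkv"

# find -mmin +15 matches only files modified MORE than 15 minutes ago,
# so only old.mkv is listed; new.mkv is skipped.
find "$FROM" -type f -mmin +15

rm -rf "$FROM"
```

Piping that `find` into `read`, as the script does, makes the `if` succeed as soon as at least one matching file exists.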
A log from the command with the `-vv` flag (eg output from `rclone -vv copy /tmp remote:tmp`)
```
18.03.2020 20:32:52 Checking for files
18.03.2020 20:32:52 RCLONE UPLOAD STARTED
2020/03/18 20:32:52 DEBUG : --min-age 15m0s to 2020-03-18 20:17:52.436900107 +0100 CET m=-899.986132962
2020/03/18 20:32:52 DEBUG : rclone: Version "v1.51.0-114-g472d4799-beta" starting with parameters ["rclone" "-vv" "move" "/home/hobbits/gdrivecrypt/" "gdrive:/Plex" "--fast-list" "--delete-after" "--min-age" "15m" "--delete-empty-src-dirs" "--log-file=/home/hobbits/scripts/logs/rclone-move.log"]
2020/03/18 20:32:52 DEBUG : Using config file from "/home/hobbits/.config/rclone/rclone.conf"
2020/03/18 20:33:53 INFO :
Transferred:        0 / 0 Bytes, -, 0 Bytes/s, ETA -
Elapsed time:       0.0s
2020/03/18 20:34:24 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:24 DEBUG : pacer: Rate limited, increasing sleep to 1.034429042s
2020/03/18 20:34:24 DEBUG : pacer: Reducing sleep to 0s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 1.391821499s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 2.379802154s
2020/03/18 20:34:25 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=850951186234, userRateLimitExceeded)
2020/03/18 20:34:25 DEBUG : pacer: Rate limited, increasing sleep to 4.830019909s
2020/03/18 20:34:26 DEBUG : pacer: Reducing sleep to 0s
```
I am trying to decrypt all my drive storage. For a long time the script just ran, and every day it hit the 750 GB limit, which I expected. I am aware of that limit, but recently it hits the 750 GB limit almost instantly, as soon as it tries to upload the first file, and I don't upload anything else. My account is not doing anything else either. If someone could give me a little help here, it would be much appreciated.