Hi all,
Sorry if this is yet another repeat thread, but I've read several posts about Google Drive throttling and still can't figure it out. I've tried every suggestion that seemed like it could be the cause.
I mount the remote and use hb (HashBackup) to store my backups there, but no matter what settings I choose, uploads get throttled to 166KB/s after 280-300GB uploaded.
I have also checked the Google API console and no quota limits or errors are being hit. I also tried rclone directly with -vvv and no errors are logged.
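For reference, a direct copy test that bypasses both the mount and hb can help isolate whether the throttling is mount-related. This is just a sketch; the file path and the `throttle-test` destination folder are placeholders:

```shell
# Upload one large test file straight to the remote, no mount involved.
# Watch the transfer speed in the periodic stats lines to see if it
# drops to ~166KB/s the same way the mounted uploads do.
rclone copy /path/to/large-testfile cr1:throttle-test \
  --drive-chunk-size 64M \
  --stats 10s \
  -vvv
```

If the direct copy throttles too, the problem is on the account/backend side rather than in the mount or VFS settings.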
I am using my own client ID.
Thanks
rclone v1.57.0
- os/version: ubuntu 20.04 (64 bit)
- os/kernel: 5.13.0-27-lowlatency (x86_64)
- os/type: linux
- os/arch: amd64
- go/version: go1.17.2
- go/linking: static
- go/tags: none
My latest command-line attempt is below (removing --checkers or --tpslimit doesn't help either):
rclone mount \
--user-agent 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.107 Safari/537.36 Edg/92.0.902.55' \
--checkers=12 \
--transfers=8 \
--multi-thread-cutoff=2M \
--multi-thread-streams=8 \
--fast-list \
--cache-dir /mnt/cache/rclone-cache \
--stats 1m \
--drive-chunk-size 8M \
--vfs-read-chunk-size 8M \
--vfs-read-chunk-size-limit off \
--vfs-cache-mode full \
--vfs-cache-poll-interval 20m \
--vfs-cache-max-age 24h \
--attr-timeout 24h \
--dir-cache-time 24h \
--bwlimit 8650k \
--rc \
--rc-web-gui \
--rc-web-gui-update \
--rc-user admin \
--rc-pass admin \
--vfs-cache-max-size 100G \
--retries 34560 \
--retries-sleep 5s \
--tpslimit 5 \
cr1:/ ~/remote/cr1 \
-vvv