Advice on hashing via an rclone mount

I am trying to rehash a large amount of data stored on a Google Drive (GD) crypt remote.

It seems like GD interprets the multiple chunk requests as multiple users and quickly starts returning 403 errors: "The download quota for this file has been exceeded".

I have been trying many things, like setting a large low-level retry count (100), but rclone just burns through the retries within seconds.

The chunk size has been 32M and 64M, with the same result.

Is there anything I can do to work around this?
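For context, the rehash amounts to reading every file back through the mount and hashing it locally, roughly like this (illustrative only, not my exact command):

find /media/major -type f -exec md5sum {} + > /opt/rclone/hashes.md5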

What rclone version are you running?

Did you make your own API key/client ID?

https://rclone.org/drive/#making-your-own-client-id

Do you need the cache for something, or can you just mount directly?

Yes, I use my own API key/client ID.

This is just mounting directly. I tested with cache but it was exceptionally slow. Maybe I should try again?

What is your mount command?
Can you run it with -vv and share the log? That will be the debug log, and we can see what's going on.
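Something along these lines, with your own remote, mount point, and log path substituted in:

rclone mount yourremote: /path/to/mount -vv --log-file /tmp/rclone.log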

If you have your own key, it's very unlikely you'll see 403s in your mount logs.

Unless you're using an older version. Can you share your version?

I read on one of the man pages that repeatedly requesting chunks causes 403 errors. I have just now seen that there is "--vfs-read-chunk-size", which might help me.

I do not write to the mount, so I thought the VFS flags were not relevant, but maybe that one is?
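If I'm reading the docs right, that would mean adding something like this to the direct mount (an untested guess at values on my part):

rclone mount bear_crypt: /media/major \
--vfs-read-chunk-size 64M \
--vfs-read-chunk-size-limit 2G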

Definitely using 1.47. I am trying the cache again now, so I don't have the log at the moment, but this was the command I last used:

/usr/bin/rclone mount bear_crypt: /media/major \
--allow-non-empty \
--use-mmap \
--allow-other \
--low-level-retries 100 \
--dir-cache-time 96h \
--drive-chunk-size 64M \
--log-level INFO \
--log-file /opt/rclone/logs/rclone.log \
--timeout 1h \
--umask 002 \
--rc

and this is the one I am testing cache with now:

rclone mount bear_cache_crypt: /media/major \
--read-only \
--allow-non-empty \
--allow-other \
--fast-list \
--use-mmap \
--buffer-size 0 \
--cache-workers 16 \
--cache-chunk-path /var/rclone/rclone-cache \
--cache-db-path /var/rclone/rclone-cache \
--cache-chunk-size 64M \
--cache-chunk-total-size 50G \
--dir-cache-time 96h \
--log-level INFO \
--log-file /opt/rclone/logs/rclone.log \
--timeout 1h \
--umask 002 \
--low-level-retries 100 \
--cache-db-purge \
--rc

Hmm, these cache settings are not good. Rclone keeps crashing and restarting:

goroutine 2725 [chan receive]:
runtime.gopark(0x13f1b10, 0xc0ab8b31d8, 0xc00018170d, 0x3)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:301 +0xef fp=0xc0ab7f8e38 sp=0xc0ab7f8e18 pc=0x42f4cf
runtime.goparkunlock(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:307
runtime.chanrecv(0xc0ab8b3180, 0xc0ab7f8f68, 0x1, 0xc0001869c0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:524 +0x2ea fp=0xc0ab7f8ec8 sp=0xc0ab7f8e38 pc=0x405e1a
runtime.chanrecv2(0xc0ab8b3180, 0xc0ab7f8f68, 0xb8)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:411 +0x2b fp=0xc0ab7f8ef8 sp=0xc0ab7f8ec8 pc=0x405b1b
github.com/ncw/rclone/backend/cache.(*worker).run(0xc018237f00)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:394 +0x7e fp=0xc0ab7f8fd8 sp=0xc0ab7f8ef8 pc=0x9d983e
runtime.goexit()
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/asm_amd64.s:1337 +0x1 fp=0xc0ab7f8fe0 sp=0xc0ab7f8fd8 pc=0x45ca91
created by github.com/ncw/rclone/backend/cache.(*Handle).scaleWorkers
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:140 +0x160

goroutine 2726 [runnable]:
syscall.Syscall(0x0, 0x28, 0xc4a8000000, 0x4000200, 0x4000000, 0x4000200, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/syscall/asm_linux_amd64.s:18 +0x5 fp=0xc0da456c00 sp=0xc0da456bf8 pc=0x4b2dd5
syscall.read(0x28, 0xc4a8000000, 0x4000200, 0x4000200, 0x0, 0xc4a8000000, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/syscall/zsyscall_linux_amd64.go:732 +0x5a fp=0xc0da456c58 sp=0xc0da456c00 pc=0x4b06ba
syscall.Read(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/syscall/syscall_unix.go:172
internal/poll.(*FD).Read(0xc000081200, 0xc4a8000000, 0x4000200, 0x4000200, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/internal/poll/fd_unix.go:165 +0x131 fp=0xc0da456cb0 sp=0xc0da456c58 pc=0x4c8611
os.(*File).read(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/os/file_unix.go:263
os.(*File).Read(0xc0dbd1c050, 0xc4a8000000, 0x4000200, 0x4000200, 0xc4a8000000, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/os/file.go:108 +0x70 fp=0xc0da456d20 sp=0xc0da456cb0 pc=0x4cffd0
bytes.(*Buffer).ReadFrom(0xc0da456dc0, 0x15e0680, 0xc0dbd1c050, 0x4d6dce, 0xc0003e4c30, 0xc01850ae00)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/bytes/buffer.go:207 +0xbd fp=0xc0da456d90 sp=0xc0da456d20 pc=0x5422dd
io/ioutil.readAll(0x15e0680, 0xc0dbd1c050, 0x4000200, 0x0, 0x0, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/io/ioutil/ioutil.go:36 +0xe5 fp=0xc0da456df8 sp=0xc0da456d90 pc=0x5521c5
io/ioutil.ReadFile(0xc01850ae00, 0xf3, 0x0, 0x0, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/io/ioutil/ioutil.go:73 +0xea fp=0xc0da456e50 sp=0xc0da456df8 pc=0x55236a
github.com/ncw/rclone/backend/cache.(*Persistent).GetChunk(0xc0014d21e0, 0xc0628608c0, 0x1ec000000, 0x15bb900, 0x2, 0x1, 0xc0a7b14e00, 0xc0a7b14ea0)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/storage_persistent.go:478 +0x14f fp=0xc0da456ef8 sp=0xc0da456e50 pc=0x9e293f
github.com/ncw/rclone/backend/cache.(*worker).run(0xc018237f20)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:406 +0x10b fp=0xc0da456fd8 sp=0xc0da456ef8 pc=0x9d98cb
runtime.goexit()
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/asm_amd64.s:1337 +0x1 fp=0xc0da456fe0 sp=0xc0da456fd8 pc=0x45ca91
created by github.com/ncw/rclone/backend/cache.(*Handle).scaleWorkers
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:140 +0x160

goroutine 2727 [chan receive]:
runtime.gopark(0x13f1b10, 0xc0ab8b31d8, 0xc0a7dc170d, 0x3)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:301 +0xef fp=0xc000692e38 sp=0xc000692e18 pc=0x42f4cf
runtime.goparkunlock(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:307
runtime.chanrecv(0xc0ab8b3180, 0xc000692f68, 0x1, 0xc0a7dcc680)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:524 +0x2ea fp=0xc000692ec8 sp=0xc000692e38 pc=0x405e1a
runtime.chanrecv2(0xc0ab8b3180, 0xc000692f68, 0xb8)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:411 +0x2b fp=0xc000692ef8 sp=0xc000692ec8 pc=0x405b1b
github.com/ncw/rclone/backend/cache.(*worker).run(0xc018237f40)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:394 +0x7e fp=0xc000692fd8 sp=0xc000692ef8 pc=0x9d983e
runtime.goexit()
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/asm_amd64.s:1337 +0x1 fp=0xc000692fe0 sp=0xc000692fd8 pc=0x45ca91
created by github.com/ncw/rclone/backend/cache.(*Handle).scaleWorkers
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:140 +0x160

goroutine 2728 [chan receive]:
runtime.gopark(0x13f1b10, 0xc0ab8b31d8, 0xc00018170d, 0x3)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:301 +0xef fp=0xc000695e38 sp=0xc000695e18 pc=0x42f4cf
runtime.goparkunlock(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/proc.go:307
runtime.chanrecv(0xc0ab8b3180, 0xc000695f68, 0x1, 0xc000186d00)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:524 +0x2ea fp=0xc000695ec8 sp=0xc000695e38 pc=0x405e1a
runtime.chanrecv2(0xc0ab8b3180, 0xc000695f68, 0xb8)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/chan.go:411 +0x2b fp=0xc000695ef8 sp=0xc000695ec8 pc=0x405b1b
github.com/ncw/rclone/backend/cache.(*worker).run(0xc018237f60)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:394 +0x7e fp=0xc000695fd8 sp=0xc000695ef8 pc=0x9d983e
runtime.goexit()
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/asm_amd64.s:1337 +0x1 fp=0xc000695fe0 sp=0xc000695fd8 pc=0x45ca91
created by github.com/ncw/rclone/backend/cache.(*Handle).scaleWorkers
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:140 +0x160

goroutine 2729 [runnable]:
net.(*conn).Read(0xc0dbd1c0b0, 0xc052fca000, 0xda3d, 0xda3d, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/net.go:173 +0x274 fp=0xc0007f5610 sp=0xc0007f5608 pc=0x5265d4
net.Conn.Read-fm(0xc052fca000, 0xda3d, 0xda3d, 0xc00104c500, 0xd, 0xd)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/net.go:117 +0x4d fp=0xc0007f5658 sp=0xc0007f5610 pc=0x7ea0fd
github.com/ncw/rclone/fs/fshttp.(*timeoutConn).readOrWrite(0xc018332900, 0xc0007f56e0, 0xc052fca000, 0xda3d, 0xda3d, 0x20302a, 0x0, 0x9d42)
        /home/travis/gopath/src/github.com/ncw/rclone/fs/fshttp/http.go:75 +0x48 fp=0xc0007f56a0 sp=0xc0007f5658 pc=0x7e78f8
github.com/ncw/rclone/fs/fshttp.(*timeoutConn).Read(0xc018332900, 0xc052fca000, 0xda3d, 0xda3d, 0xc052fcdcfb, 0x167, 0x10)
        /home/travis/gopath/src/github.com/ncw/rclone/fs/fshttp/http.go:87 +0x8a fp=0xc0007f5708 sp=0xc0007f56a0 pc=0x7e7a0a
crypto/tls.(*atLeastReader).Read(0xc0ab8d0280, 0xc052fca000, 0xda3d, 0xda3d, 0x7342e6, 0xc0000320a0, 0xc0007f57c8)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/crypto/tls/conn.go:761 +0x60 fp=0xc0007f5768 sp=0xc0007f5708 pc=0x631860
bytes.(*Buffer).ReadFrom(0xc00104c5d8, 0x15de740, 0xc0ab8d0280, 0x409f35, 0x11cf680, 0x12cc660)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/bytes/buffer.go:207 +0xbd fp=0xc0007f57d8 sp=0xc0007f5768 pc=0x5422dd
crypto/tls.(*Conn).readFromUntil(0xc00104c380, 0x7fd029b3b270, 0xc018332900, 0x5, 0xc018332900, 0x167)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/crypto/tls/conn.go:783 +0xf8 fp=0xc0007f5828 sp=0xc0007f57d8 pc=0x631ab8
crypto/tls.(*Conn).readRecordOrCCS(0xc00104c380, 0x13f1c00, 0xc00104c4b8, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/crypto/tls/conn.go:590 +0x125 fp=0xc0007f59f8 sp=0xc0007f5828 pc=0x62ffa5
crypto/tls.(*Conn).readRecord(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/crypto/tls/conn.go:558
crypto/tls.(*Conn).Read(0xc00104c380, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/crypto/tls/conn.go:1236 +0x137 fp=0xc0007f5a40 sp=0xc0007f59f8 pc=0x6341a7
net/http.(*persistConn).Read(0xc001097680, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x167, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/http/transport.go:1524 +0x7b fp=0xc0007f5ac0 sp=0xc0007f5a40 pc=0x7140bb
bufio.(*Reader).Read(0xc000343e60, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0xc00034d0c0, 0xc00003e070, 0xc00003e000)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/bufio/bufio.go:209 +0x126 fp=0xc0007f5b18 sp=0xc0007f5ac0 pc=0x5470d6
io.(*LimitedReader).Read(0xc018209280, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0xc0007f5bb0, 0x42c4ef, 0x8)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/io/io.go:448 +0x63 fp=0xc0007f5b60 sp=0xc0007f5b18 pc=0x4686b3
net/http.(*body).readLocked(0xc0000e3ec0, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x167, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/http/transfer.go:816 +0x5f fp=0xc0007f5bc0 sp=0xc0007f5b60 pc=0x70a73f
net/http.(*body).Read(0xc0000e3ec0, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/http/transfer.go:808 +0xdb fp=0xc0007f5c10 sp=0xc0007f5bc0 pc=0x70a68b
net/http.(*bodyEOFSignal).Read(0xc0000e3f00, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x0, 0x0, 0x0)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/net/http/transport.go:2330 +0xd3 fp=0xc0007f5c78 sp=0xc0007f5c10 pc=0x7181a3
github.com/ncw/rclone/vendor/golang.org/x/oauth2.(*onEOFReader).Read(0xc018209340, 0xc2d8003af9, 0x3ffc507, 0x3ffc507, 0x167, 0x0, 0x0)
        /home/travis/gopath/src/github.com/ncw/rclone/vendor/golang.org/x/oauth2/transport.go:126 +0x55 fp=0xc0007f5cd8 sp=0xc0007f5c78 pc=0x8151f5
io.ReadAtLeast(0x7fd029b7c860, 0xc018209340, 0xc2d8000000, 0x4000000, 0x4000000, 0x4000000, 0x0, 0x0, 0xffffffffffffffff)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/io/io.go:310 +0x88 fp=0xc0007f5d38 sp=0xc0007f5cd8 pc=0x467f08
io.ReadFull(...)
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/io/io.go:329
github.com/ncw/rclone/backend/cache.(*worker).download(0xc018237f80, 0x204000000, 0x208000000, 0x0)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:462 +0x347 fp=0xc0007f5ef8 sp=0xc0007f5d38 pc=0x9d9eb7
github.com/ncw/rclone/backend/cache.(*worker).run(0xc018237f80)
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:427 +0x15f fp=0xc0007f5fd8 sp=0xc0007f5ef8 pc=0x9d991f
runtime.goexit()
        /home/travis/.gimme/versions/go1.12.4.linux.amd64/src/runtime/asm_amd64.s:1337 +0x1 fp=0xc0007f5fe0 sp=0xc0007f5fd8 pc=0x45ca91
created by github.com/ncw/rclone/backend/cache.(*Handle).scaleWorkers
        /home/travis/gopath/src/github.com/ncw/rclone/backend/cache/handle.go:140 +0x160
2019/05/29 23:24:17 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/05/29 23:24:18 INFO  : bear_cache: Cache DB path: /var/rclone/rclone-cache/bear_cache.db
2019/05/29 23:24:18 INFO  : bear_cache: Cache chunk path: /var/rclone/rclone-cache/bear_cache
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Memory: true
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Size: 64M
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Total Size: 50G
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Clean Interval: 1m0s
2019/05/29 23:24:19 INFO  : bear_cache: Workers: 16
2019/05/29 23:24:19 INFO  : bear_cache: File Age: 2d
2019/05/29 23:24:19 INFO  : bear_cache: Cache DB path: /var/rclone/rclone-cache/bear_cache.db
2019/05/29 23:24:19 INFO  : bear_cache: Cache chunk path: /var/rclone/rclone-cache/bear_cache
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Memory: true
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Size: 64M
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Total Size: 50G
2019/05/29 23:24:19 INFO  : bear_cache: Chunk Clean Interval: 1m0s
2019/05/29 23:24:19 INFO  : bear_cache: Workers: 16
2019/05/29 23:24:19 INFO  : bear_cache: File Age: 2d

You need to run with -vv and grab a log as requested. Writing to the mount doesn't matter either way.

The cache is another layer, and you don't really need it unless you have a specific use case for it.

--allow-non-empty is generally bad, as it allows over-mounting and hides things; I wouldn't recommend ever using it.

--fast-list does nothing on a mount.
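If it were me, I'd strip the direct mount down to something minimal like this and only add flags back if the debug log shows you need them (a starting point, not a definitive set):

rclone mount bear_crypt: /media/major \
--allow-other \
--dir-cache-time 96h \
--umask 002 \
--log-level DEBUG \
--log-file /opt/rclone/logs/rclone.log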

Before your reply, after I noticed the cache crashing, I changed a few settings on the cache mount, and it hasn't crashed for 2 hours. It is very slow, though, and uses 75% of my 16 GB of RAM!

The log reports a few errors, but not too many:
error (chunk not found 3951034368) response
unexpected conditions during reading. current position: 14213771296, current chunk position: 14193524736, current chunk size: 2248704, offset: 20246560, chunk size: 24M, file size: 14220335751
ReadFileHandle.Read error: low level retry 1/100: unexpected EOF

rclone mount bear_cache_crypt: /media/major \
--read-only \
--allow-non-empty \
--allow-other \
--fast-list \
--use-mmap \
--buffer-size 0 \
--cache-workers 4 \
--cache-chunk-path /var/rclone/rclone-cache \
--cache-db-path /var/rclone/rclone-cache \
--cache-chunk-size 24M \
--cache-chunk-total-size 20G \
--dir-cache-time 96h \
--log-level INFO \
--log-file /opt/rclone/logs/rclone.log \
--timeout 1h \
--umask 002 \
--low-level-retries 100 \
--cache-db-purge \
--rc

I will make your suggested changes and run with -vv when this fails again.

Do you think this use case justifies the cache? Is there anything to be done about the speed? What about --vfs-read-chunk-size on a direct mount?

What is your use case?

I don't personally use the cache. Without a log, it's hard to tell what's going on.
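For what it's worth, on a direct mount chunked reading is already on by default in recent versions, if I remember right; the relevant flags with their defaults look roughly like this, as a sketch to tweak from:

rclone mount bear_crypt: /media/major \
--vfs-read-chunk-size 128M \
--vfs-read-chunk-size-limit off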
