To cache or not cache on GDrive?

I’ve been frustrated with file load times and folder listing times. I have about 10TB of home videos and photos loaded onto GDrive via crypt, and I’ve been caching it like this:

  1. gdrive: (the Google Drive remote)
  2. gcache: (cache backend pointing at gdrive:/media)
  3. secure: (crypt pointing at gcache:)

Cache settings
chunk_size = 32M
info_age = 2d
chunk_total_size = 75G
workers = 10
writes = true
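For context, the chain above pieced together as an rclone.conf would look roughly like this. This is only a sketch: the remote names match the ones mentioned in this thread, but all credential values are placeholders, not real config.

```ini
# Hypothetical rclone.conf sketch of the gdrive -> cache -> crypt chain.
[gdrive]
type = drive
client_id = <your-client-id>
client_secret = <your-client-secret>
token = <oauth-token>

[gcache]
type = cache
remote = gdrive:/media
chunk_size = 32M
info_age = 2d
chunk_total_size = 75G
workers = 10
writes = true

[secure]
type = crypt
remote = gcache:
password = <obscured-password>
password2 = <obscured-salt>
```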

And when I mount I’ve been playing with settings but mostly using:

rclone mount --allow-other --allow-non-empty --vfs-cache-mode writes --vfs-read-chunk-size 100M --vfs-read-chunk-size-limit 2G --log-file logfile.log -v --fast-list secure: ~/mnt/gdrive &

It works, but it’s soooooo slow. When I look at the log file (when I use -vv), it looks like it’s trying to download the entire file listing of each folder before I can start accessing anything.

I have writes turned on because I want to rename files and move them between folders, etc.

So my questions are: Do I just need to adjust these settings somehow for improved performance? Or is the gcache: unnecessary? Should I just mount the crypt and skip the cache?

You have a mix of settings that don’t really work together.

If you want to use the cache backend, you don’t need `--vfs-read-chunk-size 100M --vfs-read-chunk-size-limit 2G`, as the cache backend does its own chunked reading.
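For example, a mount through the cache backend could be trimmed to something like this (a sketch based on the command quoted above, with the redundant VFS chunk flags dropped):

```sh
# Sketch: when reading through the cache backend, let it handle
# chunked downloads and drop the --vfs-read-chunk-size* flags.
rclone mount secure: ~/mnt/gdrive \
  --allow-other \
  --vfs-cache-mode writes \
  --log-file logfile.log -v &
```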

Did you make your own client ID/API key?

https://rclone.org/drive/#making-your-own-client-id

What are you seeing in the logs when you hit play?

Cache or no cache really doesn’t matter that much, as there is only a few seconds’ difference in start time. I personally do not use the cache.
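For reference, skipping the cache backend would look something like the following. This is a sketch and makes an assumption: it supposes the crypt remote is re-pointed directly at gdrive:/media instead of at gcache:, in which case the VFS chunked-reading flags do apply.

```sh
# Sketch: mount a crypt pointed straight at gdrive:/media (no cache
# backend in between); here the VFS chunked-reading flags are useful.
rclone mount secure: ~/mnt/gdrive \
  --allow-other \
  --vfs-cache-mode writes \
  --vfs-read-chunk-size 100M \
  --vfs-read-chunk-size-limit 2G \
  --log-file logfile.log -v &
```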

I think I’m going to try bypassing the cache and see how it works for a while.

I did make my own API key, but I still get some 403 errors in the log. I’m not sure if it’s because I’ve reached my upload limit for the day, or because I’m hitting the “Queries per 100 seconds per user” quota with the 1000 limit. (I looked, and it doesn’t appear you can ask for more quota per user, only per day?)
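One knob that can help with the per-100-seconds 403s is rclone’s global `--tpslimit` flag, which caps transactions per second to the API. This is a sketch, not something from the thread; the flag values are illustrative:

```sh
# Sketch: cap API transactions at ~8/s to stay under the
# "Queries per 100 seconds per user" quota (1000 / 100s = 10/s).
rclone mount secure: ~/mnt/gdrive \
  --allow-other \
  --vfs-cache-mode writes \
  --tpslimit 8 \
  --log-file logfile.log -v &
```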

I’m not hitting “play” per se – I am doing all of my file operations and browsing through the macOS finder, not Plex.

When I open the file in a media player (movist), I get this in the logs:

2019/03/14 11:13:49 DEBUG : : Statfs: 
2019/03/14 11:13:49 DEBUG : : >Statfs: stat={Blocks:4294967295 Bfree:4294967295 Bavail:4294967295 Files:1000000000 Ffree:1000000000 Bsize:4096 Namelen:255 Frsize:4096}, err=<nil>
2019/03/14 11:13:50 DEBUG : movies/movie.mp4: Attr: 
2019/03/14 11:13:50 DEBUG : movies/movie.mp4: >Attr: a=valid=1s ino=0 size=1612303948 mode=-rw-r--r--, err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:50 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:50 DEBUG : /: Attr: 
2019/03/14 11:13:50 DEBUG : /: >Attr: attr=valid=1s ino=0 size=0 mode=drwxr-xr-x, err=<nil>
2019/03/14 11:13:50 DEBUG : : Statfs: 
2019/03/14 11:13:50 DEBUG : : >Statfs: stat={Blocks:4294967295 Bfree:4294967295 Bavail:4294967295 Files:1000000000 Ffree:1000000000 Bsize:4096 Namelen:255 Frsize:4096}, err=<nil>
2019/03/14 11:13:50 DEBUG : : Statfs: 
2019/03/14 11:13:50 DEBUG : : >Statfs: stat={Blocks:4294967295 Bfree:4294967295 Bavail:4294967295 Files:1000000000 Ffree:1000000000 Bsize:4096 Namelen:255 Frsize:4096}, err=<nil>
2019/03/14 11:13:51 DEBUG : &{movies/movie.mp4 (r)}: Flush: 
2019/03/14 11:13:51 DEBUG : &{movies/movie.mp4 (r)}: >Flush: err=<nil>
2019/03/14 11:13:51 DEBUG : &{movies/movie.mp4 (r)}: Release: 
2019/03/14 11:13:51 DEBUG : movies/movie.mp4: ReadFileHandle.Release closing
2019/03/14 11:13:51 DEBUG : tuct4rm1gmjhgfmfa904jkr79s/1g2jd9ikc5mvdv2bv26681vudc/1b70k1vu06909f3v45et406rihed9b87n8qvjc9elrnpf8m2l5f0: cache reader closed 540082960
2019/03/14 11:13:51 DEBUG : &{movies/movie.mp4 (r)}: >Release: err=<nil>
2019/03/14 11:13:55 DEBUG : /: Attr: 

So also, when I just loaded that video file, it looks like macOS Finder is probably seeking through everything else in that folder in order to generate a thumbnail/preview, perhaps? The log is a mile long and basically shows it chunking through every file in the folder.

Along the way it throws some 403 errors:

2019/03/14 10:00:50 DEBUG : : >Statfs: stat={Blocks:4294967295 Bfree:4294967295 Bavail:4294967295 Files:1000000000 Ffree:1000000000 Bsize:4096 Namelen:255 Frsize:4096}, err=<nil>
2019/03/14 10:00:50 DEBUG : pacer: Rate limited, sleeping for 1.689959564s (1 consecutive low level retries)
2019/03/14 10:00:50 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=344093072049, userRateLimitExceeded)
2019/03/14 10:00:50 DEBUG : pacer: Rate limited, sleeping for 2.683367419s (2 consecutive low level retries)
2019/03/14 10:00:50 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=344093072049, userRateLimitExceeded)
2019/03/14 10:00:50 DEBUG : pacer: Resetting sleep to minimum 100ms on success
2019/03/14 10:00:51 DEBUG : tuct4rm1gmjhgfmfa904jkr79s/nna43i0eib7b3if9s4qa5vv9ss/

And you see your API key being used in the cloud console?

Yeah, that’s also possible. Some things just aren’t very cloud friendly; if the player closes the file and reopens it many times, that might not be a good use case for a mount.

The cache backend might be a better fit there, but it depends on the use case.

Yes, the API key is definitely being used, as I’ve been pushing a lot of content in recently.

I’ll look into whether I can configure the macOS Finder to not generate previews and see if that helps. Maybe switch to a third-party file browser as well.

I wonder how many folks have a use case like that. Mine is 100% Plex, not everyday use of a mount on a Mac, even though I test on my Mac here and there.

If you’re okay with not using the Mac Finder and using a web browser instead, you could use rclone serve http or serve webdav instead of a mount. I have a feeling the Finder is your issue. Windows Explorer had similar issues, but there thumbnails and metadata generation can be disabled in the registry.
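Serving instead of mounting could look something like this sketch (the address is an assumption; point a browser or WebDAV client at it):

```sh
# Sketch: expose the crypt remote over WebDAV instead of a FUSE mount,
# sidestepping Finder's thumbnail/preview traffic entirely.
rclone serve webdav secure: --addr localhost:8080
```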

Update: I’m already seeing a big improvement with the following changes:

Cache config:
chunk_size = 100M
info_age = 2d
chunk_total_size = 75G
workers = 10
writes = true

Mount command:
rclone mount --allow-other --allow-non-empty --no-modtime --log-file logfile.log -vv --fast-list secure: ~/mnt/gdrive &

macOS Finder “View Options” settings
true - show icons
false - show icon preview
false - show preview column

Still using the gdrive > cache > crypt mounting arrangement.

I’m going to try this out for the next few days and update here if I can make any additional improvements.

Yeah, cache will handle the open/close instances much better since the chunks are local.
