Creating thumbnails in Emby from files on Google Drive

Hi,
I'm trying to create thumbnails for all my files, which are stored on Google Drive. My library is about 12TB. Has anyone managed to find rclone options that can run 24/7 until it's done? I've been playing with --vfs-read-chunk-size and --vfs-read-chunk-size-limit, both set between 1M and 128M. I found that I had the most success with setting both to 64M, but I still get banned if I run it too long. Would --vfs-cache-mode full help in any way here?

rclone version:
rclone v1.55.1

  • os/type: linux
  • os/arch: amd64
  • go/version: go1.16.3
  • go/linking: static
  • go/tags: none

OS
Ubuntu 20.04

Cloud storage system
Google Drive

Current rclone service:
--allow-other
--buffer-size 256M
--dir-cache-time 1000h
--drive-chunk-size 128M
--poll-interval 15s
--timeout 1h
--vfs-read-chunk-size 64M
--vfs-read-chunk-size-limit 64M
--umask 002
--user-agent user
--gid 1000
--uid 1000
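
For context, here is roughly how those flags fit together as a single mount command. This is only a sketch: the remote name gdrive: and the mount point /mnt/gdrive are placeholders rather than the real ones, and --vfs-cache-mode full (the flag asked about above) would simply be added as one more flag here.

    # placeholder remote and mount point; flags copied from the service above
    rclone mount gdrive: /mnt/gdrive \
      --allow-other \
      --buffer-size 256M \
      --dir-cache-time 1000h \
      --drive-chunk-size 128M \
      --poll-interval 15s \
      --timeout 1h \
      --vfs-read-chunk-size 64M \
      --vfs-read-chunk-size-limit 64M \
      --umask 002 \
      --user-agent user \
      --gid 1000 \
      --uid 1000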

Hello and welcome to the forum,

How do you generate the thumbnails?

Through Emby. It's an option for libraries there. It creates a thumbnail for every 10 seconds of the video. I believe it does it with I-frame extraction.
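
Not necessarily what Emby runs internally, but the operation is roughly equivalent to something like this (episode.mkv is just an example filename):

    # pull one frame every 10 seconds and save them as JPEG thumbnails
    ffmpeg -i episode.mkv -vf fps=1/10 -q:v 5 thumb_%04d.jpg

Run like this, ffmpeg reads the whole file; a tool that seeks only to keyframes can get away with transferring less than the full file, which is relevant to the bandwidth numbers further down.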

You do not get banned. You hit a quota.

There's no magic here, as you have to download the whole file to do this. Depending on the speed of your connection, you'll hit a quota at some point, so either you wait it out each day or don't use it.

That would mean that Emby would have to download 12TB of data to generate the thumbnails.
On my Emby server, I disable that and any scheduled tasks that scan the files for previews and so on.

Yeah sorry. Not banned.

I don't believe it downloads the whole file. If I set chunk-size to 10M, it takes about 3-4 minutes to do one episode. If I watch my bandwidth, it's pretty consistent at around 25Mbps. The file is 5500MB, and 25Mbps for 4 minutes (about 3.1MB/s x 240s) would only be about 750MB.

As a one-time thing, 12TB isn't that bad in my opinion, though if you look at my reply above, I don't believe it has to download everything. I now generate thumbnails before the file is uploaded to Google Drive, so I'm just working on my backlog. It's just a bit irritating not knowing if I will hit the quota suddenly.

Think of it like this: you have to grab a frame from every 10 seconds of the file, and each of those grabs is a range request. In effect, you're pulling the majority of the file, if not the whole file, each time by requesting every 10 seconds of it.
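
As a rough illustration of what each of those reads turns into (placeholder URL, not the real Drive endpoint rclone talks to): with --vfs-read-chunk-size 64M, each seek to a new position opens a ranged request for up to a 64 MiB window, something like

    # request the first 64 MiB window of a file via an HTTP Range header
    curl -H "Range: bytes=0-67108863" -o chunk.bin "https://example.org/file.mkv"

and sampling a frame every 10 seconds scatters those windows across the whole file.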

Based on what you've shared, the data supports that you are hitting the download quota. Since the quotas are not officially documented, there's no real way around it other than letting it do its thing.

If you want to throttle your download, that's really all you can do to limit the per-day consumption.
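
For example (the numbers are arbitrary, not a recommendation), rclone's --bwlimit flag accepts either a flat cap or a timetable:

    # flat cap on the mount's bandwidth
    --bwlimit 8M

    # or a timetable: 4M from 08:00, unlimited from midnight
    --bwlimit "08:00,4M 00:00,off"

There is also --tpslimit for capping API calls per second, but that helps with the API quota rather than the download quota being discussed here.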

If it's definitely the download quota, I can see that there isn't much to do about that. Guess it's not 10TB for me. I'm nowhere near hitting the API quota, so maybe it is just the download quota. I feel like I'm hitting that quota a bit early though, at around 500-600GB. Is this common?

There isn't anything documented on a download quota, so there isn't any way to tell what you are hitting. Could be a total. Could be per file. Could be per file in a time range. I wouldn't hazard a guess, as I've never hit it; I keep thumbnails off for that reason, since they add so little value for me.
