Google Drive + Rclone

I'm new here and trying to understand how rclone works with Google Drive and how to serve the files on a website.
Please let me explain what I want to achieve; I'm sure many of the nice people around here will help me.
My use case is video content, and I want to store the encoded files on Google Drive.
What are encoded files?
I'm using an ffmpeg script that encodes an mp4 file into small .ts segments and one master.m3u8 playlist (this is called HLS streaming), and ffmpeg encodes the main mp4 into several video qualities, 720p and 360p in my case.
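For reference, a minimal sketch of a single-rendition HLS encode along these lines; the file names, segment length, and 720p-only output are assumptions, not the poster's actual script:

# hypothetical example: encode an mp4 into 6-second .ts segments plus a playlist
ffmpeg -i input.mp4 \
  -vf scale=-2:720 -c:v libx264 -c:a aac \
  -hls_time 6 -hls_playlist_type vod \
  -hls_segment_filename 'seg_%03d.ts' \
  master.m3u8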
I call this machine the storage server, and the files are saved under videos/year/month/day/filename/ as .ts chunks plus the master .m3u8 file.
I want to store these files on Google Drive, and when an HTTPS request hits my storage server, the server should fetch that file from Google Drive, keep it in a cache for some time, and play it.
As I have a G Suite account I can mount many Google Drives. Is it possible to split the files of one folder across several Google Drives to upload faster and avoid the upload limit?
My files are already small .ts files, so is rclone's chunker needed to make the process faster, or not?
What is data encryption? Do I need it, and how will encryption affect the speed of the process?
Any other suggestions would be really appreciated.

Thank you so much

That sounds like two different rclone uses.

For the first a simple rclone copy will work fine to copy the files to Google Drive.
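For example (the remote name and paths are placeholders, assuming a drive remote called gdrive:):

rclone copy /path/to/videos gdrive:videos --progress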

For the second you'll probably want to either use rclone mount and point a webserver at it, or use rclone serve http to serve the google drive.
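A rough sketch of the serve variant, with the remote name and port chosen as assumptions:

rclone serve http gdrive: --addr :8080 --read-only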

In either case which caching options you use will be important!

I do something very similar and I use these options

--read-only
--cache-dir /point/to/cache/dir
--dir-cache-time 1m
--vfs-cache-mode full
--vfs-cache-max-age 168h
--vfs-cache-max-size 30G
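Put together as a mount command, that might look like this (remote name, mount point and cache directory are placeholders):

rclone mount gdrive: /path/to/mountpoint --read-only --cache-dir /point/to/cache/dir --dir-cache-time 1m --vfs-cache-mode full --vfs-cache-max-age 168h --vfs-cache-max-size 30G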

You can use a different service account to do the upload - each one has a 750 GB/day upload limit.
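As a sketch, each remote could point at its own service account key in the config (the remote name and path are assumptions):

[gdrive_sa1]
type = drive
scope = drive
service_account_file = /path/to/sa1.json

The same key can also be swapped per run with --drive-service-account-file if you prefer to keep a single remote.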

I wouldn't bother with chunker in this case.

How big are the files?

@ncw Thank you so much for the very helpful summary.
I'm setting up a test server to understand things better. I still have some confusion, but I'm sure it will all become clear with time.
First of all, I have to mount Google Drive on the server.

Which option will be faster and more reliable: copying files from Google Drive, or using rclone serve http?
My big confusion is how the VFS works. Does it create the same file tree on the server? Every HTTP request will have a file path; in my case the path would be domain.com/video/year/month/day/filename.

The upload limit is sufficient with one service account. What I want to achieve is to copy files from Google Drive quickly: if the files are spread across several drives, we can request them from each one, and I think it will be faster because there will be a smaller queue on every drive.

Normally every .ts file is less than 1 MB.

One more question: is there any paid help available if it's really needed?

Thank you so much

My thread has dropped quite far down and I'm really looking for help.
Thanks again

There is no paid support option here, as this is a volunteer forum where people help out with questions.

Your best bet is to try some things out and see if they work for you; if you have any specific questions, folks can chime in and give some answers.

Hi, thanks for your reply.
I already asked a few questions but still no luck.

@asdffdsa please

hi,

  • so you have a webserver?
    --- where is that server located, local, virtual machine in cloud or what?
    --- what is the operating system?
    --- what is the type of webserver, caddy, apache or what?

Yes, I'm using the nginx webserver.

I'm using a dedicated server with Debian 11.

I just installed rclone and now need to know the right configuration to mount the drive on the server.

`rclone mount gdrive_mount_crypt: /www/wwwroot/drive --allow-other --cache-db-purge --fast-list --poll-interval 10m`

Is this the right command?

thanks a lot for answering my queries

sure, before i comment on that command, which needs a few tweaks:

  • post the output of rclone version
  • post the config file, redact/remove id/secret/token/password/etc...

rclone v1.57.0

  • os/version: debian 11.0 (64 bit)
  • os/kernel: 5.10.0-8-amd64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.17.2
  • go/linking: static
  • go/tags: none

[rClone]
type = drive
scope = drive
team_drive =

please delete the client_id, client_secret and token


let's start with something simple, upload a few test files.
for now, we will not use a rclone crypt to encrypt the files.

for the rclone mount command,
--cache-db-purge does nothing based on your config.
--fast-list does nothing on a rclone mount

try this
`rclone mount rClone: /www/wwwroot/drive --allow-other --poll-interval 10m --vfs-cache-mode full`

root@google ~ # rclone mount rClone: /www/wwwroot/drive --allow-other --poll-interval 10m --vfs-cache-mode full
2021/11/23 23:36:25 Fatal error: failed to mount FUSE fs: fusermount: exec: "fusermount": executable file not found in $PATH

getting this error

I just installed fuse and it's OK now, and as I can see the drive is mounted. But it only stays mounted while PuTTY is open; when I close PuTTY it gets unmounted. What is the solution for this?

I uploaded a few files and they are moving to Google Drive right away.
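On keeping the mount alive after the SSH session closes: one common approach is a systemd unit along these lines (the unit name and paths are assumptions, reusing the mount command from above):

[Unit]
Description=rclone mount of the rClone: remote
After=network-online.target

[Service]
Type=simple
ExecStart=/usr/bin/rclone mount rClone: /www/wwwroot/drive --allow-other --poll-interval 10m --vfs-cache-mode full
ExecStop=/bin/fusermount -u /www/wwwroot/drive
Restart=on-failure

[Install]
WantedBy=multi-user.target

Saved as /etc/systemd/system/rclone-mount.service, it can be started with systemctl enable --now rclone-mount. Running the mount inside tmux/screen, or adding rclone's --daemon flag, are simpler alternatives.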


This is working for me.
I pointed a domain at the mounted drive and everything looks to be working.
What's next?

you tell me....


What about caching and encrypting the data?
What is --fast-list?
Any other recommendations to make a perfect setup?
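On the encryption question: a crypt remote wraps an existing remote, so a config sketch might look like this (the names are placeholders, and the passwords would be generated by rclone config rather than typed in literally):

[gdrive_crypt]
type = crypt
remote = rClone:encrypted
password = ***obscured***
password2 = ***obscured***

The mount command would then point at gdrive_crypt: instead of rClone:. Encryption adds some CPU work on the server for every upload and download, but it is usually small compared to network transfer time.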