Zips and rars in gdrive

So you mean I should add that option when I mount using rar2fs? Great to know, as I have always used it with just the default options for the source and target directories.

Yes, it will basically force rar2fs to check all the rar files in your rclone mount and preload their contents before you try to access the folder, so it will behave much faster, since it removes the inherent latency of cloud storage.
Depending on the number of rar files you have, this might take a while.
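
A rough sketch of how the two mounts stack, assuming the option in question is rar2fs's --warmup flag (remote name and paths are made up; check rar2fs --help on your version for the exact spelling):

    # mount the cloud remote first
    rclone mount gdrive: /mnt/rclone --daemon

    # layer rar2fs on top; --warmup pre-scans the rar archives so later
    # directory listings are served from cache instead of the cloud
    rar2fs --warmup /mnt/rclone /mnt/media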

after the rclone mount is running, you can pre-cache the folder/file structure in the vfs cache.
no data is downloaded, just metadata.

let's say you have a plex media server, with the data in the cloud, and now you want plex to do a re-scan.
without the pre-cache, rclone would have to scan the entire remote in the cloud and plex would have to process the files, which would take a long time.
with the pre-cache, the plex scan would go very quickly as rclone has cached the folder/file structure and metadata such as mod-time.
most rcloners do this.
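
a minimal sketch of what that looks like, with a made-up remote and mount point:

    # mount with the remote control api enabled
    rclone mount gdrive: /mnt/gdrive --rc --daemon

    # pre-cache the folder/file structure; only metadata is transferred
    rclone rc vfs/refresh recursive=true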

This sounds great. When I was reading the mount options you gave me earlier with a GitHub link, I understood that the person did this via the localhost remote control, after the drive was already mounted.

Can I add that in when I am mounting, or am I stuck with rclone rc?

not sure what link you mean?

to pre-cache the vfs cache.

  1. add --rc to your rclone mount command.
  2. start the mount.
  3. to pre-cache, run rclone rc vfs/refresh recursive=true
    you can run step 3 whenever you need to; see the example commands just after this list.
    for example, if i add a new media file to jellyfin, i will do step 3 and then have jellyfin do a scan.
    for that scan, jellyfin will not need to download any data from the cloud, as it is already in the vfs cache.
    if jellyfin finds a new file, then jellyfin will request that rclone download a small amount of data, to create thumbnails and get metadata.
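
for a big library, you can limit the refresh to one folder, or run it in the background; a quick sketch (the Media folder name is just an example):

    # refresh only one folder instead of the whole remote
    rclone rc vfs/refresh dir=Media recursive=true

    # or kick the refresh off in the background and get a job id back
    rclone rc vfs/refresh recursive=true _async=true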

So, at the end of the day:

Using the micro instances over at Google cost me trial credit for traffic.

Just use sftp and your VPS's upload pool.
That way you don't need the extra unzipping step.
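
As a sketch, with a made-up host and user, an sftp remote takes a couple of commands to set up, and after a dropped connection you can simply re-run the copy, since rclone skips files that already transferred in full:

    # one-time: create an sftp remote pointing at the vps (values are examples)
    rclone config create myvps sftp host=vps.example.com user=me key_file=~/.ssh/id_rsa

    # upload; re-running after an interruption skips the finished files
    rclone copy /local/stuff myvps:/data/stuff --progress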

I do not know about rar, but zip allows for random access and can be very efficient on an rclone mount. I've used rclone mount (without any cache) to extract files from a very large zip file.

On the other hand, with tar.gz or tar.xz you must decompress the archive from the start before getting to the specific file you want.
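
You can see the difference from the command line: unzip reads the central directory at the end of the archive and seeks straight to the member it needs, while tar has to decompress the stream from the beginning (paths and file names are examples):

    # zip: seeks directly to the wanted member, even on an rclone mount
    unzip -p /mnt/gdrive/big.zip docs/report.pdf > report.pdf

    # tar.gz: decompresses sequentially until the member is reached
    tar -xzf /mnt/gdrive/big.tar.gz docs/report.pdf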

Hi, sorry to ask, but did you find a way to unzip a file without needing to unpack it on the local machine?
Can you elaborate more on the process? Thank you.

I can see the point you are making, but my goal was to upload smaller packs so that if my slow connection dies I don't have to restart.
After seeing sftp, I feel like I would be fine most of the time, except when I do large files; at that point I will split them into zips and use my upload pool. I have given up on being frugal.
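
If it helps, Info-ZIP's zip can create the split pieces directly and rejoin them on the other side (the size and names are just examples):

    # create a zip split into 500 MB pieces for upload over a flaky link
    zip -s 500m -r backup.zip stuff/

    # with all pieces in one place, rejoin them into a single archive, then extract
    zip -s 0 backup.zip --out joined.zip && unzip joined.zip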

No problem, I am here at the forums to share my knowledge.

So I did this at the beginning for two purposes: one, to save the VPS's upload quota; two, stable uploading of files.

I planned on packing the million pieces I had into one big zip, splitting the zip into parts, and then unzipping at the other end.

But since I discovered sftp with FileZilla, I no longer need this remedy. You would always need to unpack on either the VPS or another VPS connected to rclone, and the sketchy timing and writing process is not worth it.


Not whom you asked, but I wrote a demonstration of this. As I said above, it works because zip allows random access, whereas compressed tar does not.

