[Kinda solved] File disappears when it reaches 12G?

Hi!

I want to use rclone mount to store backup files in my G Suite (each up to 500G in size), but when I save them to an rclone mount, the file disappears from the filesystem once it reaches 12G.

In the debug log I can still see it being written:

Terminalserver/C_VOL-b003.spf.tmp (rw)}: Write: len=131072, offset=201897492482
(thousands of these lines)

When I check the VFS cache folder I see it:

20G Mar 22 12:37 C_VOL-b003.spf.tmp

and it grows:

 30G Mar 22 12:43 C_VOL-b003.spf.tmp

But even if I let the backup finish, the file never reappears in the mount; it just sits there.

I use rclone 1.40 on Ubuntu 16.04; the data is stored in a ZFS pool (tank).

My settings are:

[gdrive-account1-cache]
type = cache
remote = gdrive-account1:/Backups
plex_url = 
plex_username = 
plex_password = 
chunk_size = 10M
info_age = 5m
chunk_total_size = 600G

I start it this way:

rclone mount gdrive-account1-cache: /tank/backups/export/Account1 --umask 000 --allow-other --attr-timeout 1s --cache-db-path /tank/backups/cache/Account1/db --cache-dir /tank/backups/cache/Account1/ --cache-chunk-path /tank/backups/cache/Account1/cache-backend --cache-tmp-upload-path /tank/backups/cache/Account1/uploads --cache-tmp-wait-time 24h --cache-writes --vfs-cache-mode writes --vfs-cache-max-age 24h --buffer-size=600G -vv

--buffer-size was my last attempt; it did not change the behaviour.

I do not use crypt, the backup files are already encrypted.

Am I missing something? My goal is for the files to remain on the local disk for 24h and then be uploaded.

Hmm, it looks like the file reappears once it has been fully written to the uploads folder, but by then my backup software had given up looking for the file (to rename .spf.tmp to .spf) and had already stopped.

Maybe it would be an idea to show the file in the mount the whole time, so the copy time is not a problem.

For now I have worked around it by splitting the file into 10G parts.
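For reference, this splitting can be done with the standard GNU coreutils `split` tool before the file lands on the mount; a minimal sketch (the filenames here match the log above, but the part-suffix naming is just an example):

```shell
# Split a finished backup file into 10G parts with numeric suffixes
# so they sort back into the right order:
split -b 10G --numeric-suffixes C_VOL-b003.spf C_VOL-b003.spf.part-

# Later, reassemble the original file from its parts:
cat C_VOL-b003.spf.part-* > C_VOL-b003-restored.spf
```

With numeric suffixes the shell glob expands the parts in the correct order, so the concatenation reproduces the original file byte for byte.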

As an aside, even if this issue does get fixed, I think it's smart to keep 10G parts: if you have network issues and an upload breaks, you won't have to upload the whole file again from the beginning.

You’re right, it makes the handling much easier.

This backup is small (140G), but the biggest one is currently 500G…

I only make one full backup and then incrementals (consolidating these daily, weekly, and monthly).