Using Google Drive as a cache drive

What is the problem you are having with rclone?

I don't have enough free space on my computer's local drive. I have 5TB on Google Drive, so I'm trying to mount it and use it both as a cache/working drive for 7z and as the destination to upload the finished file.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.1
- os/version: Microsoft Windows 10 Pro 22H2 (64 bit)
- os/kernel: 10.0.19045.3930 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.21.5
- go/linking: static
- go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

One command to mount the drive to use as a cache drive:
rclone.exe mount --vfs-cache-mode full --local-no-sparse Drive:/ S:

Another command to mount the drive for upload (the same remote under a different drive letter):
rclone.exe mount --vfs-cache-mode full --local-no-sparse --cache-dir s:/ Drive:/ Z: -P

A command for 7z that makes it use S: as its working cache drive:
7z.exe a -mx0 -p -wS:\ -mhe=on S:\M.7z D:\m

where -wS:\ tells 7z to use the S: drive as the location for its temporary working files.

Please run 'rclone config redacted' and share the full output

type = drive
client_id = XXX
client_secret = XXX
scope = drive
token = XXX
team_drive =

A log from the command that you were trying to run with the -vv flag

2024/01/23 17:21:38 ERROR :  myfile.7z: vfs cache: truncate: failed to set as a sparse file: DeviceIoControl FSCTL_SET_SPARSE: Incorrect function.

2024/01/23 17:21:38 ERROR : myfile.7z: vfs cache: failed to set as a sparse file: DeviceIoControl FSCTL_SET_SPARSE: Incorrect function.

I doubt it'll work: 7z needs full mode to work, and that requires the whole file to be present locally anyway.

Do you think maybe there is a way I can make it work?
I thought these errors would be resolved when I added
--local-no-sparse
but they keep showing up.
What do you think?

You'd see for sure with a debug log, but I think it needs at least writes mode to save the file.

If that's the case, you need enough space to store your biggest file and I think that means it does not work in your use case.

I tried adding -vv to all the commands, but it created a long and detailed log and I'm not sure what to focus on.

If you want to share the log, that's the easiest.

I just want to let you know that I made some changes to the commands from what was described in my original message:

So:
First command for creating a cache drive using Google Drive:

.\rclone.exe mount --vfs-cache-mode full --local-no-sparse Drive:/ s: --log-level DEBUG --log-file=C1.txt

A second command to create a Z drive that uses the S drive I placed in the first command as a cache drive:

.\rclone.exe mount --vfs-cache-mode full --local-no-check-updated --cache-dir s: Drive:/ z: -P --log-level DEBUG --log-file=C2.txt

A command for 7z that uses drive Z: for its working files (Z: automatically uses S: as its cache) and also as the final destination:

.\7z.exe a -mx0 -p -wZ:\1111 -mhe=on Z:\testfile2GB.7z "D:\m\filename.txt"

logs:
command1.txt (1.3 MB)
command2.txt (516.4 KB)

If you are using full, it will require enough space for the whole file to be there, as that's what full mode does until the file gets uploaded.

Do you have enough space to do that?

I don't have enough space on my local drive, so I want to use Google Drive both as cache storage and as the final upload destination, since I have unlimited storage space on Google Drive.

I found it:

But it talks about copying a file continuously; I'm looking to compress the file with 7z first and then upload it.

I saw that there is also a remote called compress, but it does not support adding a password.

Do you know of a way I can still make this happen?

Circling a bit here.

If you want to use 7zip or some tool to compress something, 99% of the time you need writes or full mode.

If writes or full is required, you need enough space to store the biggest file.

You can generally copy files directly to remote from somewhere else without using local space, but in this case, you need somewhere to stage your files.

This command compresses the file(s) and uploads the archive to the cloud without using the hard drive or the cache:

7z a dummy -tgzip -so file.ext | rclone rcat gdrive01:zork/zork.gz -vv
DEBUG : rclone: Version "v1.65.0" starting with parameters ["rclone" "rcat" "gdrive01:zork/zork.gz" "-vv"]
DEBUG : Creating backend with remote "gdrive01:zork/"
DEBUG : Using config file from "c:\\data\\rclone\\rclone.conf"
DEBUG : Google drive root 'zork': 'root_folder_id = 0AIYnsu88uXytUk9PVA' - save this in the config to speed up startup
DEBUG : fs cache: renaming cache item "gdrive01:zork/" to be canonical "gdrive01:zork"
DEBUG : Google drive root 'zork': File to upload is small (46 bytes), uploading instead of streaming
DEBUG : zork.gz: md5 = 6241c7721f2825bb29175cd62d1df875 OK
INFO  : zork.gz: Copied (new)
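Worth noting: -tgzip writes a plain gzip stream, and the gzip format has no password support, so the -p and -mhe options from the original 7z command don't apply here. If encryption matters, one option (a sketch, assuming a crypt remote named gcrypt has been configured to wrap the Drive remote; names and paths are hypothetical) is to let rclone encrypt the stream client-side:

```
7z a dummy -tgzip -so file.ext | rclone rcat gcrypt:backup/file.gz -vv
```

rcat to a crypt remote encrypts the data before upload, so nothing lands on Google Drive in cleartext.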

I'm not sure I'd back up my files with that method, although it does work.

I really want a local checksum so I can validate things are proper before moving them, but if the files aren't important, that is an option.
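If a local checksum of the streamed archive is what you're after, one way (a sketch, assuming a bash shell with tee and md5sum; the file and remote names are hypothetical) is to hash the stream on the fly while piping it to rclone, then compare against the MD5 that Google Drive stores server-side:

```shell
# Hash the stream via tee/process substitution while rclone uploads it.
7z a dummy -tgzip -so file.ext \
  | tee >(md5sum | cut -d' ' -f1 > local.md5) \
  | rclone rcat gdrive01:backup/file.gz -vv

# Google Drive keeps an MD5 per object; compare it with the local one.
rclone hashsum md5 gdrive01:backup/file.gz
cat local.md5
```

If the two hashes match, the bytes that reached Google Drive are the same bytes 7z produced, without ever staging the archive on disk.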

Me too.

Here is how you can actually verify the tar if you wanted:

Moving 5TB up and 5TB back down to validate it may work, but I'd imagine it isn't really viable.

Thanks a lot for the advice. It doesn't seem to work on Windows, but it shouldn't be a big problem for me to switch to Ubuntu for this.

Is there a way to make it still work on Windows?

Yes, I posted a working example up above.

As for the reason your command failed: need to check the 7z docs for -so.

You are using a Linux-based system like Ubuntu, so I think it works there.
Or did you use Windows to run this command?

rclone v1.65.0
- os/version: Microsoft Windows 11 Pro 22H2 (64 bit)
- os/kernel: 10.0.22621.2715 (x86_64)
- os/type: windows
- os/arch: amd64
- go/version: go1.21.4
- go/linking: static
- go/tags: cmount

Thanks, that's interesting. Did you run it through PowerShell or through CMD?

Yeah, I agree.

I used CMD.