Need help uploading very large folders (larger than free space)

What is the problem you are having with rclone?

I want to upload quite a few large folders to the cloud using rclone. Obviously I need to archive them before uploading since they contain many files, but I also want reasonably good random access into the archives, so I can't use tarballs. Here's what I've tried so far:

  1. Creating zip / 7z / squashfs archives directly into an rclone mount without vfs-cache. This doesn't work because all of these tools require seeking.
  2. Same as 1 but with vfs-cache-mode full/writes. My volume space runs out and no archive is made :(
  3. Zipping and piping into rclone rcat, which is supposed to work but fails due to Info-ZIP bugs: zip crashes on symlinks, and even when the folder contains no symlinks the resulting archives are sometimes corrupt. (The pipeline is sketched below.)
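
For reference, attempt 3 was roughly the following pipeline (Info-ZIP's zip writes the archive to stdout when the archive name is -; the destination path here is just an example):

    cd /path/to/folder
    zip -r - . | rclone rcat gdrive:archives/folder.zip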

So I wonder: is it possible for rclone mount to provide seekability when writing, without storing the entire file locally? I think ncw mentioned something similar a while back: VFS: File bigger than free space - #7 by ncw

Alternatively, does anyone know of an archiving tool that can stream its output properly and still create archives with good random access?

What is your rclone version (output from rclone version)

1.55.0

Which OS you are using and how many bits (eg Windows 7, 64 bit)

CentOS 7.9.2009

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

See above

The rclone config contents with secrets removed.

[gdrive]
type = drive
scope = drive
# switch to chunked uploads for files over 1 GiB
upload_cutoff = 1G
# each upload chunk is buffered in memory per transfer
chunk_size = 1G

A log from the command with the -vv flag

Will provide if needed

hello and welcome to the forum,

for the large folders:
how many total files?
what is the total size of all files?

Thanks for the reply!

There are ~2 million files totaling ~18TB. I have ~800GB allocated space left.

You should upload these using the rclone move or rclone copy commands. Dragging the files onto the mount isn't ideal for your use case: move and copy don't stage files in the VFS cache first, so they use less local storage space. And if you're saying you only have 800GB of cloud storage left, there's nothing rclone can do to give you more space or fit 18TB into 800GB.
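
Something like this, instead of going through the mount (source and destination paths are just placeholders):

    rclone copy /data/folders gdrive:backup --transfers 4 --progress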

You're going to need to write a script that compresses a batch of files, then runs rclone move on the resulting archive; that way only a little local disk space is needed at any time. You can choose to compress 2 or 10 folders at a time, depending on what is most efficient for you.
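
A minimal sketch of such a script, assuming 7z is installed and the data lives in per-folder directories under /data (the paths and the gdrive:archives remote are assumptions):

    #!/bin/bash
    # Compress each top-level folder into its own archive, then upload it.
    # rclone move deletes the local archive once the upload succeeds.
    for dir in /data/*/; do
        name=$(basename "$dir")
        7z a "/tmp/$name.7z" "$dir"
        rclone move "/tmp/$name.7z" gdrive:archives/ --progress
    done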

The only archive format I know of that's good for streaming is CHD; chdman is a tool made for disk images like ISO and BIN/CUE files. It's provided by the MAME project.
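
For example, converting a BIN/CUE image would look something like this (filenames are hypothetical):

    chdman createcd -i game.cue -o game.chd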