Use a tar archive as a backend

I'd like to be able to use a tar archive as a backend for running a 1-way sync to the cloud - i.e. an uncompressed tar being used as a filesystem-like store to sync up to a cloud drive.

My use case is large data imports which come as archives but which should be restored to an S3-like bucket service as individual files - and I'd like to intelligently skip already-present files (e.g. based on size, modification date, etc.)
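To make the skip decision concrete, here's roughly the logic I have in mind (a sketch only - `needs_upload`, `remote_index`, and the listings are made-up names standing in for the tar index and the bucket listing; this mirrors rclone's default size/modtime comparison):

```python
from datetime import datetime, timezone

def needs_upload(entry, remote_index):
    """Copy a tar entry only if it is missing remotely or its
    size/mtime differ from the remote object's."""
    remote = remote_index.get(entry["name"])
    if remote is None:
        return True  # not present on the remote at all
    return (remote["size"] != entry["size"]
            or remote["mtime"] != entry["mtime"])

# Hypothetical listings: entries found in the tar vs. the bucket.
tar_entries = [
    {"name": "a.csv", "size": 10, "mtime": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"name": "b.csv", "size": 20, "mtime": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
remote_index = {
    "a.csv": {"size": 10, "mtime": datetime(2024, 1, 1, tzinfo=timezone.utc)},
}

to_copy = [e["name"] for e in tar_entries if needs_upload(e, remote_index)]
print(to_copy)  # → ['b.csv']  (a.csv is unchanged, so it is skipped)
```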

welcome to the forum,

imo, that is not realistic, given how most cloud providers work.
in addition, rclone copies entire files; it does not support delta copies.
what other cloud copy tools and cloud providers offer a tar backend?


tho, i could be wrong.
rclone is open-source, so you are welcome to add a tar backend, or sponsor the work: https://rclone.org/sponsor/

There was already an attempt to implement something similar:

But it is still waiting for somebody to make it happen. That could be you, @wrouesnel, if it is something you need. I suspect it is an extremely niche feature that very few people would use.

yeah, but i think it lacks the requested feature of syncing up to a cloud drive.

might use rclone mount and whatever program(s) work with tar files.

I saw that work and I think it's targeting a different problem? I have a separate afero-based extension that treats a tar file as a filesystem, since an uncompressed tar file can be iterated relatively efficiently by skipping through header offsets. This works well enough to read through using FUSE.
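To illustrate the header-offset iteration I mean (a minimal standalone Python sketch, not the actual afero/Go code - it parses the 512-byte ustar headers and seeks past each entry's data instead of reading it):

```python
import io
import tarfile

def iter_tar_entries(fp):
    """Walk an uncompressed tar by reading each 512-byte header and
    seeking past the entry's content (rounded up to the block size)."""
    while True:
        header = fp.read(512)
        if len(header) < 512 or header == b"\0" * 512:
            break  # EOF or the zero-block end-of-archive marker
        name = header[0:100].rstrip(b"\0").decode()
        size = int(header[124:136].rstrip(b"\0 "), 8)  # size field is octal
        yield name, size, fp.tell()  # offset where the file data starts
        fp.seek((size + 511) // 512 * 512, io.SEEK_CUR)  # skip data blocks

# Build a tiny in-memory archive to demonstrate:
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w", format=tarfile.USTAR_FORMAT) as tf:
    for fname, data in [("a.txt", b"hello"), ("b.txt", b"world!!")]:
        info = tarfile.TarInfo(fname)
        info.size = len(data)
        tf.addfile(info, io.BytesIO(data))
buf.seek(0)

entries = list(iter_tar_entries(buf))
print(entries)  # each entry: (name, size, offset of its data)
```

Because only headers are read, listing a large archive touches a few hundred bytes per file rather than the file contents, which is what makes the FUSE view usable.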

Sounds like a patch would be accepted here, so I'll see what I can do.

EDIT: the use case is "files in a tar" become "a bunch of files on a cloud drive" without needing my users to do more than run "rclone sync tar:somefile s3:somebucket"

How sophisticated are the users? Potentially they could mount the tar file to the filesystem using something like ratarmount and then rclone sync the visible files.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.