Limit the number of files in a remote

Basically this is just an idea, the reverse of the chunker backend: limit the number of files on the remote (repost).
I have already uploaded 350k files to a Gdrive team drive, and I am looking for a way to pack these files together (e.g. 10-15 files combined into one file).
Will this be possible in 2022?
I know I could use union, but I don't want to; I want a single drive only.
More network overhead is acceptable, I think.

Updates are the problem... Are these files fixed, or do they get updated?

It is OK to just delete and upload again...
That is acceptable in some cases (no editing at all, just uploading new files or deleting old ones).

I made a prototype zip backend, which would let you upload all your files into one zip file. Such zip files can't be updated though so you can't even change one file in it!

Though you could use --compare-dest to make incremental zips starting from that original zip.
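As a rough sketch of how that might look (untested with the prototype zip backend; the paths and file names are only illustrative):

# first, the full archive
rclone copy /path/to/files remote:full.zip

# later: only files not already identical in full.zip get copied into the new zip
rclone copy /path/to/files remote:incremental.zip --compare-dest remote:full.zip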

Useful?

upload all your files into one zip file

Could you make the number of files per zip customizable? For example, I could set how many files are included in each zip file (for better speed).

Such zip files can't be updated though so you can't even change one file in it!

Can't it even be updated by completely deleting the zip file, compressing again, and then uploading it again?

In my case it is just a huge number of small pictures, and they are never edited; I only upload new files or delete old ones.

You could do this when uploading by only selecting part of the data I guess.

You can do that of course!
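A minimal sketch of that cycle (assuming the zip can be deleted as an ordinary file on the underlying remote; names are illustrative):

# remove the old archive, then recreate it from the current source files
rclone deletefile remote:file.zip
rclone copy /path/to/files remote:file.zip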

I guess you could use a union backend so you'd upload a lot of files in a big zip (which you can still access individually) but set that as read only, so you would put updates in a new zip or a new directory. Not sure how deletes would work in that scenario.
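A rough config sketch of that idea (names are illustrative, and whether a zip path can be used as a union upstream is an assumption; the :ro suffix marks an upstream read-only):

[archive-union]
type = union
upstreams = gdrive:big-archive.zip:ro gdrive:updates

New uploads would land in gdrive:updates while the zipped files stay readable, and a delete aimed at the read-only upstream would simply fail, which is the open question.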

If you want to have a go with the zip backend it is here: v1.58.0-beta.5990.02faa6f05.zip-backend

Use it like

rclone copy /path/to/files remote:file.zip

And read it back with

rclone lsf remote:file.zip

You can mount the zip, copy individual files out of it, etc.
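For example (mount point and file name are illustrative, and the path-into-zip syntax is an assumption based on the prototype):

# browse the archive like a directory
rclone mount remote:file.zip /mnt/archive --read-only

# pull a single file back out
rclone copy remote:file.zip/pictures/img_0001.jpg /tmp/restore/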

You could do this when uploading by only selecting part of the data I guess.

I mean, when I have 100 files, I want to divide them into 10 zip files, so that it is 10 zips × 10 files per zip = 100 files.

Not sure how deletes would work in that scenario.
In this case I can just delete a zip file containing only 10 files and re-upload it.
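Something like this is what I have in mind (a sketch only, paths illustrative), driving the batching from the client side with --files-from:

# list all source files and split the list into batches of 10
rclone lsf -R --files-only /path/to/files > all-files.txt
split -l 10 all-files.txt batch-

# upload each batch as its own zip
for b in batch-*; do
  rclone copy /path/to/files "remote:${b}.zip" --files-from "$b"
done

Any single batch zip could then be deleted and re-uploaded on its own, as above.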

It sounds like what you want is an autozipping remote which uploads small files to zip files but big files on their own, with policy about how many files in a zip.

Not impossible to do, but not something we have at the moment.

small files to zip files but big files on their own

No, just put all of them into zip files, whether a file is 50KB or 50MB or 50GB.

with policy about how many files in a zip

Yes, for better performance; at least I wouldn't need to download all the files just to get one of them.

Also, it doesn't have to be zip; since this is just to reduce the file count (in Gdrive), tar would be enough!

I found the exact description of what I want: file merging, organized into blocks for sharding, with an index file mapping each file to its block, loaded into memory when accessing files.

In that case, is it still an unnecessary feature at the moment?
