Help needed to fork rclone sync to google drive

Hi,

I have around 3 TB of backup data that will be synced to Google Drive.
There are millions of files in this backup data, spread across a huge number of folders.
Is there any way to fork the rclone sync process so that multiple rclone sync processes run at the same time and the upload to Google Drive is faster?

Please note that I also use the --backup-dir option for incremental backups.
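For context, a minimal sketch of the kind of command described above. The remote name "gdrive:" and all paths are assumed placeholders, not taken from the post:

```shell
# Sync local backup data to Google Drive; files that would be overwritten
# or deleted are moved into a dated --backup-dir, giving incremental history.
rclone sync /local/backup gdrive:backup \
    --backup-dir "gdrive:backup-archive/$(date +%Y-%m-%d)" \
    --progress
```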

So the challenge isn't rclone, as it can run as many parallel transfers as you want.

The problem is the Google Drive API: you can only create roughly 2-3 files per second, so many small files are particularly bad for the Google Drive API in general.
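To illustrate the distinction, a hedged sketch of the relevant rclone flags ("gdrive:" and the paths are placeholders). Raising --transfers parallelizes uploads, which helps with large files, but each file creation is still paced by the Drive API, so millions of small files gain little:

```shell
# More parallel transfers and checkers speed up large-file throughput;
# --tpslimit caps total API calls per second to stay under Drive's
# rate limits and avoid 403 rateLimitExceeded errors.
rclone sync /local/backup gdrive:backup \
    --transfers 16 \
    --checkers 16 \
    --tpslimit 10
```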

If backup is the goal, the best approach would be to use something in between that bundles things up, such as restic/borg/Arq/Duplicati.
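As one example, picking restic from the list above (restic packs files into larger pack blobs, so Drive sees a modest number of big objects instead of millions of small ones). The repository path "gdrive:restic-repo" is an assumed placeholder; restic can use a configured rclone remote directly as a backend:

```shell
# Initialize a restic repository stored on the rclone "gdrive:" remote,
# then back up the local data; small files are bundled into pack files.
restic -r rclone:gdrive:restic-repo init
restic -r rclone:gdrive:restic-repo backup /local/backup
```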

If the goal is to sync, you are somewhat stuck inside Google's API limits, unfortunately.

Animosity is right on the money here. There are some significant limits on how fast you can create/modify files on Gdrive (about 2/sec). With many small files this will be a significant limiting factor on throughput.

If you really have "millions" of files, that is a LOT of small files, even for 3TB.
I would highly recommend you consider archiving together some of the worst folders if it's data you probably don't need to access individually anyway and mostly want to store. Transferring a handful of larger archives will be orders of magnitude faster than individually copying hundreds of thousands of files.
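A runnable illustration of the archiving idea, using made-up paths and a stand-in set of small files (the "gdrive:" remote in the final, commented step is also an assumption):

```shell
# Create a stand-in folder of 200 small files to represent one of the
# "worst" folders mentioned above.
mkdir -p /tmp/backup/photos-2019
touch /tmp/backup/photos-2019/img-{001..200}.jpg

# Bundle the whole folder into a single compressed archive.
tar -czf /tmp/photos-2019.tar.gz -C /tmp/backup photos-2019

# Upload one archive (one Drive file creation) instead of 200:
# rclone copy /tmp/photos-2019.tar.gz gdrive:backup/archives
```

The trade-off is that you lose per-file access on the remote side, which is why this only suits data you mostly want to store rather than browse.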

You should also be aware that Gdrive has a limit of 400,000 files unless you request more from Google, so "millions" would certainly be an issue in any case. Google says performance is also best if you can stay within 100,000:
https://support.cloudhq.net/google-team-drive-technical-specifications-and-limitations/

Hopefully we will get some features in the near future that can automatically bundle tiny files in a transparent way, which would greatly improve performance as well as make the file limit largely irrelevant. (A compression remote is already in the works, although it currently does not bundle small files.)

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.