Best provider to sync >1TB and thousands of files?

Just wondering what the best provider would be for rclone to back up 1 TB of data and thousands of files daily, with maybe 10 new files and 100 MB added per day. Currently there are some files around 100-500 MB, and a lot of 100-200 KB files.

I was interested in a Google Drive business account. But upon testing it, I found that syncing 200 files that hadn't changed takes 10 seconds, which feels pretty slow. I tried to increase the API quota limit, but it said I needed to attach it to a billed account.

Does anyone know how fast rclone can be made for Google Drive? I am used to rsync, which seems able to sync thousands of files in a few seconds. Or is there a better provider to consider?

Although increasing the quota limit requires attaching a billing account, you will not actually be billed. Below is a screenshot of my quota:

I have no issue syncing/uploading tens of thousands of small files. Notice that my queries per second is 300, which is derived from the 30,000-queries-per-100-seconds quota (30,000 / 100).
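To spell out that arithmetic: Google states Drive API quota per 100 seconds, while rclone's --tpslimit takes a per-second rate, so you divide by 100. A quick sketch of the conversion (30,000 is the quota value from my console):

```shell
# Drive API quotas are stated per 100 seconds; rclone's --tpslimit
# takes transactions per second, so divide the quota by 100.
quota_per_100s=30000          # "Queries per 100 seconds" from the Cloud Console
tpslimit=$((quota_per_100s / 100))
echo "$tpslimit"              # 300
```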

I'm also uploading almost 1 TB per day to G Suite shared drives, but it's all big files. Here are my flags for small files: --tpslimit 300 --fast-list -P
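Assembled into a full command, that might look like the sketch below; the remote name gdrive: and the paths are placeholders, not my actual setup, and --tpslimit should not exceed whatever quota you were approved for:

```shell
# Sketch only: "gdrive:" and the local path are placeholder names.
# --tpslimit caps API calls per second to stay under the approved quota,
# --fast-list reduces the number of listing calls, -P shows progress.
rclone sync /data/backup gdrive:backup \
    --tpslimit 300 \
    --fast-list \
    -P
```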

I have a G Suite account; how do I increase it? Do I have to provide a valid reason? Tell me about your experience.

How much can you increase it by? And how fast is it compared to rsync?

Did you try the --fast-list flag?

Yes, it didn't help. The main issue seems to be that free OAuth clients are limited to 10 operations per second.

If you want to look at other providers:

The fastest providers are the ones you pay for transactions on (AWS S3, GCS, Azure Blob), but they are expensive.

Reasonable-value providers would be B2 or Wasabi, which don't charge per transaction and are reasonably priced.

Are you looking to do a backup?

You might want to consider using restic with rclone which will join small files together.
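restic can talk to any rclone remote directly, and it packs small files into large pack files, which sidesteps per-transaction limits. A minimal sketch, assuming a remote named gdrive: (the repository path is a placeholder):

```shell
# restic packs many small files into large pack files, which suits
# rate-limited backends. "gdrive:restic-repo" is a placeholder path.
restic -r rclone:gdrive:restic-repo init
restic -r rclone:gdrive:restic-repo backup /path/to/data
```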

Yes, it is for backing up crypto coin prices.

Check out Wasabi; I have been using them for many years.
Wasabi is hot storage and does not have all those limits on transactions/sec and so on.
I have a Verizon Fios gigabit connection, and using rclone I can max out that connection.

I attached a free Google Cloud account and upgraded it to a paid account, as it said that's the only way I can request a quota increase.

How long does it take to get the quota upgrade approved and how much can I increase it to?


How did you get google to increase your quota? They are refusing to increase it for me and gave me a long laundry list of questions to answer.

Thank you for your email. We understand that you would like to further evaluate the request as you can see on your end that you are reaching the caps and getting rate limit error. My name is Angelique from Google Cloud Platform Team and I'm glad to assist you further.

At this point, in order for us to better understand your concern, we would like to ask you the following information below, so that we could relay it to our Engineering team:

1. Will you be hitting 80% of your QPS on a daily basis?
2. What is your current error rate %?
3. The number of users.
4. Average number of requests per day/per user (calculation describing your expected usage of the API).
5. Which API methods will be called and what will be the frequency.
6. Will you be polling the Drive API to check if files have been modified?
7. Have you looked into implementing push notifications[1] to save on quota?
8. Have you implemented exponential backoff[2] for your application?

These are some answers I might give for a mount

  1. Yes
  2. You can get this from your Google console
  3. ?
  4. See console
  5. See console for what API methods are called at the moment
  6. No, we use ChangeNotify
  7. No it isn't appropriate for this app.
  8. Yes
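On question 8: rclone already retries with backoff for you (see --retries and --low-level-retries), but the pattern is simple enough to sketch. This is a toy illustration, not rclone's actual code; try_request is a stand-in for a real Drive API call that always fails here, so the delay growth is visible:

```shell
# Toy exponential backoff: double the delay after each failed attempt.
# try_request stands in for a real Drive API call; here it always fails.
try_request() { false; }
delay=1
for attempt in 1 2 3 4 5; do
    try_request && break
    # in real code you would: sleep "$delay"
    delay=$((delay * 2))
done
echo "$delay"   # 32 after five failed attempts
```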

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.