Wasabi Upload Many Small Files Many Directories

good luck......

What should the perfect log options be for the next rclone operation, in case we need to read the entrails?

well, these are your options; you can decide which is perfect...

given the large number of files, perhaps `NOTICE`.
if you need an entry for each file transferred, then `INFO`.

Thank you; the first production copy shall use --log-level DEBUG and stay under 10,000 files.
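The plan above can be sketched as a single command; the source path, remote path, and log-file name are placeholders, not values from this thread:

```shell
# first production copy: full DEBUG logging written to a file for later reading
# (paths and remote name are illustrative placeholders)
rclone copy /path/to/source wasabi-us-east-1:bucket/path \
    --log-level DEBUG --log-file first-copy.log --progress
```

Writing to --log-file keeps the DEBUG entrails out of the terminal and available for later grepping.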

Does the rclone config file support setting global option values (ones that apply to all remotes)?

i think that the answer is no.
for example, what setting?


I tried and failed with:

  • access_key_id
  • secret_access_key

This time I had log_level in mind. Current state is fine and thank you.
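For what it's worth, rclone also reads global options from RCLONE_* environment variables, which may be the closest thing to a config-wide default; a minimal sketch (source and remote paths are placeholders):

```shell
# any global flag can be set via an RCLONE_<FLAG> environment variable;
# this is equivalent to passing --log-level INFO on every invocation:
export RCLONE_LOG_LEVEL=INFO
rclone copy /path/to/source wasabi-us-east-1:bucket/path
```

Putting the export in a shell profile gives all rclone runs the same log level without touching the config file.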

Do --check-first --fast-list --order-by modtime all play well together?

i have never used --order-by or --check-first.

for sure, --check-first and --order-by should work together, as that is documented:
"If you want perfect ordering then you will need to specify --check-first which will find all the files which need transferring first before transferring any."

i see no reason that --fast-list would not play well with that.

if you state what you are trying to do, then perhaps someone can give better advice.
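Assuming the three flags really are wanted together, they combine like this (source path and remote are placeholders; the modtime ordering direction is an assumption):

```shell
# --fast-list: enumerate with fewer API calls (uses more memory);
# --check-first: find all files needing transfer before transferring any;
# --order-by: then transfer in modification-time order:
rclone copy /path/to/source wasabi-us-east-1:bucket/path \
    --fast-list --check-first --order-by modtime,descending
```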

We don't yet know if we will have time to transfer the full 4.2 TiB many-files set:

  • We are considering partial-copy strategies.
  • We already know some of our oldest files have little or no value.
  • We have too many unknowns for an immediate decision; research is in progress.
  • Rclone's filter and ordering options may cover every partial-copy strategy imagined so far.
  • Posing questions ahead of anticipated need is hopefully a schedule accelerator.
  • As copy scenarios come to mind, I quickly post any related questions.
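One partial-copy sketch for the "oldest files have little value" case, using rclone's age filter. The 2-year cutoff and the paths are assumptions for illustration, not a recommendation:

```shell
# copy only files modified within the last 2 years, newest first,
# so the highest-value data lands in the bucket soonest:
rclone copy /path/to/source wasabi-us-east-1:bucket/path \
    --max-age 2y --check-first --order-by modtime,descending
```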

Thank you many times over; you in particular and this outstanding community.

perhaps this is an option.

i have a python script that coordinates all backup needs.
one thing it can do is 7zip a folder and rclone it to cloud.
uploading one large file is much faster than uploading lots of small files.
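The idea above sketched as plain shell, with tar+gzip standing in for 7zip (the archive name, folder, and remote are placeholders):

```shell
# bundle many small files into one archive, then upload the single large file:
tar -czf /tmp/folder-backup.tgz -C /path/to folder
rclone copy /tmp/folder-backup.tgz wasabi-us-east-1:bucket/backups
```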

Violates requirements shared in Post 9:

  • Wasabi bucket is backing data source for BunnyCDN.
  • CDN requires 1:1 mapping from client URL to Wasabi URL.

@asdffdsa Understood and no offense taken whatsoever. :heart_eyes:



This time it is a copy test of two large files:

cpino@marble:~$ ls -l /vmnt/xdw-tgz
total 108655216
-rw-rw-r-- 1 tomcat tomcat   4872364520 Sep  8 02:56 xdw-20200908-final-pca1.tgz
-rw-rw-r-- 1 tomcat tomcat 106390568278 Sep  8 07:22 xdw-20200908-final-resources.tgz

The command is:

rclone copy --progress /vmnt/xdw-tgz wasabi-us-east-1:cs2a-backup/file/xdw-tgz

currently outputting:

Transferred:       44.166G / 103.622 GBytes, 43%, 22.767 MBytes/s, ETA 44m34s
Transferred:            1 / 2, 50%
Elapsed time:      33m6.8s
 *              xdw-20200908-final-resources.tgz: 39% /99.084G, 27.711M/s, 36m37s

Linux top shows steady ~33% CPU busy; is this normal?
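Likely yes: rclone hashes and chunks large files for multipart upload, which costs CPU. If the load ever becomes a problem, the S3-backend tuning flags below are real rclone flags, though the values shown are only illustrative:

```shell
# larger chunks mean fewer multipart parts to hash and track;
# lower concurrency reduces simultaneous CPU/memory pressure:
rclone copy --progress /vmnt/xdw-tgz wasabi-us-east-1:cs2a-backup/file/xdw-tgz \
    --s3-chunk-size 64M --s3-upload-concurrency 2
```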

@asdffdsa @ncw Anything shared will be appreciated. From my last post:

Just being blunt here, but it may be simpler and easier for everyone involved if you just go ahead and pay for a few hours of dedicated time from ncw as a consultant to get all the commands etc. figured out and any questions answered, doubly so if this is as time-critical as you say.

  • I don't have the authority to pay out company funds.
  • I don't run the company; I am a contractor myself.
  • I asked my boss to donate long ago for the good will shown.
  • I do contribute my personal time as gratitude in kind.
  • Today looks like a different picture, with far less good will.
  • @ncw can deliver time; nobody can guarantee outcomes.
  • I know my boss won't proceed without binding legal commitments.
  • Worse, we don't have much time for drafting legal boilerplate.
  • IMO, the more good I can report to my boss, the better the case for donations.
  • Meanwhile I'd like the same support as anyone; no better, no worse.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.