Folder with millions of files

Yeah, I’m inclined to write some code to move things around and update the database. If all those files were spread across a few hundred thousand folders instead, would that make migrating/backing up with your tool easier and less resource-hungry? (BTW, thank you very much for your work! Really useful code!)
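
Something like this is what I have in mind for the move, in case it helps to be concrete. It’s only a rough sketch: the paths, the SQLite catalog, and the `files` table with a `path` column are all made-up names standing in for whatever the real schema is. It buckets files by a hash prefix of the filename, so the prefix length controls how many folders you end up with (2 hex chars → 256 buckets, 4 → 65,536, and so on):

```python
import hashlib
import sqlite3
from pathlib import Path

SRC = Path("/data/flat")      # hypothetical flat folder holding the millions of files
DST = Path("/data/sharded")   # hypothetical root for the new bucket folders
DB = "/data/catalog.db"       # hypothetical SQLite catalog; table/column names are guesses

def bucket_for(name: str) -> str:
    # Two hex chars of an MD5 of the filename -> 256 buckets.
    # Widen the slice ([:3], [:4], ...) for 4,096 or 65,536 buckets.
    return hashlib.md5(name.encode()).hexdigest()[:2]

def migrate() -> None:
    conn = sqlite3.connect(DB)
    cur = conn.cursor()
    moved = 0
    for entry in SRC.iterdir():  # streams directory entries, never builds one huge list
        if not entry.is_file():
            continue
        bucket = DST / bucket_for(entry.name)
        bucket.mkdir(parents=True, exist_ok=True)
        dest = bucket / entry.name
        entry.rename(dest)  # same filesystem, so this is a cheap metadata-only move
        cur.execute("UPDATE files SET path = ? WHERE path = ?",  # wants an index on path
                    (str(dest), str(entry)))
        moved += 1
        if moved % 10_000 == 0:  # commit in batches so a crash loses little progress
            conn.commit()
            print(f"moved {moved} files", flush=True)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    migrate()
```

Since source and destination are on the same filesystem, each move is just a rename, so the whole thing should be metadata work rather than copying bytes around.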

I tried to wait, but since the network usage graph indicated that no calls were being made, I assumed waiting longer would be useless. If it were an insufficient-memory issue, the command would always crash, right? (That happened before, so I upgraded the EC2 instance and added some swap space.)

Yeah, I’m using EC2 for this.

I like this approach, but you said I should see some output immediately, right? I’m more than an hour in and not a single line has been written to the file. The system is idle, as if nothing is happening.
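
To be clear about what I mean by “idle”, this is the kind of check I’m relying on beyond the network graph; a rough sketch (`psutil` is a third-party package, and the PID is whatever your process manager reports for the command):

```python
import sys
import psutil  # third-party: pip install psutil

# Poll a PID's CPU and disk I/O counters so a truly stuck process can be
# told apart from one that is working quietly without writing any output.
proc = psutil.Process(int(sys.argv[1]))
prev = proc.io_counters()
while True:
    cpu = proc.cpu_percent(interval=5)  # averaged over a 5-second window
    cur = proc.io_counters()
    print(f"cpu={cpu:.1f}%  read=+{cur.read_bytes - prev.read_bytes} B  "
          f"write=+{cur.write_bytes - prev.write_bytes} B", flush=True)
    prev = cur
```

Everything stays flat at zero: no CPU, no reads, no writes.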

This is a one-off.