Let me start by saying I love rclone, it's an amazing tool. I've been using it to transfer large numbers of files from my users' Dropbox, Drive, Box etc. to my own S3 bucket. The way I've been doing it is by having the user pass me an access token, calling "rclone ls" on their files (through a remote I create for them), and then using Promise.all to call "rclone copyto user-drive-remote:path/file1 my-s3-remote:file1-uniqueId" on every file in one go (I use Node).
This is where the problems start. It's been working great with very small numbers of files, but when a slightly larger folder (300 MB+) gets involved, it starts giving me a few errors:
1. Drive says I submitted 735 requests and that I have exceeded my daily quota. There are no more than 50 files, so this is very weird.
2. Memory usage gets maxed out. I'm using AWS Lambda for this, and the maximum memory I can allocate is 3 GB. Should I have my Lambda function invoke a separate Lambda function for each file?
3. I get a MaxListenersExceededWarning. So far I've just hacked around it like this:

```js
require('events').EventEmitter.defaultMaxListeners = 100;
```
I've attached a gist below if you want to take a look at the code (minus the sensitive info):
Any help/tips/advice on this matter would be incredibly appreciated. Would love to hear your feedback. Thanks!!!