The original set of files needed from the gdrive directory is much smaller than the directory itself, but still large enough that I don't want to run Rclone a couple thousand times. This is still true halfway through the transfer, too.
I was asking more about what is generating the file list.
You’d finish much, much quicker by looping through it than by letting it run this way.
I did something easy like
while read -r i; do rclone copy "GD:tt/$i" /home/felix/zz -P; done < from
I was trying to understand why you can’t filter/regex it and do it that way rather than dumping a file list. Filtering handles it much better.
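A filter-based run might look something like this sketch (paths borrowed from the loop above; the include pattern is just a placeholder, since the actual selection criteria weren't given):

```shell
# Single rclone run that walks the remote once and copies only
# matching files, instead of launching one process per file.
rclone copy GD:tt /home/felix/zz --include "*.mkv" -P
```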
Honestly, it never occurred to me to use filters instead of --files-from. I still think the --files-from behavior is a bug, and I’m wondering if I should open an issue on GitHub.
All that being said, --include-from works perfectly and starts pulling files almost immediately. Thanks!
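For anyone landing on this thread later, the working form was presumably something like the following (same remote and destination as the loop earlier; `from` is the generated list, one pattern per line):

```shell
# One rclone invocation reading include patterns from the list
# file, rather than one rclone process per file in the list.
rclone copy GD:tt /home/felix/zz --include-from from -P
```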
To answer an earlier question: the original include-from file was generated dynamically from a secondary database.
I’d probably open an issue on it, as it doesn’t seem to work well with a large number of files.
Ah, I think this is related to
There is a tentative fix in that thread, but I was trying to think of a better one!