rclone v1.39 - Linux (Debian 9)
Currently, rclone ignores limits set by the FTP host or admin on concurrent sessions (how many connections a client may open at once, e.g. to list directories or download files). Rclone's recursive lister cripples any operation by crawling all directories at once, en masse, as it encounters them, instead of respecting the limit and rationing the available sessions between listing and the necessary transfers.
Where can I reproduce it?
pass: (no password; separately, rclone does not accept an empty password for FTP remotes, which is another issue)
The server in question restricts concurrent sessions to 2. Logically, rclone should allocate one session to transfers and the other to crawling, and once crawling is done, use the freed session for transferring as well.
2018/03/05 17:17:48 ERROR : OST & J-Music/OST’s/Lossy/collect_sd2.m3u: Failed to copy: failed to open source object: open: ftpConnection Login: 421 Not logged in, only 2 sessions from same IP allowed concurrently.
Implement a flag that restricts the total number of concurrent sessions to a given number, or add an option to crawl first and download later.