[FTP] rclone ignores concurrent session limit

Current version/platform
v1.39 - Linux Debian 9

Issue?
Currently, rclone ignores limits set by the FTP host or admin on concurrent sessions (the number of connections that can be open at once, e.g. for crawling a directory or downloading). rclone's crawler cripples any operation by recursively listing all directories at once as it encounters them, en masse.

Expected behaviour
rclone should respect the session limit, rationing the available sessions among the necessary transfers, directory listings, etc.

Where can I reproduce it?
FTP: 90.188.39.6
login: anonymous
pass: (no password; separately, rclone doesn't accept an empty password for FTP remotes, which is another issue)

The server in question restricts the number of concurrent sessions to 2. Logically, rclone should allocate one session to transfers and the other to crawling, and once crawling is done, use the freed session for transferring as well.

Log excerpt
2018/03/05 17:17:48 ERROR : OST & J-Music/OST’s/Lossy/collect_sd2.m3u: Failed to copy: failed to open source object: open: ftpConnection Login: 421 Not logged in, only 2 sessions from same IP allowed concurrently.

Proposed solution
Implement a flag that restricts the total number of concurrent sessions to a given number, or add an option to crawl first and download later.
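As a sketch of what such a flag could look like: newer rclone releases do ship a `--ftp-concurrency` option that caps the number of simultaneous FTP connections (it was not available in v1.39). The remote name `myftp:` below is illustrative, not from the original report:

```shell
# Hypothetical usage of a session-cap flag; in current rclone this
# exists as --ftp-concurrency. Cap all FTP connections at 2 so the
# server's per-IP session limit is never exceeded:
rclone copy myftp:/some/dir /local/dest --ftp-concurrency 2 -v
```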


You can control the concurrency of rclone with the --transfers and --checkers flags. Try --transfers 1 --checkers 1 and see if that helps.
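For example, assuming an FTP remote already configured under the name `myftp:` (the name and paths are illustrative), a single transfer plus a single checker should keep rclone within the server's 2-session limit:

```shell
# One transfer worker and one checker worker, so at most two
# FTP sessions are open against the server at the same time:
rclone copy myftp:/some/dir /local/dest --transfers 1 --checkers 1 -v
```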