Is it possible to skip files which are rate limited at the source?
I'm syncing 4000 files with sizes between 100 MB and 2 GB. I'm not the only one syncing from the source, so the files frequently get rate limited. Right now I start the sync job with
-v --bwlimit 8M --checkers 8 --transfers 10 --fast-list --timeout 300s --drive-pacer-min-sleep 100ms --tpslimit 2 --low-level-retries 3
Are you using Google Drive?
Did you make your own API / client ID?
You need to turn down your checkers and transfers to keep the total closer to Google's 10 requests per second, so try maybe 5/5.
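As a sketch, the same flags from the original post with checkers and transfers dialed down to 5/5 might look like this (`src:` and `dst:` are placeholder remote names, not from the original post):

```shell
# Reduce --checkers and --transfers so the combined request rate
# stays under Google Drive's per-user limit (~10 requests/second).
rclone sync src: dst: \
  -v --bwlimit 8M \
  --checkers 5 --transfers 5 \
  --fast-list --timeout 300s \
  --drive-pacer-min-sleep 100ms \
  --tpslimit 2 --low-level-retries 3
```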
Hi, thanks for the reply. Yes, it's Google Drive, and yes, I'm using my own API.
That would be relevant if my destination drive got rate limited, but my source is a drive my friends and I are syncing from.
It’s very relevant, in fact more relevant.
Are you using your own API to connect to it? Is it shared? What's he doing at the same time? You can only do 10 per second on either side.
To be specific, you can set retries to something like 1 and that would skip it.
--retries int Retry operations this many times if they fail (default 3)
It tries 3 times by default.