It never seems to reply or return any data.
The strange thing is that listing and syncing other folders of the same site, from the same computer, works quickly and without problems.
Getting the folder listing in my internet browser just takes a few seconds.
Since using --http-no-head has some drawbacks, another option is to speed everything up by massively increasing the number of checkers (they work in parallel):
time rclone ls --config /dev/null --http-url https://r2u.stat.illinois.edu/ubuntu/pool/dists/jammy :http: --checkers 256
944 main/bioc-api-package_0.1.0-1.2204.1_all.deb
295922 main/r-bioc-a4core_1.52.0-1.ca2204.1_all.deb
1054580 main/r-bioc-affxparser_1.76.0-1.ca2204.1_amd64.deb
...
19188 main/r-cran-ztype_0.1.0-1.ca2204.1_all.deb
479532 main/r-cran-zvcv_2.1.2-1.ca2204.1_amd64.deb
38452 main/r-cran-zyp_0.11-1-1.ca2204.1_all.deb
49048 main/r-cran-zzlite_0.1.2-1.ca2204.1_all.deb
real 0m20.311s
user 0m6.017s
sys 0m4.347s
20s for 25k files is acceptable, I think.
If not, then 1024 checkers finish the job in 8s for me :)
Be warned that if you keep listing like this all the time, your IP can be blacklisted by their systems. Of course it depends on what they have set up, but such a massive flood of requests looks very similar to a DDoS of some sort :)
It would make sense to list all this content once and then use it locally - maybe you can use rclone mount and increase --dir-cache-time to "forever", something like 9999h. At least this is how I would work with such a source. A rough sketch is below.
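Something along these lines (the mount point /mnt/r2u is just an example - adjust to your setup; the other flags mirror the ls command above):

rclone mount :http: /mnt/r2u \
  --config /dev/null \
  --http-url https://r2u.stat.illinois.edu/ubuntu/pool/dists/jammy \
  --dir-cache-time 9999h \
  --read-only \
  --daemon

After the first listing, directory contents are served from the cache, so repeated browsing does not hammer the server again.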