
Too many concurrent requests trigger DoS protection #190

Closed
flit opened this issue Apr 12, 2022 · 0 comments · Fixed by #205

Comments

flit (Member) commented Apr 12, 2022

CPM is too aggressive in parallelizing the index download. While this is great for index update performance, it can trigger the DoS protection of the servers' CDNs, causing the requests to be blocked. Only the Keil and NXP CDNs seem to respond this way (they both use Akamai).

Specifically, the request queue can produce many concurrent requests targeted at a single server. If the number of concurrent requests exceeds the maximum allowed per source IP, the requests are blocked and a 403 or other error is returned. Once the DoS protection is triggered, all new requests from that source IP are typically blocked for a timeout period.

This is made far worse by #162 and #155: because the responses are not checked, the HTML error page in the response body is saved to the .pdsc index file (and then you get PDSC parse errors, of course).
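For illustration only (this is not the fix in #162/#155, and the helper name and checks are assumptions), a download path that validates the response before writing the .pdsc file would avoid saving a CDN error page as pack index XML:

```python
from pathlib import Path

import requests  # assumed available for this sketch


def save_pdsc(url: str, dest: Path) -> None:
    # Hypothetical helper: fetch a .pdsc and refuse to save anything
    # that is not an XML document (CDN error pages are HTML).
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()                     # surfaces 403s from the DoS protection
    body = resp.content
    if not body.lstrip().startswith(b"<?xml"):  # crude check, good enough for a sketch
        raise ValueError(f"{url} did not return a PDSC/XML document")
    dest.write_bytes(body)
```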

The solution is to throttle the number of concurrent requests. It would be nice to have a maximum per server domain, but limiting the total number is a good first step that would solve the issue.
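As a rough sketch of the idea (not CPM's actual implementation; the limits, names, and use of `requests` here are assumptions), the total number of in-flight requests can be capped with a bounded worker pool, and an optional per-domain semaphore keeps any single CDN from seeing too many simultaneous connections:

```python
import threading
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlsplit

import requests  # assumed available; any HTTP client works

MAX_TOTAL = 8        # total concurrent requests (hypothetical limit)
MAX_PER_DOMAIN = 4   # concurrent requests per server domain (hypothetical limit)

_domain_sems: dict[str, threading.Semaphore] = {}
_sems_lock = threading.Lock()


def _domain_sem(url: str) -> threading.Semaphore:
    # One semaphore per server domain, created lazily.
    host = urlsplit(url).netloc
    with _sems_lock:
        return _domain_sems.setdefault(host, threading.Semaphore(MAX_PER_DOMAIN))


def fetch(url: str) -> bytes:
    # The executor below already caps total concurrency; the per-domain
    # semaphore additionally limits how many requests one CDN sees at once.
    with _domain_sem(url):
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()  # surface 403s instead of saving an error page
        return resp.content


def download_all(urls: list[str]) -> list[bytes]:
    with ThreadPoolExecutor(max_workers=MAX_TOTAL) as pool:
        return list(pool.map(fetch, urls))
```

Even the simpler variant of this, dropping the per-domain semaphore and only bounding the pool size, should be enough to stay under the CDN's per-IP limit.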

A while back I wrote the cmsis-pack-index-monitor script to explore this issue outside of CPM. You can use it to see the problem for yourself and experiment with a maximum number of jobs.

flit changed the title from "Throttle concurrent requests" to "Too many concurrent requests trigger DoS protection" on Apr 12, 2022
flit closed this as completed in #205 on Jan 28, 2023