On Monday, April 24th we’ll adjust rate limits on some SonarCloud APIs to ensure we can continue to operate the service smoothly and with optimal performance.
There should be no impact on routine analysis.
The primary impact will be on scripts that rapidly make repeated API requests. Unfortunately, users who share a proxy with such a script may also be affected when using the SonarCloud UI. If you find yourself in this situation, please wait a few minutes and try again. If you own an impacted script, consider adding some sleeps to your loops.
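"Adding some sleeps" can be as simple as backing off and retrying when a request is rejected. Here's a minimal sketch; the `do_request` callable is a stand-in for whatever HTTP client your script uses, and HTTP 429 is the conventional rate-limit status (the announcement doesn't specify what SonarCloud actually returns):

```python
import time

def call_with_backoff(do_request, max_retries=5, base_delay=1.0):
    """Call do_request(); if the response is rate-limited (HTTP 429),
    sleep with exponential backoff and try again.

    do_request is any function returning (status_code, body).
    """
    for attempt in range(max_retries):
        status, body = do_request()
        if status != 429:
            return status, body
        time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    return status, body

# Demo with a fake endpoint that rejects the first two calls:
calls = {"n": 0}
def fake_request():
    calls["n"] += 1
    return (429, "") if calls["n"] <= 2 else (200, "ok")

status, body = call_with_backoff(fake_request, base_delay=0.01)
```

The exponential backoff keeps a rejected script from hammering the API in a tight retry loop, which is exactly the behavior the limits are meant to discourage.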
Per my prior post, is the actual rate limit documented? For example, GitHub documents its API rate limits, as well as how to check current usage and limits. It would be quite useful to know the limits so that we can take them into account when designing our internal tools.
The rates aren’t currently documented, but that’s an excellent point and we’ve put it on the list. Just so you know, the changes we’re implementing are in response to some abuse, and it’s likely we’ll adjust them further going forward. That said, if you can keep it under 1k requests per minute, you should be good.
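Since the limits are undocumented and may change, it's safer to pace requests client-side rather than react to rejections. A simple sketch of that idea (the class and its budget are my own construction, not anything SonarCloud provides; the ~1k/minute figure comes from the guidance above, and leaving headroom below it seems prudent):

```python
import time

class RequestPacer:
    """Spread requests evenly so they stay under a per-minute budget."""

    def __init__(self, budget_per_minute=600):  # headroom below ~1k/min
        self.min_interval = 60.0 / budget_per_minute
        self.last_call = None

    def wait(self):
        """Block just long enough to honor the budget, then record the call."""
        now = time.monotonic()
        if self.last_call is not None:
            elapsed = now - self.last_call
            if elapsed < self.min_interval:
                time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

pacer = RequestPacer(budget_per_minute=600)  # at most ~10 requests/second
# for item in work_items:
#     pacer.wait()
#     ... make the API call ...
```

Calling `pacer.wait()` before each request caps the steady-state rate without needing to know the server's exact limits.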
I’m 100% supportive of rate limits; a lot of engineers forget that SaaS is backed by real hardware, even if it’s abstracted behind containers, clouds, or other scaling solutions.
At the moment our API calls to SonarCloud are triggered only once a week, and they should total only a few dozen calls, maybe a few hundred, per run.