SonarQube analysis takes a seemingly random amount of time regardless of the number of changed files

Must-share information (formatted with Markdown):

  • which versions are you using (SonarQube, Scanner, Plugin, and any relevant extension) : 9.9 LTS Enterprise
  • how is SonarQube deployed: zip, Docker, Helm : Azure App Service PremiumV3 (P3v3)
  • what are you trying to achieve : Make our analysis faster

I’ve noticed that SonarQube analysis takes a seemingly random amount of time. For example, a pull request with only a few changed files (3-4) still takes half an hour, while other runs with 50-100 changed files also take about half an hour, which is strange. I would expect runs with fewer changed files to finish more quickly, but the number of files doesn’t seem to matter. Is it scanning all files instead of only the incremental changes? What is the recommended practice to limit the analysis to incrementally changed files?


Hey there.

Where is the time taking place? Scanner-side (in your CI, executing the rules against your code), or server-side (processing the analysis report being sent by the scanner to SonarQube)?

On the sonar scanner side. There is no single item in the scan that takes particularly long; the time is spread across the whole scan…

What language(s) are you analyzing?

.NET code

Thanks!

If you’re using the latest version of the SonarScanner for .NET, you will be able to take advantage of new caching features that prevent files from being analyzed unnecessarily (particularly unchanged files).
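
As a rough illustration, here is a minimal sketch of a typical pull-request analysis with the SonarScanner for .NET on a Linux CI agent. The project key, host URL, token variable, and branch names are placeholder assumptions; the `sonar.pullrequest.*` parameters are what scope the results to the changed code, and a recent scanner version lets the server-side analysis cache skip full re-analysis of unchanged files.

```
# Hedged sketch of a PR analysis run; all keys, URLs, and variable names
# below are illustrative placeholders, not values from this thread.
dotnet sonarscanner begin \
  /k:"my-project-key" \
  /d:sonar.host.url="https://sonarqube.example.com" \
  /d:sonar.login="$SONAR_TOKEN" \
  /d:sonar.pullrequest.key="$PR_NUMBER" \
  /d:sonar.pullrequest.branch="$PR_SOURCE_BRANCH" \
  /d:sonar.pullrequest.base="main"

# Build between begin and end so the analyzers run as part of the build.
dotnet build --no-incremental

dotnet sonarscanner end /d:sonar.login="$SONAR_TOKEN"
```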

Beyond that, I can suggest this guide for diagnosing performance issues:
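
In the meantime, one quick way to see where the time goes is to enable verbose scanner logging. This is a minimal sketch, reusing the placeholder values from the example above; `sonar.verbose=true` raises the log level so the analysis log shows per-sensor detail and makes it easier to spot a step that dominates the run.

```
# Hedged sketch: add verbose logging on the begin step to get detailed
# (DEBUG-level) analysis logs; placeholders as in the previous example.
dotnet sonarscanner begin \
  /k:"my-project-key" \
  /d:sonar.host.url="https://sonarqube.example.com" \
  /d:sonar.login="$SONAR_TOKEN" \
  /d:sonar.verbose=true
```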