Analysis Takes More Than 3 Hours for an XL Project on Azure DevOps Pipeline

  • versions used: SonarQube EE v9.3.0, Sonar Scanner on Azure DevOps (latest)
  • error observed:
2022-02-16T23:18:57.1896132Z [command]E:\Agents\Agent1\_work\_tasks\SonarQubePrepare_15b84ca1-b62f-4a2a-a403-89b77a063157\4.28.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end
2022-02-16T23:18:57.3998675Z SonarScanner for MSBuild 5.5.3
2022-02-16T23:18:57.4006397Z Using the .NET Framework version of the Scanner for MSBuild
2022-02-16T23:18:57.4559576Z Post-processing started.
2022-02-17T02:21:05.4433730Z INFO: Analysis total time: 3:01:45.163 s
2022-02-17T02:21:05.4528836Z INFO: ------------------------------------------------------------------------
2022-02-17T02:21:05.4531851Z INFO: EXECUTION SUCCESS
2022-02-17T02:21:05.4533021Z INFO: ------------------------------------------------------------------------
2022-02-17T02:21:05.4534013Z INFO: Total time: 3:01:51.853s
2022-02-17T02:21:11.4567487Z INFO: Final Memory: 3043M/8192M
2022-02-17T02:21:11.4568169Z INFO: ------------------------------------------------------------------------
2022-02-17T02:21:12.9159603Z The SonarScanner CLI has finished
2022-02-17T02:21:12.9165186Z 05:21:12.914  Post-processing succeeded.
2022-02-17T02:21:12.9548379Z ##[section]Finishing: Run Code Analysis
  • steps to reproduce:
  1. Trigger an analysis for the XL Project (> 5.6M LoC) in Azure DevOps
  2. The “Run Code Analysis” step in the Build Pipeline takes more than 3 hours

Hi @aozmez. Could you share the verbose logs for the “Run Code Analysis” step, please?

To generate verbose scanner logs, set sonar.verbose=true as an Additional Property in the “Prepare” step.
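For a YAML pipeline, this might look like the sketch below (the service connection name, project key, and task version here are placeholders; adjust them to your setup):

```yaml
- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'MySonarQubeServiceConnection'  # placeholder service connection name
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'               # placeholder project key
    extraProperties: |
      sonar.verbose=true
```

In the classic (visual) editor, the same property goes into the “Additional Properties” text box of the “Prepare Analysis Configuration” task.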


Hi @duncanp, I am sharing two ZIP files containing the latest Run Analysis logs, generated with the “sonar.verbose=true” setting.
Thanks.

Thanks @aozmez.

The problem seems to be in the security analysis rules, which are taking nearly two-thirds of the time.

I’ve asked someone from the Security analyzer team to take a look.

Thank you.
I will be waiting for their response.


After some private discussion with @aozmez, I was able to reproduce part of the issue. It seems that there are two different problems at play here:

  1. According to the logs, it takes ~40 minutes to perform a simple deserialization step for the UCFGs before taint analysis. This is something I could not reproduce, and it appears to be caused by slow I/O on the build server. Looking at other parts of the log, it takes ~25 minutes to compress 668 MB for the final report, which is also very long. We’ve seen cases before where I/O during analysis on Windows machines was extremely slow because antivirus software was scanning every file. I would suggest disabling any antivirus solution (including Windows Defender) that might be running on the build server during the scan.
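To check whether slow I/O on the build server is really the bottleneck, a quick sequential-write benchmark on the agent’s work drive can help before and after changing antivirus settings. A minimal sketch (the `disk_write_speed` helper and the probe file path are mine, not part of the scanner):

```python
import os
import tempfile
import time


def disk_write_speed(path: str, size_mb: int = 64) -> float:
    """Write `size_mb` MB to `path` and return throughput in MB/s."""
    chunk = os.urandom(1024 * 1024)  # 1 MB of random data (defeats compression)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data to disk, not just the OS cache
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed


if __name__ == "__main__":
    # Point this at the agent's work directory (e.g. E:\Agents\Agent1\_work)
    # to measure the drive the scanner actually uses.
    target = os.path.join(tempfile.gettempdir(), "io_probe.bin")
    print(f"sequential write: {disk_write_speed(target):.1f} MB/s")
```

If the number jumps significantly with real-time antivirus scanning disabled (or with the agent’s work folder excluded), that points strongly at on-access scanning as the cause.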

  2. There is also a performance problem on our side, which causes slow pre-processing of the deserialized UCFGs before taint analysis runs. I was able to reproduce this and found its root cause. I’ve created a ticket to track the problem on our side, and the fix is on our roadmap for SonarQube 9.4.