[URGENT] Analyzing functions never stops

Hello Everyone,

Since yesterday, our CI pipelines have started to hang; they get stuck on this line during the SonarCloud analysis.
We have a multi-module Maven Java project with this configuration:

[INFO] 13:55:14.519 Sensor AWS SAM template file sensor [security] (done) | time=1ms
[INFO] 13:55:14.519 Sensor javabugs [dbd]
[INFO] 13:55:14.519 Reading IR files from: …/target/sonar/ir/java
[INFO] 13:55:14.870 Analyzing 12531 functions to detect bugs.

Command included in the Bitbucket pipelines (running on Bitbucket Cloud):

mvn -P sonarcoverage -Dsonar.login=[valid-user-token] org.sonarsource.scanner.maven:sonar-maven-plugin:sonar
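For context, a sketch of how this step could be wrapped with an explicit JVM memory setting (the MAVEN_OPTS value below is only an illustration, not what our runner necessarily uses; the profile and token placeholder are the same as in the command above):

# Analysis step roughly as executed by the Bitbucket Pipelines runner
export MAVEN_OPTS="-Xmx4g"
mvn -P sonarcoverage \
    -Dsonar.login=[valid-user-token] \
    org.sonarsource.scanner.maven:sonar-maven-plugin:sonar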

Version used:

This is located in the root project:
[screenshot of the configuration, not reproduced here]

This is included in the it-coverage-aggregation module:
[screenshot of the configuration, not reproduced here]

Can you please give us more insight into what could be causing this issue?
Thanks in advance.


Hi,

We deployed a new analyzer for Java.
We expect it to detect new kinds of bugs, and, of course, that comes with some impact on analysis performance.
Based on our tests, the impact should be less than 10% for most projects.
We would be really interested to know more about your project and to investigate it.

If that’s blocking you, you may use an internal property to disable the new analyzer; its name is sonar.internal.analysis.dbd.
When calling the mvn command, you can add:

-Dsonar.internal.analysis.dbd=false
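For example, starting from the command shared in the original post, the full invocation becomes (the sonarcoverage profile and the token placeholder are taken from that post and are not required for the property itself):

mvn -P sonarcoverage \
    -Dsonar.login=[valid-user-token] \
    -Dsonar.internal.analysis.dbd=false \
    org.sonarsource.scanner.maven:sonar-maven-plugin:sonar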

One of our projects is also timing out (even after increasing the timeout to 90 minutes) when performing Maven-based analysis from CircleCI. Overriding the property worked for us.
If there are any details you would like on the project to help develop the analyzer, please let me know.

Just as input: we are seeing the same issue with Gradle in a Bitbucket pipeline.
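In case it helps other Gradle users, the same workaround can be passed on the command line (a sketch, assuming the org.sonarqube Gradle plugin, whose analysis task is typically called sonarqube; adjust the task name to your plugin version):

./gradlew sonarqube -Dsonar.internal.analysis.dbd=false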

When there is a new analyzer, this information should be a prominent part of the log output, along with how to revert. I can tell you that when CI builds stop working and nobody on our end changed anything that would cause it to break, having to post here and hope that someone answers is not a great user experience.

Same issue for our Java Maven projects. Please inform us as soon as it is fixed.

Hello everyone,

Thanks for reporting those performance problems.
If you want to help us investigate:

  • Please share with us the memory settings of your CI machine
  • If your project is open source, please put a link to it so that we can directly reproduce the performance problem
  • If your project is private, you could privately share with us the content of the generated ir/java folder (you should see the full path in the analysis logs, something like Reading IR files from /agent/_work/53/s/target/sonar/ir/java); see the sketch after this list for one way to bundle it
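For example, on a Linux CI agent you could collect that information and bundle the folder with something like the following (a rough sketch; the target/sonar path is the one printed in your own analysis log and may differ per module):

# Memory available on the CI machine and the JVM options in effect
free -h
echo "MAVEN_OPTS=$MAVEN_OPTS"

# Bundle the generated IR files so they can be shared privately
# (use the path printed after "Reading IR files from" in your log)
tar czf ir-java.tar.gz -C target/sonar ir/java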

We are also running into this timeout issue. I can repro this locally as well. Please let me know privately where I should share logs, etc.

Thanks @jnewton03 ! I will contact you privately

Is there any news? We haven’t edited our Bitbucket pipelines in a long time, and suddenly they are hanging at the analysis step.

Hi @rodergas ,

We are still investigating the performance problem and will keep you updated as soon as we have more information.

For the time being, I’d suggest using an internal property to disable the new analyzer:

-Dsonar.internal.analysis.dbd=false


We are also facing the same issue. We have 11 pull requests initiated by Snyk that each ran for the default timeout of 360 minutes; that used up all our free minutes, and even after subscribing to another 4000 paid minutes the issue still persisted. We were finally able to get the analysis to run by adding -Dsonar.internal.analysis.dbd=false to the mvn command.

What is the root cause of the issue? Do we need to keep applying the workaround above, or will a fix be released in the future? How can we report this to SonarCloud so it gets fixed?

We have now identified the cause of the problem and are working on a fix. We don’t yet have an ETA, but it won’t be this week. We will post an update in this thread when the fix is ready.

Again we thank you for your reports and your patience.


I have also been experiencing the same issue since 05/09, and I’m glad I found this thread, as I initially thought it was an issue with the GitHub build machines or with my code base.

Here are the logs showing how long it takes :slight_smile:

2022-05-19T07:52:56.2104983Z Analyzing 19167 functions to detect bugs.
2022-05-19T13:18:13.5079766Z Sensor javabugs [dbd] (done) | time=19519202ms

2022-05-19T13:19:23.7984162Z :sonarqube (Thread[Daemon worker,5,main]) completed. Took 5 hrs 34 mins 12.043 secs.

I confirm that this fixed the reported issue; my build now takes less than 20 minutes instead of 5 hours:

-Dsonar.internal.analysis.dbd=false

A fix has now been deployed to SonarCloud. We would appreciate it if everyone who has been affected by the problem could remove the -Dsonar.internal.analysis.dbd=false option and let us know whether you’re still experiencing any issues.

Thank you again for your reports and your patience.


I just launched a build and it seems to be resolved for me (i.e. my build now takes 20 minutes, like it did weeks ago, instead of 5 hours).


That’s good to hear. Thanks for letting us know.

I enabled the analyzer and the analysis looks OK now with the same duration. Thanks.


I am encountering the same issue. Our build started taking 4+ hours, getting stuck while analyzing tens of thousands of files under sonar/ir/java.