Hi, I’m trialing SonarCloud with a big repository of roughly 1.2 million lines of code. That count is already after exclusions and other filtering of unwanted files.
When I run Automatic Analysis (the GitHub integration), it always fails with a memory error. I decided to run the initial analysis on my computer, push it to Sonar, and then turn Automatic Analysis back on (for PRs etc.). The problem is that the next time there is a code change in the repository, it runs the full analysis again and fails the same way.
Is there a way to tell Sonar to use the “cache” from my previous analysis I ran locally?
Or does our current setup mean we cannot use the automatic analysis?
Actually, every analysis is a full analysis. The initial analysis takes longer than later ones because of a one-time computation of metadata (mostly git blame). But that computation is not memory-intensive, so subsequent analyses are likely to use a similar amount of memory to the initial one. In short, doing the initial analysis yourself does not help much in your case (it was a nice idea, though).
No, there isn’t yet, and there won’t be in the near future.
We may consider increasing the memory available to the service. How much memory does it use when you analyze locally?
High memory use is sometimes a bug in one of the analyzers, and it may be possible to deactivate specific rules to suppress the problem. We can try to identify the rule in question if you tell us the ID of an analysis that ran out of memory. (Success is not guaranteed, but we can try.)
The ID of the last automatic analysis is AXQssDCMUXDeGoJOT2pe. I’m not 100% sure that insufficient memory is the culprit here, despite the fact that the error explicitly says: “Your analysis with ID "AXQssDCMUXDeGoJOT2pe" has failed: the analysis of your project exceeded the maximum memory available for Automatic Analysis. See troubleshooting documentation”.
When I ran the analysis locally on my computer, I bumped up the memory quite a bit:
And every time it took over 30 minutes to complete. While the analysis is running, the message states “Analysis in progress, this can take up to 30 minutes”. Could that mean it’s a timeout issue instead?
Autoscan can use up to 7 GB of memory. Since you were able to complete the analysis locally with less than that, it’s strange. Autoscan runs with -XX:+ExitOnOutOfMemoryError -Xmx7g, so I wonder whether dropping -XX:MaxPermSize=1024m -XX:ReservedCodeCacheSize=256m would reproduce the OOM locally. Autoscan runs on OpenJDK 11; which JDK are you using?
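For an apples-to-apples comparison, the scanner’s JVM options can be set locally through the SONAR_SCANNER_OPTS environment variable that the sonar-scanner CLI reads, using the flags quoted above. (Note that -XX:MaxPermSize has no effect on JDK 8 and later anyway, since PermGen was removed, so dropping it changes nothing on a modern JDK.) A minimal sketch:

```shell
# Run the local scan with the same JVM flags Autoscan uses;
# sonar-scanner picks up extra JVM options from SONAR_SCANNER_OPTS.
export SONAR_SCANNER_OPTS="-Xmx7g -XX:+ExitOnOutOfMemoryError"
echo "$SONAR_SCANNER_OPTS"
```

With -XX:+ExitOnOutOfMemoryError set, a local run that does run out of memory will die immediately instead of thrashing, which makes the comparison with Autoscan’s behavior clearer.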
The “up to 30 minutes” message is not really accurate; it’s more of a general indication. I checked the logs for that analysis ID, and the cause of the failure really is out-of-memory, in our security analyzer. If you disable the rules from the rule repository named “Security SonarAnalyzer”, that should reduce the memory used. (We’re aware of the memory issue with this analyzer and are working on improving it.)
And it was executed successfully (just incredibly slowly):
INFO: More about the report processing at https://sonarcloud.io/api/ce/task?id=AXQvK_f_xsYgwSG47qe5
INFO: Analysis total time: 1:20:31.399 s
INFO: ------------------------------------------------------------------------
INFO: EXECUTION SUCCESS
INFO: ------------------------------------------------------------------------
INFO: Total time: 1:20:38.077s
INFO: Final Memory: 72M/250M
INFO: ------------------------------------------------------------------------
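The task URL printed in that log can also be queried to confirm the final status of the background task. A minimal sketch, assuming the api/ce/task endpoint returns JSON with a top-level "task" object carrying a "status" field (here parsed from a sample response rather than a live call, which would need a token for a private project):

```shell
# Sample of the JSON shape api/ce/task is assumed to return for this task.
sample='{"task":{"id":"AXQvK_f_xsYgwSG47qe5","status":"SUCCESS"}}'

# Extract the status field with a simple sed capture.
status=$(echo "$sample" | sed -n 's/.*"status":"\([A-Z]*\)".*/\1/p')
echo "$status"   # SUCCESS for the run above
```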
I don’t know how it’s possible that with the same JDK and JVM args the analysis runs out of memory on Autoscan (after about 25 minutes) but completes successfully on your PC (80 minutes). I will reach out to coworkers with more knowledge of JVM internals, but this will take some time due to our holiday schedules. I’ll come back to this when I can.
In the meantime, the only option I see is to disable the rules from the rule repository named “Security SonarAnalyzer”.
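Disabling those rules can be done in the SonarCloud UI under Quality Profiles, or scripted against the Web API’s api/qualityprofiles/deactivate_rule endpoint. A hypothetical sketch: the profile key and rule key below are placeholders, SONAR_TOKEN is assumed to hold a user token, and the script only prints the command so it can be reviewed before running:

```shell
# Placeholders, not real keys: substitute your own quality-profile key and
# the key of each "Security SonarAnalyzer" rule you want to disable.
PROFILE_KEY="AXexampleProfileKey"
RULE_KEY="javasecurity:S3649"

# Build the deactivation call; echoed rather than executed so the
# placeholders can be reviewed and filled in first.
cmd="curl -s -u \$SONAR_TOKEN: -X POST https://sonarcloud.io/api/qualityprofiles/deactivate_rule -d key=$PROFILE_KEY -d rule=$RULE_KEY"
echo "$cmd"
```

Repeating the call for each rule key in the repository would disable the whole analyzer’s rule set for that profile.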