OutOfMemoryError since moving to SonarQube Community Build 25.12.0.117093

  • SonarQube Community Build 25.12.0.117093

  • SonarQube deployed by zip

I have been running the analysis of our projects every night for several years. I recently upgraded from the November release to the December release. Since then, the analysis of our largest project fails: it aborts with an OutOfMemoryError.

[INFO] 99% analyzed
[INFO] 99% analyzed
[INFO] 99% analyzed
[INFO] 99% analyzed
[ERROR] [stderr] Exception in thread "Report about progress of Java AST analyzer" java.lang.OutOfMemoryError: Java heap space
[ERROR] [stderr] 	at org.sonar.java.ProgressMonitor.run(ProgressMonitor.java:77)
[ERROR] [stderr] 	at java.base/java.lang.Thread.runWith(Unknown Source)
[ERROR] [stderr] 	at java.base/java.lang.Thread.run(Unknown Source)
[INFO] Slowest analyzed files (batch mode enabled):
    jet.phoenix.base/src/main/java/jet/phoenix/ui/task/invoice/InputInvoiceNut3.java (28699ms, 309828B)
[INFO] Did not optimize analysis for any files, performed a full analysis for all 8082 files.
[ERROR] Error during SonarScanner Engine execution
java.lang.OutOfMemoryError: Java heap space
	at org.sonar.java.model.location.InternalPosition.atOffset(InternalPosition.java:40)
	at org.sonar.java.model.InternalSyntaxToken.<init>(InternalSyntaxToken.java:47)
	at org.sonar.java.model.JParser.createSyntaxToken(JParser.java:495)
	at org.sonar.java.model.JParser.firstTokenIn(JParser.java:454)

I have tried increasing the memory using the arguments in the configuration file sonar.properties.

sonar.ce.javaOpts=-Xmx4096m -Xms512m -XX:+HeapDumpOnOutOfMemoryError

This has not changed anything.

I have also tried adding the sonar.scanner.excludeHiddenFiles=true property, but that has not helped either.

There was no significant change to our codebase at the same time that could explain this OOM.

I am not quite sure where to go from here.

Hi,

That tunes the memory settings on the server side. You need to tune the settings on the analyzer side. You didn’t mention which scanner you use, and the procedure varies slightly. Here’s help for SonarScanner for Maven and for SonarScanner for Gradle.
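For illustration, a minimal sketch of the distinction (the exact variable names depend on your scanner and version; the linked docs are authoritative):

```shell
# sonar.ce.javaOpts in sonar.properties only sizes the Compute Engine
# on the SonarQube *server*; it has no effect on the scanner that
# parses your code. The scanner-side JVM is sized separately, e.g.:
export MAVEN_OPTS="-Xmx4G"                # Maven JVM (SonarScanner for Maven)
export SONAR_SCANNER_JAVA_OPTS="-Xmx4G"   # scanner engine JVM, where supported
mvn sonar:sonar
```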

 
HTH,
Ann

Hi Ann,

I am using the SonarScanner for Maven piloted from Jenkins.

It is unclear to me where the export of the environment variable should be done: export SONAR_SCANNER_JAVA_OPTS="-Xmx512m"

I have tried adding that to a shell script executed before the build, but it has had no effect. Presumably it is not running in the same shell process.

I have also tried adding this to the Maven command line that is executed: -Dsonar.scanner.memory=8192 -Dsonar.java.opt="-Xmx4G -Xms2G -XX:+HeapDumpOnOutOfMemoryError"

Again this does not seem to have much effect.
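If the problem really is that the export happens in a different shell process, then presumably the export and the mvn call need to live in the same "Execute shell" build step, something like this (a sketch; whether SONAR_SCANNER_JAVA_OPTS is honoured depends on the scanner version):

```shell
#!/bin/sh
# Each Jenkins freestyle build step runs in its own shell process, so an
# export made in an earlier step is lost. Export and invoke together:
export MAVEN_OPTS="-Xmx4G -Xms1G"
export SONAR_SCANNER_JAVA_OPTS="-Xmx4G"
mvn -B sonar:sonar
```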

Hi,

Can you share your pipeline?

 
Thx,
Ann

Hi Ann,

This is an old freestyle project that has evolved over the years. So I do not think I can show a pipeline. I can share a screenshot of the “Invoke top-level Maven targets” configuration if that helps?

Hi,

Yes, please refresh my memory on the inputs there. :sweat_smile:

 
Thx,
Ann

Hi Ann,

This seems to match what is described in the documentation you sent me: Jenkins configuration

Not sure where to configure the memory settings.

Thanks

Hi,

Thanks for the screenshot. Use the ‘JVM Options’ field.

 
Hope that works,
Ann

Hi Ann,

I added -Xmx4G -Xms1G in JVM Options.

I now see this in the logs: [INFO] MAVEN_OPTS= -Xmx4G -Xms1G. So the argument seems to be used at runtime.

Unfortunately I still get an out-of-memory error at exactly the same point:

After the “[INFO] 91% analyzed” log, things slow down enormously, and after a few “99% analyzed” lines I eventually get the out-of-memory error.

When I view the system processes, there are many processes owned by the jenkins user running close to the 4 GB limit, but even if I increase the limit to 8 GB they do not use more. There are also lots of processes owned by the sonarqube user, but they hover around 800 MB, 500 MB, and 130 MB, nowhere near the 4 GB limit. All these processes have different limits; I have no idea which one is exceeding its limit, nor how to change the limit for that particular process.

None of the memory settings I have been changing have had any effect at all on the symptoms.

Hello Daniel,

This is similar to something I faced, so it’s worth a try.

The dataflow bug detection rules consume too much memory, most probably due to a bug. To rule this out, you can deactivate these rules in your profile and retry the analysis. You can identify them by searching for ‘dataflow bug detection’ in the ‘Repository’ section of the rules. The rule key prefix is ‘javabugs’.


Hello Manish,

Thanks for your reply. Unfortunately I do not have any ‘dataflow bug detection’ rules. I don’t think the javabugs rules are available in the Community Build.

I am still trying to figure out which process is running out of memory so I can verify that the memory arguments are being passed on to that process.

Kind regards,

Daniel

Hi Daniel,

I’m not sure why I didn’t ask this before - new year, new ideas - can you enable debug logging (add a -X Maven option) and bump the memory allocation some more?

Either that will give us tons of logging and work, or it will give us tons of logging and a better idea of the failure point.

 
Ann

Hi Ann,

I hope you had a good break.

I added the -X argument; I am still not sure which process I need to give more memory to.

This is what happens just before the error:

[DEBUG] 'jet.phoenix.base/src/main/java/jet/phoenix/ui/task/invoice/InputInvoiceNut3.java' generated metadata with charset 'UTF-8'
[INFO] 99% analyzed
[INFO] 99% analyzed
[INFO] 99% analyzed
[INFO] 99% analyzed
[INFO] Did not optimize analysis for any files, performed a full analysis for all 8084 files.
[DEBUG] Cleanup org.eclipse.jgit.util.FS$FileStoreAttributes$$Lambda/0x00007f325035c658@382c90c2 during JVM shutdown
[ERROR] Error during SonarScanner Engine execution
java.lang.OutOfMemoryError: Java heap space
	at org.eclipse.jdt.internal.compiler.ast.ReferenceExpression.copy(ReferenceExpression.java:125)
	at org.eclipse.jdt.internal.compiler.ast.ReferenceExpression.cachedResolvedCopy(ReferenceExpression.java:974)
	at org.eclipse.jdt.internal.compiler.ast.ReferenceExpression.isCompatibleWith(ReferenceExpression.java:1257)
	at org.eclipse.jdt.internal.compiler.lookup.PolyTypeBinding.isCompatibleWith(PolyTypeBinding.java:42)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.parameterCompatibilityLevel(Scope.java:5060)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.parameterCompatibilityLevel(Scope.java:5019)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.computeCompatibleMethod(Scope.java:864)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.computeCompatibleMethod(Scope.java:804)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.getConstructor0(Scope.java:2473)
	at org.eclipse.jdt.internal.compiler.lookup.Scope.getConstructor(Scope.java:2436)
	at org.eclipse.jdt.internal.compiler.ast.Statement.findConstructorBinding(Statement.java:555)
	at org.eclipse.jdt.internal.compiler.ast.AllocationExpression.resolveType(AllocationExpression.java:504)
	at org.eclipse.jdt.internal.compiler.ast.LocalDeclaration.resolve(LocalDeclaration.java:402)
	at org.eclipse.jdt.internal.compiler.ast.LocalDeclaration.resolve(LocalDeclaration.java:258)
	at org.eclipse.jdt.internal.compiler.ast.Statement.resolveWithBindings(Statement.java:503)
	at org.eclipse.jdt.internal.compiler.ast.ASTNode.resolveStatements(ASTNode.java:692)
	at org.eclipse.jdt.internal.compiler.ast.Block.resolve(Block.java:143)
	at org.eclipse.jdt.internal.compiler.ast.Statement.resolveWithBindings(Statement.java:503)
	at org.eclipse.jdt.internal.compiler.ast.ForStatement.resolve(ForStatement.java:445)
	at org.eclipse.jdt.internal.compiler.ast.Statement.resolveWithBindings(Statement.java:503)
	at org.eclipse.jdt.internal.compiler.ast.ASTNode.resolveStatements(ASTNode.java:692)
	at org.eclipse.jdt.internal.compiler.ast.Block.resolveUsing(Block.java:154)
	at org.eclipse.jdt.internal.compiler.ast.TryStatement.resolve(TryStatement.java:1126)
	at org.eclipse.jdt.internal.compiler.ast.Statement.resolveWithBindings(Statement.java:503)
	at org.eclipse.jdt.internal.compiler.ast.ASTNode.resolveStatements(ASTNode.java:692)
	at org.eclipse.jdt.internal.compiler.ast.AbstractMethodDeclaration.resolveStatements(AbstractMethodDeclaration.java:734)
	at org.eclipse.jdt.internal.compiler.ast.MethodDeclaration.resolveStatements(MethodDeclaration.java:386)
	at org.eclipse.jdt.internal.compiler.ast.AbstractMethodDeclaration.resolve(AbstractMethodDeclaration.java:633)
	at org.eclipse.jdt.internal.compiler.ast.TypeDeclaration.resolve(TypeDeclaration.java:1446)
	at org.eclipse.jdt.internal.compiler.ast.TypeDeclaration.resolve(TypeDeclaration.java:1575)
	at org.eclipse.jdt.internal.compiler.ast.CompilationUnitDeclaration.resolve(CompilationUnitDeclaration.java:661)
	at org.eclipse.jdt.internal.compiler.Compiler.process(Compiler.java:811)

It is interesting that Sonar says it has finished the analysis of all the files, and only then hits the OOM error. 8084 is about right for the number of files analysed, so I believe it is quite possible that it has actually reached the end. What happens after the analysis that could cause the problem? Do you need the full log file (3 MB)?

Kind regards,

Daniel

Hi Daniel,

This is very interesting. It looks like it fails during cleanup. It’s quite possible the full log will be needed, but I’m going to flag this for the team and let them request it if they need it.

 
Ann

I have just updated to the latest version: Community Build v26.1.0.118079

Unfortunately I have exactly the same error. It was worth a try.

Kind regards,

Daniel


Hi Daniel,

Thanks for trying. This is in queue for the team. We had our annual company off-site last week, so they’re probably a little backed up.

 
Ann

I have just updated to the latest version: Community Build v26.2.0.119303

Unfortunately I have exactly the same error. Any news on when the team may be able to look into this?

Kind regards,

Daniel

Hi Daniel,

This is still queued. Sorry I can’t give you an ETA.

 
:frowning:
Ann

Hello Daniel,

I’ve been looking into this, and I’d like to dig deeper into why the analysis is failing at the final stage. To help me move forward with the investigation, would you mind sharing a few more details about your setup?

It would be very helpful if you could provide:

  • The full analysis log: Even if the file is large, the complete output could be useful for pinpointing the issue.

  • The command and environment: Knowing the exact command used to run the analysis, along with any environment variables (like MAVEN_OPTS or SONAR_SCANNER_OPTS), would really help me understand the context.

Thank you for your help and your patience!

Best regards,

Romain

Hello Romain,

Here is the latest log file.

#751.txt (3.0 MB)

I am calling Sonar from my Jenkins installation. Here is the configuration of the build:

Other than the JDBC configuration, the only option I have changed in the SonarQube configuration is:

All these changes to the memory configuration have had no effect. It always runs out of memory after 99% of the analysis, though I do usually get the message:

[INFO] Did not optimize analysis for any files, performed a full analysis for all 8084 files.

That seems to indicate the analysis has finished.

The progress does slow down massively towards the end, which does suggest that it is struggling with memory and the garbage collector is working hard. But I have not been able to diagnose that any further, as I do not know where to look. I can see many Java processes running on the machine; none of them reach the 4 GB limit I set in the configuration.
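If it would help, I could try to confirm the garbage-collector theory with something like this (a sketch; <PID> stands in for the scanner JVM’s process id, e.g. found via jps -l, and jstat is assumed to be available):

```shell
# Sample GC counters once per second. In -gcutil output, column O is
# old-generation occupancy (%) and FGC the full-GC count: O pinned
# near 100 with FGC climbing while the log sits at "99% analyzed"
# would mean the heap really is exhausted.
jstat -gcutil <PID> 1000 | awk 'NR>1 {print "old-gen:", $4"%", "full GCs:", $9}'
```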

I have been running the analysis on this code base for many years. The number of files analysed has shrunk from about 12000 to 8000. The number of lines of code has shrunk from 1.5 million to just over a million.

Not sure what else I can provide. If helpful I can be available over Teams (or other meeting apps).

Kind regards,

Daniel