Cannot scan a large project after upgrading from 6.7 to 8.9 LTS: GC overhead limit exceeded

Must-share information (formatted with Markdown):

  • which versions are you using (SonarQube, Scanner, Plugin, and any relevant extension)
    8.9 LTS (previously on 6.7), using SonarQube Scanner 4.2.0
  • what are you trying to achieve
    Scan a large Ant project that worked with 6.7
  • what have you tried so far to achieve this

We are trying to scan a large Ant project using the SonarScanner plugin (v4.2.0) in Jenkins with SQ 8.9, but we ran into a runtime memory issue.
We have a sonar properties file specifying the configuration to break the large project down into modules.
With the old SQ, each module was analyzed separately and, after each module, the number of files indexed was output; with SQ 8.9 it tries to analyze the whole project and then fails due to heap size:
“java.lang.OutOfMemoryError: GC overhead limit exceeded”

Please see the attachment for the scanner output.

Also, I need to mention that I had to remove the sonar.branch property since it’s not supported anymore, but that doesn’t seem to be the issue here.

This is a snippet of the sonar.properties file:

sonar.projectName=trt-all
sonar.projectVersion=build-354
sonar.language=java
sonar.sources=
sonar.modules=batchfeeder,restricted,
sonar.exclusions=…

configuration for each module

batchfeeder.sonar.projectName=batchfeeder
batchfeeder.sonar.sources=/…
batchfeeder.sonar.tests=/…
batchfeeder.sonar.junit.reportsPath=/work/…
batchfeeder.sonar.cobertura.reportPath=/work…

configuration for restricted

batchsystem2.sonar.projectName=restricted
batchsystem2.sonar.sources=/work/production/…
…
sonarlarge.txt (4.8 KB)

By increasing the memory size to 8G, the large project was successfully analyzed and I can see the hyperlink for the uploaded project, but later the quality gate failed, and the link took me to an empty project in SonarQube.
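For reference, the heap was bumped via the scanner JVM options; with the standalone SonarScanner CLI that is done through the SONAR_SCANNER_OPTS environment variable in the job, roughly like this (8g is just the value tried here):

# 8g is an example value; use whatever the build agent can afford
export SONAR_SCANNER_OPTS="-Xmx8g"
sonar-scanner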

Please see the output from Jenkins:


**00:18:24** INFO: Analysis total time: 24:26.263 s
**00:18:24** INFO: ------------------------------------------------------------------------
**00:18:24** INFO: EXECUTION SUCCESS
**00:18:24** INFO: ------------------------------------------------------------------------
**00:18:24** INFO: Total time: 24:27.627s
**00:18:24** INFO: Final Memory: 79M/7778M
**00:18:24** INFO: ------------------------------------------------------------------------

**00:18:36** [EnvInject] - Injecting environment variables from a build step. 
**00:18:36** [EnvInject] - Injecting as environment variables the properties file path 'sonar-workaround.properties' 
**00:18:36** [EnvInject] - Variables injected successfully. 
**00:18:36** New run name is '#8-all' 
**00:18:36** org.quality.gates.jenkins.plugin.QGException: Expected status 200, got: 400. Response: {"errors":[{"msg":"The \u0027component\u0027 parameter is missing"}]} 
**00:18:36** at org.quality.gates.sonar.api.SonarHttpRequester.executeGetRequest(SonarHttpRequester.java:124) 
**00:18:36** at org.quality.gates.sonar.api.SonarHttpRequester.getAPITaskInfo(SonarHttpRequester.java:151)

Hi,

FYI, I’ve edited your second post to put the log snippet inline.

Congrats on your upgrade & on working through your initial challenges with post-upgrade analysis. Between 6.7 and now, we dropped the concept of modules, which likely explains the change in behavior you saw.

Regarding your error, it looks like it’s coming from a non-SonarSource Jenkins plugin (org.quality.gates.jenkins.plugin), so it’s probably going to be difficult for us to diagnose here in this community. You might start by making sure it’s still compatible with modern versions of SonarQube. Beyond that, we now have native integration for reflecting your Quality Gate status in your Jenkins pipeline, so those docs might be useful.
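For illustration only, a minimal pipeline sketch of that native integration, assuming the SonarQube Scanner for Jenkins plugin is installed and a webhook from SonarQube back to Jenkins is configured ('MySonarQube' is a placeholder for the server name set up under Manage Jenkins):

pipeline {
  agent any
  stages {
    stage('SonarQube analysis') {
      steps {
        // Injects the server URL and token configured for 'MySonarQube'
        withSonarQubeEnv('MySonarQube') {
          sh 'sonar-scanner'
        }
      }
    }
    stage('Quality Gate') {
      steps {
        // Waits for the webhook callback and fails the build if the gate fails
        timeout(time: 10, unit: 'MINUTES') {
          waitForQualityGate abortPipeline: true
        }
      }
    }
  }
}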

 
HTH,
Ann

Hi Ann,
Thanks for your reply. As I mentioned, when I follow the SQ link to the project, the analysis is shown as failed:
{"errorMessage":"Fail to select data of CE task AX8bW65-kPhQGRLHeDHS","hasScannerContext":

Does this have to do with the Quality Gates plugin?
Also, when you mentioned the break-by-module feature has been removed, does that mean I need to change my sonar.properties file, or is it okay as it is?

Hi,

Yes, your error probably has to do with the Quality Gates plugin. I’d try without it and look at the native functionality instead.

Regarding modules, it sounds like you’ve worked through that & don’t need to do anything else.

 
Ann

Thank you! I just found out that the plugin is not supported in 8.9, and someone suggested
adding sonar.qualitygate.wait=true. Is this property supported in 8.9?

Hi,

Yes, the property is supported. See also the docs I linked to earlier.
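For example, it can go in the analysis properties file (or be passed as -D arguments to the scanner); sonar.qualitygate.timeout is in seconds, is optional, and the value below is just an illustration:

sonar.qualitygate.wait=true
sonar.qualitygate.timeout=300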

 
Ann

Hi Ann,
I did remove the Quality Gates plugin from the job and added the property, but the job failed and no hyperlink to the SQ project is shown. Please see the attached file for the logs.
qalityfail2.txt (1.4 KB)

I did another run without the quality gate; this time the hyperlink to the SonarQube project was displayed in the job console output, which took me to the analysis results, but the project is empty and a link to a failed background task is displayed. The error message for the failed CE task is attached. It seems to be a memory issue/limitation with the PostgreSQL database. The size of the compressed report in SQ 6.7 was 18 MB, but in SQ 8.9 it is 634 MB, so the handling of modules in 8.9 seems totally different. I read that the PostgreSQL size limitation for storing objects is about 500 MB.
So the question is why the project is not analyzed module by module like before. Is there any change I need to make in the sonar.properties file (as described above) to get the same result as before?

qalityfail3.txt (3.4 KB)

Hi,

From my reading, the max supported size in Postgres is 1 GB, so I wonder if this is a setting you can tune on the Postgres side.

I think that starts with understanding exactly what changed. It’s not clear to me whether your previous analysis was analyzing each module as a separate project, or all as modules of the same project. Also, given the size difference, it seems quite possible that files that were omitted from analysis before are now included…? Did you have exclusions set server-side that need to be replicated now in this post-module world?

 
Ann

Hi Ann,
My sonar.properties file hasn’t changed at all; the exclusion set is still the same.
Does SonarQube store the uncompressed version in the database?
Also, I need to know more about the server side: when the report gets uploaded, does that mean the project is already stored in the database? The issue I am seeing seems to be related to querying the large record. Could you give me some info about the schema, i.e. where the report gets stored?

Hi,

Based on this answer in an earlier thread:

Sources are compressed in the report and analysis data is stored in a size-optimized binary format (that’s roughly the content of the report).

To be clear, before upgrade, the same number of files were being analyzed? And the only difference between before & after is whether/how the modules are / are not recognized by SonarQube?

 
Ann

Hi Ann,
Yes, the same number of files is being analyzed before and after, and I have the same properties file to specify the modules.
I excluded some files and increased the scanner JVM heap to 10 GB (-Xmx10G -XX:MaxMetaspaceSize=512m). The report size shrank to 78 MB and I am able to see the analysis results, but there is a failed task with the same error ("Caused by: org.postgresql.util.PSQLException: Ran out of memory retrieving query results."). So this definitely does not seem to be a file-size issue.
These are my settings on the server side:
sonar.web.javaOpts=-Xmx768m -Xms128m -XX:MaxMetaspaceSize=160m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true -server

sonar.ce.javaOpts=-Xmx512m -Xms128m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true -server

I am going to increase these settings and see; any recommendations?

Hi,

I just checked the default setting for Enterprise Edition (it’s higher since we assume a larger / busier instance with EE) and it’s

#sonar.ce.javaOpts=-Xmx2G -Xms128m -XX:+HeapDumpOnOutOfMemoryError

Since you’re running into memory issues, I would try that if your hardware will support it.
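In concrete terms that would mean raising sonar.ce.javaOpts in the server’s conf/sonar.properties, keeping your existing flags, to something like:

sonar.ce.javaOpts=-Xmx2G -Xms128m -XX:+HeapDumpOnOutOfMemoryError -Djava.net.preferIPv4Stack=true -server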

 
Ann

Thanks, we are running Community Edition. Finally, by increasing the memory I was able to get the 2 GB report processed with no error.
sonar.ce.javaOpts=-Xmx2G -Xms1024m

Hi,

Yeah, but we can all still learn from (and use!!!) the EE recommendations. :smiley:

Sorry, I got nothin’

 
Ann

Since it’s working now, we don’t need any change to the DB settings.
Thanks for your help.