SonarQube + PostgreSQL: invalid memory alloc request size

Hi,
I’ve scanned a very large Java application with sonar-scanner (version 4.2.0.1873-linux) and the scan finished correctly, but the background task on SonarQube (version 8.0.0.29455) shows me this error:

Error Details
java.lang.IllegalStateException: Fail to select data of CE task AW5caAgKPYjamDYGVnEh
at org.sonar.db.ce.CeTaskInputDao.selectData(CeTaskInputDao.java:74)
at org.sonar.ce.task.projectanalysis.step.ExtractReportStep.execute(ExtractReportStep.java:65)
at org.sonar.ce.task.step.ComputationStepExecutor.executeStep(ComputationStepExecutor.java:81)
at org.sonar.ce.task.step.ComputationStepExecutor.executeSteps(ComputationStepExecutor.java:72)
at org.sonar.ce.task.step.ComputationStepExecutor.execute(ComputationStepExecutor.java:59)
at org.sonar.ce.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(ReportTaskProcessor.java:81)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.executeTask(CeWorkerImpl.java:209)
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.run(CeWorkerImpl.java:191)
at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(CeWorkerImpl.java:158)
at org.sonar.ce.taskprocessor.CeWorkerImpl$TrackRunningState.get(CeWorkerImpl.java:133)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:85)
at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:53)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.postgresql.util.PSQLException: ERRORE: invalid memory alloc request size 1402166673
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2440)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2183)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:308)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:143)
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:106)
at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122)
at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122)
at org.sonar.db.ce.CeTaskInputDao.selectData(CeTaskInputDao.java:67)
… 18 more

I’ve installed SonarQube on CentOS 8 with PostgreSQL 10.6.

Can someone help me?

Thanks
Carlo

The problem exists because the scan report is bigger than 1 GB and the field size limit in PostgreSQL is 1 GB (the logical size of any value of a TOAST-able data type is limited to 1 GB, https://www.postgresql.org/docs/10/storage-toast.html).
I can’t understand why SonarQube stores the entire report in the database.

It might be better to use the Large Object type (https://www.postgresql.org/docs/10/largeobjects.html) instead of bytea.
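
For anyone who wants to confirm this on their own instance, the stored report size can be checked directly in PostgreSQL. This is a minimal sketch, assuming the ce_task_input table and input_data column behind the CeTaskInputDao call in the stack trace above (names may vary by SonarQube version, so verify against your schema). pg_column_size reports the stored, possibly compressed, size without triggering the failing decompression:

-- Sketch: list the largest analysis reports stored by SonarQube.
-- Table/column names are assumptions based on CeTaskInputDao; check your schema.
SELECT task_uuid,
       pg_size_pretty(pg_column_size(input_data)::bigint) AS stored_size
FROM ce_task_input
ORDER BY pg_column_size(input_data) DESC
LIMIT 10;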

Hello Carlo,

You can certainly challenge the fact that the report is stored in the database, and/or that the supported size on PostgreSQL is limited to 1 GB.

However, in your position, I would first ask whether producing a 1 GB report is doing the performance of the analysis any good in the first place (scanner run, network transfer, asynchronous processing, …).

Sources are compressed in the report, and analysis data is stored in a size-optimized binary format (that is roughly the content of the report).

I’m wondering whether the report may be bundling files it shouldn’t.

You could directly reduce the size of the report by excluding files that are not actually meaningful from the analysis, and by double-checking the configuration of your source directories (see this doc).
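
For example, exclusions and source directories are typically set in sonar-project.properties; sonar.sources and sonar.exclusions are standard scanner properties, while the project key and patterns below are hypothetical placeholders to adapt to your project:

# sonar-project.properties: illustrative exclusion setup
# (project key and paths are hypothetical placeholders)
sonar.projectKey=my-large-app
# Restrict analysis to the real source directories
sonar.sources=src/main/java
# Exclude generated, vendored, or otherwise non-meaningful files from the report
sonar.exclusions=**/generated/**,**/*.min.js,**/node_modules/**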

Cheers,

Hi All,

Can you let me know how to fix this issue on a CentOS 7.8-based PostgreSQL and SonarQube installation? We can’t exclude any files, so the size of the report is higher than 1 GB. Please help with increasing the PostgreSQL size limit.

I excluded some of the XML files and my report size is now 2 GB. I am able to see the analysis results, but I am still seeing a failed task created.
Here is the stack trace:

heapsize.txt (2.9 KB)
