SonarQube + PostgreSQL: invalid memory alloc request size

I’ve scanned a very large Java application with sonar-scanner (Version and the scan finished correctly, but the background task on SonarQube (Version shows this error:

Error Details
java.lang.IllegalStateException: Fail to select data of CE task AW5caAgKPYjamDYGVnEh
at org.sonar.db.ce.CeTaskInputDao.selectData(
at org.sonar.ce.task.projectanalysis.step.ExtractReportStep.execute(
at org.sonar.ce.task.step.ComputationStepExecutor.executeStep(
at org.sonar.ce.task.step.ComputationStepExecutor.executeSteps(
at org.sonar.ce.task.step.ComputationStepExecutor.execute(
at org.sonar.ce.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(
at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.executeTask(
at org.sonar.ce.taskprocessor.CeWorkerImpl$
at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(
at org.sonar.ce.taskprocessor.CeWorkerImpl$TrackRunningState.get(
at java.base/
at java.base/java.util.concurrent.Executors$
at java.base/
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(
at java.base/java.util.concurrent.ThreadPoolExecutor$
at java.base/
Caused by: org.postgresql.util.PSQLException: ERRORE: invalid memory alloc request size 1402166673
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(
at org.postgresql.core.v3.QueryExecutorImpl.processResults(
at org.postgresql.core.v3.QueryExecutorImpl.execute(
at org.postgresql.jdbc.PgStatement.executeInternal(
at org.postgresql.jdbc.PgStatement.execute(
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(
at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(
at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(
at org.sonar.db.ce.CeTaskInputDao.selectData(
… 18 more

I’ve installed SonarQube on CentOS 8 with PostgreSQL 10.6.

Can someone help me?


The problem exists because the scan report is bigger than 1 GB, and PostgreSQL limits the logical size of any value of a TOAST-able data type to 1 GB.
I can’t understand why SonarQube stores the entire report in the database.
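As a quick way to confirm this, you could measure how large the stored report actually is. This is only a sketch: the table and column names (`ce_task_input` / `input_data`) are assumptions inferred from the `CeTaskInputDao` in the stack trace, not verified against your schema.

```sql
-- Hypothetical check: list stored CE reports by size, largest first.
-- Table/column names are assumptions inferred from the stack trace.
SELECT task_uuid,
       pg_size_pretty(pg_column_size(input_data)::bigint) AS report_size
FROM ce_task_input
ORDER BY pg_column_size(input_data) DESC;
```

If the largest row is close to the 1 GB TOAST limit, that would match the `invalid memory alloc request size` error above.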

It might be better to use the Large Object type rather than bytea.

Hello Carlo,

You can challenge the fact that the report is stored, and/or that the supported size is limited to 1 GB on PostgreSQL.

However, in your position, I would first ask whether it benefits the performance of the analysis (scanner run, network transfer, asynchronous processing, …) to produce a 1 GB report in the first place.

Sources are compressed in the report, and analysis data is stored in a size-optimized binary format (which is, roughly, the content of the report).

I’m wondering whether the report may be bundling files it shouldn’t.

You could directly reduce the size of the report by excluding files that are not actually meaningful from the analysis, and by double-checking the configuration of source directories (see this doc).
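As a sketch of that advice, exclusions can be declared in `sonar-project.properties`. The property names (`sonar.sources`, `sonar.exclusions`, `sonar.coverage.exclusions`) are standard SonarQube analysis parameters; the patterns below are illustrative examples only, not taken from this project:

```properties
# Illustrative sonar-project.properties fragment: narrow the analysis
# scope so the generated report stays small. Patterns are examples only.
sonar.sources=src/main
sonar.exclusions=**/generated/**,**/*.min.js,**/vendor/**
sonar.coverage.exclusions=**/test/**
```

Generated sources, minified assets, and vendored dependencies are common causes of oversized reports and rarely produce meaningful issues.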


Hi All,

Can you let me know how to fix this issue on a CentOS 7.8-based PostgreSQL and SonarQube installation? We can’t exclude any files, so the report size is larger than 1 GB. Please help with increasing PostgreSQL’s size limitation.

I excluded some of the XML files and my report size is now 2 GB. I am able to see the analysis results, but I still see a failed task being created.
Here is the stack trace:

heapsize.txt (2.9 KB)