Background task failed after sonar-scanner analysis

Our project analysis took 21 hours to complete due to a huge codebase, and the following appeared in the log:

INFO: Analysis report generated in 12644ms, dir size=3.5 GB
INFO: Analysis report compressed in 89957ms, zip size=619.5 MB
INFO: Analysis report uploaded in 19600ms
INFO: ANALYSIS SUCCESSFUL, you can browse http://127.0.0.1:9000/dashboard?id=
INFO: Note that you will be able to access the updated dashboard once the server
INFO: More about the report processing at http://127.0.0.1:9000/api/ce/task?i
INFO: Analysis total time: 21:46:06.827 s
INFO: ------------------------------------------------------------------------
INFO: EXECUTION SUCCESS
INFO: ------------------------------------------------------------------------
INFO: Total time: 21:46:07.577s
INFO: Final Memory: 7M/88M

After the analysis completed, the background task failed with the error below:

java.lang.IllegalStateException: Fail to select data of CE task AX4pBNt1dxBRvUfJjYnO
	at org.sonar.db.ce.CeTaskInputDao.selectData(CeTaskInputDao.java:74)
	at org.sonar.ce.task.projectanalysis.step.ExtractReportStep.execute(ExtractReportStep.java:66)
	at org.sonar.ce.task.step.ComputationStepExecutor.executeStep(ComputationStepExecutor.java:81)
	at org.sonar.ce.task.step.ComputationStepExecutor.executeSteps(ComputationStepExecutor.java:72)
	at org.sonar.ce.task.step.ComputationStepExecutor.execute(ComputationStepExecutor.java:59)
	at org.sonar.ce.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(ReportTaskProcessor.java:81)
	at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.executeTask(CeWorkerImpl.java:212)
	at org.sonar.ce.taskprocessor.CeWorkerImpl$ExecuteTask.run(CeWorkerImpl.java:194)
	at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(CeWorkerImpl.java:160)
	at org.sonar.ce.taskprocessor.CeWorkerImpl$TrackRunningState.get(CeWorkerImpl.java:135)
	at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:87)
	at org.sonar.ce.taskprocessor.CeWorkerImpl.call(CeWorkerImpl.java:53)
	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.postgresql.util.PSQLException: ERROR: invalid memory alloc request size 1238920861
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2553)
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2285)
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:323)
	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:481)
	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:401)
	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:164)
	at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:114)
	at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122)
	at org.apache.commons.dbcp2.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:122)
	at org.sonar.db.ce.CeTaskInputDao.selectData(CeTaskInputDao.java:67)
	... 20 more
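The `Caused by` line is the key part of the trace: PostgreSQL refuses any single memory allocation larger than its hard per-allocation cap (`MaxAllocSize`, `0x3fffffff` bytes, just under 1 GiB), and the requested 1,238,920,861 bytes exceeds it. Notably, the request is almost exactly twice the 619.5 MB zip size from the log, which is consistent with the `bytea` value being rendered in hex form (two characters per stored byte) when read back. A quick sketch of the arithmetic, using the numbers from the error and log above:

```shell
#!/bin/sh
# PostgreSQL rejects any single allocation larger than MaxAllocSize,
# which is 0x3fffffff bytes -- just under 1 GiB.
MAX_ALLOC=1073741823
REQUESTED=1238920861          # from the PSQLException above
echo "over the cap by $(( (REQUESTED - MAX_ALLOC) / 1024 / 1024 )) MiB"
# The request is ~2x the 619.5 MB report zip: reading a bytea in hex
# form needs two characters per stored byte.
echo "half the request is $(( REQUESTED / 2 )) bytes (~$(( REQUESTED / 2 / 1000000 )) MB)"
```

So the failure happens while the Compute Engine reads the stored report back out of the database, not while the scanner uploads it.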

My instance details:

  • SonarQube version - 9.0.0.45539
  • Scanner version - 4.6.2.2472
  • DB - PostgreSQL 12.9
  • Java - 11.0.11

Hey there.

Long story short, your analysis report is about 100 MB too big (with a PostgreSQL database it can be, at most, 512 MB).

I would refer you to a post here where advice is given on shrinking the analysis report:

Another option would be to split the scan into several projects, if there's a logical way of splitting the sources (a monorepo, for example, with many different applications).
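For the splitting route, here is a minimal sketch using sonar-scanner's standard analysis properties. The `app-a`/`app-b` monorepo layout and the `myorg-` project keys are hypothetical; `echo` is used so you can preview the commands before running them for real:

```shell
#!/bin/sh
# Hypothetical monorepo layout: app-a/ and app-b/ at the repository root.
# Each application becomes its own SonarQube project, so each analysis
# report stays well under the 512 MB ceiling. Drop the `echo` to run it.
for app in app-a app-b; do
  echo sonar-scanner \
    -Dsonar.projectKey="myorg-${app}" \
    -Dsonar.projectBaseDir="${app}" \
    -Dsonar.sources=. \
    -Dsonar.host.url=http://127.0.0.1:9000
done
```

The trade-off is the one noted below: results land in separate projects rather than a single dashboard.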

Yeah, I have tried splitting the scan into several sub-modules. It works, but I was looking for results in a single scan.
Also, do you mean the report-upload issue is due to a PostgreSQL limitation rather than SonarQube?

This is a Postgres limitation and, as far as I'm aware, it cannot be overridden by any DB-level setting.
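If you want to see how close a stored report is to that limit, one option is to ask Postgres directly. This is only a sketch: the `ce_task_input` table and its `data` bytea column are inferred from the `CeTaskInputDao` class in the stack trace and may differ across SonarQube versions, so verify against your schema first.

```shell
#!/bin/sh
# Assumption: the compressed analysis report lives in ce_task_input.data
# (table/column inferred from CeTaskInputDao in the trace; verify first).
SQL='SELECT task_uuid,
            pg_size_pretty(octet_length(data)::bigint) AS report_size
       FROM ce_task_input
      ORDER BY octet_length(data) DESC
      LIMIT 5;'
# Print the query; pipe it into `psql -d <your-sonarqube-db>` to run it.
echo "$SQL"
```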

OK. Since this is a Postgres limitation, should we fall back to the H2 database? Does H2 also have a similar limitation?

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.