SonarQube analysis fails with OOM (Java heap space) and “no space left on device” errors


(Yash Brahmani) #1

Must-share information (formatted with Markdown):

  • which versions are you using (SonarQube, Scanner, Plugin, and any relevant extension)
    SonarQube 6.7.1
    Sonar scanner

  • what are you trying to achieve
    I am trying to analyse 4500 Cobol programs

  • what have you tried so far to achieve this
    I have successfully analysed 500 programs, but when I increase the set to 4500 the analysis fails
    with a message saying ‘Killed’

I am exporting SONAR_SCANNER_OPTS="-Xmx4096m"

SonarQube container memory is
How can I resolve this?

(G Ann Campbell) #2


Your situation is unclear to me. What threshold is it that you’re adjusting?


(Yash Brahmani) #3

Hello G Ann,
Hope you are fine.
The situation we have is a Java heap space OutOfMemoryError in SonarQube. We are seeing this error in background tasks.
I want to understand the difference between SONAR_SCANNER_OPTS and the ce, web, and es opts, and how each of them consumes RAM. We have a license for 5 million lines of code.
We host 220 SonarQube instances. We want to provide a good customer experience, and so we need to know how we can manage memory in SonarQube.


(G Ann Campbell) #4

Hi Yash,

I’m going to start by restating what I understand from this thread so far:

  • You’re analyzing Cobol
  • A job that analyzed 500 files (where each file is a program) succeeds
  • When you pulled the larger file set of 4500 from the DB, the background task failed with a message that started with “Killed”

If I’m correct on all points…

  • It would probably be helpful for you to share the full stack trace if what’s below doesn’t get you where you need to be.
  • SONAR_SCANNER_OPTS was a good guess, but it only affects the analysis side and has no impact on background task processing, which takes place on the SonarQube server.

The other “opts” settings you mention do affect the SonarQube server. If the problem is that background task processing is running out of memory, then adjusting the sonar.ce.javaOpts should help. ce here stands for Compute Engine, which handles background task processing.
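To make the split concrete, here is a minimal sketch of where each setting lives; the values shown are placeholders for illustration, not recommendations:

```properties
# Analysis side — set in the scanner's environment, not in this file:
#   export SONAR_SCANNER_OPTS="-Xmx4096m"

# Server side — in the SonarQube server configuration (typically conf/sonar.properties):
sonar.web.javaOpts=-Xmx512m -Xms128m      # web server
sonar.ce.javaOpts=-Xmx2048m -Xms512m      # Compute Engine (background tasks)
sonar.search.javaOpts=-Xmx512m -Xms512m   # Elasticsearch (search)
```

Each of the three server processes gets its own JVM, so its heap is sized independently; only sonar.ce.javaOpts matters for background task processing.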


(Yash Brahmani) #5

Hello G Ann,
Thank you so much for the above explanation.
I would like to clarify a few things here.
We are trying to analyse 4500 Cobol programs via Jenkins.

We resolved the ‘Killed’ issue I could see in the Jenkins console logs. It happened because the user was running the analysis with the wrong Docker image template.

We now have an issue where we are not able to upload a huge report: 2 GB actual size, 724 MB compressed.
We are using postgresql database.

Can you let us know which configuration file and which parameter in the PostgreSQL configuration I need to change so that the report gets uploaded to SonarQube successfully?

After googling a few websites, I found the max_allowed_packet parameter for MySQL, which can resolve this problem there, but I am not aware of the PostgreSQL equivalent.

Thank you


(G Ann Campbell) #6

Hi Yash,

Can you try allocating more memory to the CE and trying again, please? BTW, you’ll make that change to sonar.ce.javaOpts in $SONARQUBE_HOME/conf/


(Yash Brahmani) #7

Hello G Ann,
I tried modifying the sonar.ce.javaOpts parameter to the following:
sonar.ce.javaOpts=-Xmx2048m -Xms1024m -XX:+HeapDumpOnOutOfMemoryError

and restarted SonarQube.
Now, after analysing, I am getting the following error:
java.lang.IllegalStateException: Fail to extract report AWoS_EQegSCCR3S3bWEV from database
at org.sonar.server.computation.task.projectanalysis.step.ExtractReportStep.execute(
at org.sonar.server.computation.task.step.ComputationStepExecutor.executeSteps(
at org.sonar.server.computation.task.step.ComputationStepExecutor.execute(
at org.sonar.server.computation.task.projectanalysis.taskprocessor.ReportTaskProcessor.process(
at org.sonar.ce.taskprocessor.CeWorkerImpl.executeTask(
at org.sonar.ce.taskprocessor.CeWorkerImpl.findAndProcessTask(
at org.sonar.ce.taskprocessor.CeWorkerImpl.withCustomizedThreadName(
at java.util.concurrent.Executors$
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(
at java.util.concurrent.ScheduledThreadPoolExecutor$
at java.util.concurrent.ThreadPoolExecutor.runWorker(
at java.util.concurrent.ThreadPoolExecutor$
Caused by: No space left on device
at Method)
at org.sonar.api.utils.ZipUtils.copy(
at org.sonar.api.utils.ZipUtils.unzipEntry(
at org.sonar.api.utils.ZipUtils.unzip(
at org.sonar.api.utils.ZipUtils.unzip(
at org.sonar.server.computation.task.projectanalysis.step.ExtractReportStep.execute(

I am running SonarQube within a container.
Can you let me know what I need to do next?
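The “No space left on device” in the trace above means the filesystem backing SonarQube’s working directory inside the container is full. A quick hedged check (the /opt/sonarqube default path is an assumption; adjust to your image):

```shell
# Check free space on the filesystem that backs SonarQube's temp dir,
# where the Compute Engine extracts analysis reports.
SONARQUBE_HOME="${SONARQUBE_HOME:-/opt/sonarqube}"   # assumed default location
dir="$SONARQUBE_HOME/temp"
# Fall back to / so the check still works outside the container
[ -d "$dir" ] || dir=/
df -h "$dir"
```

Run this inside the container (e.g. via docker exec); if the mount shows little or no free space, the fix is more disk for the container or its volume, not more heap.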


(G Ann Campbell) #8



It seems that we’re back to a disk space problem: the Compute Engine is failing to extract the report because there is no space left on the device inside the container.


(Yash Brahmani) #9

Hello Ann,
Thanks for your response, but a 2 GB report is something we find unacceptable.
I would like to know the default report size we can suggest our users upload to SonarQube.
Is it possible for a user to analyse the project in chunks? For example, 500 programs are analysed easily without any changes to the CE opts params.
So is there a way the 4500 programs can be divided and analysed easily?

Yash Brahmani

(G Ann Campbell) #10

Hi Yash,

If you find a report that size unacceptable, then yes you need to divide and conquer. My experience setting up Cobol analysis at my previous company tells me that each program/file is pretty much independent, and so groupings are at least somewhat arbitrary. (Yes, I understand that there are probably sets of programs that interact.)

So by all means, divide your 4500 programs into subsets and analyze them independently. You’ll just want to make sure you give each subset a unique sonar.projectKey value.
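The batching above can be sketched as a small script that splits a program list into fixed-size subsets and emits one scanner command per subset with its own key. Everything here is illustrative: the file names, the batch size, and the “cobol-batch” key prefix are made up, and the commands are echoed rather than executed:

```shell
# Split a list of Cobol programs into batches and print one
# sonar-scanner invocation per batch, each with a unique projectKey.
programs=$(printf 'prog%03d.cbl\n' $(seq 1 12))   # stand-in for the 4500 programs
batch_size=5
n=0
echo "$programs" | xargs -n "$batch_size" | while read -r files; do
  n=$((n + 1))
  echo sonar-scanner \
    -Dsonar.projectKey="cobol-batch-$n" \
    -Dsonar.sources="$(echo "$files" | tr ' ' ',')"
done
```

Each batch then appears as its own project in SonarQube, which keeps every individual analysis report well under the size that caused trouble here.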


(Yash Brahmani) #11

Thank you for your valuable response Ann…
I will get back to you in case I have any questions.
This ticket stands resolved.