Level=error msg="error waiting for container: unexpected EOF"

Hi all,

I’m trying to perform a SonarCloud validation of my code using a Bitbucket pipeline, but I’m getting the error message below:

 level=error msg="error waiting for container: unexpected EOF"

I tried to increase memory by setting size: 2x and by using the parameter SONAR_SCANNER_OPTS="-Xmx2048m", but I still receive the same error message.
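In case it matters, this is where I put the scanner option. Note this is just how I wired it up; whether the sonarsource/sonarcloud-scan pipe actually forwards SONAR_SCANNER_OPTS through to the scanner JVM is an assumption on my part:

```yaml
- pipe: sonarsource/sonarcloud-scan:1.0.1
  variables:
    # Assumption: the pipe passes this variable through as an
    # environment variable, so the scanner JVM gets a 2 GB heap.
    SONAR_SCANNER_OPTS: '-Xmx2048m'
```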

Please, could you help me?

I’m using the following script in bitbucket-pipelines.yml:

image: python:3.7.2  # Choose an image matching your project needs

clone:
  depth: full  # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache  # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        caches:
          # See https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
          - sonar
        size: 2x
        script:
          # Build your project and run
          - pipe: sonarsource/sonarcloud-scan:1.0.1
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        size: 2x
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.3

pipelines:                 # More info here: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html
  branches:
    master:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud

  • ALM: Bitbucket Cloud
  • CI system: Bitbucket Cloud
  • Languages: Python
  • Potential workaround: Increase memory (it didn’t work)

Thank You!


Hi,

Depending on the size of your project and the rules activated, an analysis may require more than 2 GB of memory to run.
Can you please share the entire logs with debug enabled (see the documentation for how to enable debug)?
The logs will give us a better idea of where the performance issue comes from.
If your project is private, I can open a private thread so the logs aren’t shared publicly.

Benoit


Hi @benoit,

Thanks for answering.
I am sending the attached log.

I’m just not sure whether I enabled logging the right way.
I added the debug parameter, but I didn’t see much difference in the log.

Thanks again!

pipelineLog-13.txt (10.7 KB)

Hi,

Indeed, debug was not enabled. You have to set the variable directly in your bitbucket-pipelines.yml:

...
- pipe: sonarsource/sonarcloud-scan:1.0.1
  variables:
    DEBUG: 'true'
...

Thank you @benoit

Now I was able to view the complete log.
I still haven’t been able to identify the cause of the problem.
I am sharing the link to the full log file. If you can take a look, I’d appreciate it.

pipelineLog-15.txt.zip (211.2 KB)

Thank you!

Hi,

I’m having a similar problem, and it’s just a simple Node project with TypeScript. I’ve already increased the memory to more than 2 GB and it didn’t work. Have you found a solution?

Regards,
John

Hi @Eric_Sousa,

Thanks for the logs.

A colleague pointed me to this thread: Sonarcloud Bitbucket pipline failing due to exceeded memory limit. It seems the Docker service memory also needs to be configured.
Can you please try that solution and let me know if it solves your issue?
If so, I’ll make sure our documentation is updated.

Thanks


Hi @benoit

Thank you very much! It worked.

I added the definition below, which increases the Docker service memory to 2048 MB.

definitions:
  services:
    docker:
      memory: 2048
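For completeness, here is a sketch of how that definition fits into the rest of the file, based on the configuration earlier in this thread. The docker service doesn’t need to be listed on the step itself when a pipe is used, since pipes run through it; the memory values shown are the ones that worked for me:

```yaml
definitions:
  services:
    docker:
      memory: 2048  # raise the Docker service memory above the 1024 MB default
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarCloud
        size: 2x    # double the step's total memory so the larger docker service fits
        script:
          - pipe: sonarsource/sonarcloud-scan:1.0.1
```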