Service memory size when using a pipeline runner in Bitbucket

Hello,
I use Bitbucket Pipelines with a self-hosted runner that has 32 CPUs / 128 GB RAM.
The Sonar scan requires a lot of memory: when I use the standard runner at size 2x and increase the Docker service memory to 3 GB, I get the error “Container ‘docker’ exceeded memory limit.”.
So I decided to use a pipeline runner with the specification above. The issue is that we apparently cannot increase the size of the Docker service when using the runner, because it gives me the error “A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 4048 MB”.

image: node:12.14.1 # Choose an image matching your project needs

clone:
  depth: full              # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache  # Caching SonarCloud artifacts will speed up your build
  steps:
  - step: &build-test-sonarcloud
      runs-on: self.hosted
#      size: 4x
      name: Build, test and analyze on SonarCloud
      caches:
        - node # See https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
        - sonar
      script:
        - npm install --quiet # Build your project and run
#        - node --max-old-space-size=16384
#        - CI=false npm run build
#        - pwd
#        - ls -l
#        - npm run test
        - pipe: sonarsource/sonarcloud-scan:1.2.1
      services:
        - docker
  - step: &check-quality-gate-sonarcloud
      name: Check the Quality Gate on SonarCloud
      script:
        - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
  services:
    docker:
      memory: 4048

pipelines:                 # More info here: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html
  branches:
    integration:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud

Any idea how to resolve this limitation?

Hi @wassim

To increase the memory available to the analysis step, you can follow the procedure described in that similar thread:

  • Set the size of the step executing the sonarcloud-scan pipe to 2x (the only supported values at the moment are 1x, the default, and 2x, according to the documentation). This increases the step memory to 8192 MB.

  • Keep the services.docker.memory setting at 4048 as you did, or increase it a bit if it still fails. According to that documentation, you can’t allocate more than 7128 MB to Docker on a 2x step.
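For reference, here is a minimal sketch of how those two changes would fit into the bitbucket-pipelines.yml from the question (only the relevant keys are shown; the memory values are the ones discussed above and can be tuned to your build):

```yaml
definitions:
  services:
    docker:
      memory: 4048           # service memory; up to 7128 MB is allowed on a 2x step
  steps:
    - step: &build-test-sonarcloud
        size: 2x             # doubles the step memory (8192 MB instead of 4096 MB)
        name: Build, test and analyze on SonarCloud
        script:
          - npm install --quiet
          - pipe: sonarsource/sonarcloud-scan:1.2.1
        services:
          - docker
```

Note that this sketch targets a standard Bitbucket Cloud step; check the runners documentation for how step sizing interacts with self-hosted runners.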

HTH,
Claire