Hello,
I use Bitbucket Pipelines with a self-hosted runner (32 CPU / 128 GB RAM).
The Sonar scan requires a lot of memory. If I use the standard runner size (2x) and increase the docker service memory to 3 GB, I get the error "Container 'docker' exceeded memory limit."
So I decided to use a self-hosted pipeline runner with the specification above. The issue is that it seems we cannot increase the memory of the docker service when using the runner, because it gives me the error "A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 4048 MB".
Here is my bitbucket-pipelines.yml:
```yaml
image: node:12.14.1 # Choose an image matching your project needs

clone:
  depth: full # SonarCloud scanner needs the full history to assign issues properly

definitions:
  caches:
    sonar: ~/.sonar/cache # Caching SonarCloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        runs-on: self.hosted
        # size: 4x
        name: Build, test and analyze on SonarCloud
        caches:
          - node # See https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
          - sonar
        script:
          - npm install --quiet # Build your project and run
          # - node --max-old-space-size=16384
          # - CI=false npm run build
          # - pwd
          # - ls -l
          # - npm run test
          - pipe: sonarsource/sonarcloud-scan:1.2.1
        services:
          - docker
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
  services:
    docker:
      memory: 4048

pipelines: # More info here: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html
  branches:
    integration:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
```
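For context, my understanding (an assumption on my part, based on the error text) is that a step without a `size` override gets 4096 MB in total, service containers count against that total, and the step itself needs at least 1024 MB. That arithmetic would explain why the validation rejects `memory: 4048`:

```python
# Hypothetical arithmetic behind the validation error. The 4096 MB default
# step allocation is an assumption; 1024 MB and 4048 MB come from the error.
STEP_TOTAL_MB = 4096      # assumed default allocation for a step
STEP_MINIMUM_MB = 1024    # minimum the step itself needs (from the error)
docker_service_mb = 4048  # memory requested for the docker service

left_for_step = STEP_TOTAL_MB - docker_service_mb
print(left_for_step)                      # 48
print(left_for_step >= STEP_MINIMUM_MB)   # False -> step is rejected
```

If that is right, the runner's 128 GB of RAM never comes into play because the limit is enforced at YAML validation time, not by the machine.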
Any idea how to resolve this limitation?