"Python Analyzer 'Rules Execution' Causes >8GB Memory Crash on Small Project"

My SonarCloud analysis in an 8GB Bitbucket Pipeline is consistently crashing with a memory error. We have used debugging tools to isolate the problem to the ‘rules execution’ phase of the Python Sensor.

The crash occurs even when scanning a small number of Python files (~30).

We have already confirmed the crash still happens with all standard optimizations:

  • Bitbucket Pipeline step with size: 2x (8GB).

  • Scanner Java heap space set to 6GB (-Xmx6144m).

  • Symbolic Execution is disabled (-Dsonar.python.symbolicExecution.enabled=false).

  • All dependencies in .venv are excluded.

  • Test coverage and test execution report imports are disabled.

This demonstrates a severe memory consumption issue within the Python analyzer’s core rules engine. The only workaround is to exclude all Python files (`**/*.py`)."

I have never used SonarCloud, so any help is appreciated. My pipeline is:

```yaml
image: python:3.13-slim-bookworm

clone:
  depth: full  # SonarQube Cloud scanner needs the full history to assign issues properly

definitions:
  caches:
    poetry: ~/.cache/pypoetry  # Cache poetry dependencies
    sonar: ~/.sonar/cache  # Caching SonarQube Cloud artifacts will speed up your build
  steps:
    - step: &build-test-sonarcloud
        name: Build, test and analyze on SonarQube Cloud
        size: 2x  # Double the memory for this step (# 24 - 4618992: SonarScanner was killed for using too much memory)
        artifacts:  # Pass the scanner report to the next step
          - backend/.scannerwork/**
        services:  # Add the docker service to run docker commands
          - docker
        caches:
          - pip
          - poetry
          - sonar
        script:
          - pip install poetry pytest pytest-cov  # Install poetry and test tools
          - cd backend  # Navigate into the backend directory
          - poetry config virtualenvs.in-project true  # Tell poetry to create .venv in the project folder
          - poetry install --no-interaction --no-ansi  # Install project dependencies
          - poetry run pytest --cov=./ --cov-report=xml  # Run tests and generate coverage report
          - pipe: sonarsource/sonarcloud-scan:4.0.0
            variables:
              SONAR_SCANNER_OPTS: "-Xmx6144m"
              EXTRA_ARGS: "-Dsonar.sources=. -Dsonar.exclusions=**/.venv/**,**/*.py -Dsonar.python.symbolicExecution.enabled=false"
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarQube Cloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.6

pipelines:
  branches:
    main:
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
  pull-requests:
    '**':
      - step: *build-test-sonarcloud
      - step: *check-quality-gate-sonarcloud
```
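A side note on the numbers above: a `size: 2x` step in Bitbucket Pipelines gets 8192 MB in total, and that budget is shared with any services declared on the step (the `docker` service reserves 1024 MB by default). A 6144 MB JVM heap plus the JVM's own off-heap overhead can therefore exceed what the build container is actually allowed. To see the real limit from inside the step, a diagnostic line like this can be added to the script (the cgroup paths vary between runners, so treat them as assumptions):

```yaml
script:
  # Print the memory limit the build container actually sees (cgroup v2, with a v1 fallback)
  - cat /sys/fs/cgroup/memory.max 2>/dev/null || cat /sys/fs/cgroup/memory/memory.limit_in_bytes || true
```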

Update: I have since fixed it and it's no longer an issue. The memory allocation was incorrect.
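For anyone landing here with the same "SonarScanner was killed for using too much memory" failure: the usual resolution is to make the split explicit, giving each service a declared share of the step's 8192 MB and sizing the JVM heap well below what remains for the build container. A hedged sketch only; the numbers are illustrative, not the exact values from my final config:

```yaml
definitions:
  services:
    docker:
      memory: 1024  # explicit share of the 8192 MB 2x step for the docker service
# ...and in the scan step, keep the heap comfortably under the remaining memory:
#   SONAR_SCANNER_OPTS: "-Xmx4096m"
```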