SonarQube Scan Fails in Azure DevOps but Works Manually in Pod

Problem:
Azure DevOps pipeline fails with memory/network errors during JavaScript analysis. Manual scan in the same pod works.

The scan hangs at the last line of the log shown below.

Pipeline Configuration:

```yaml
- task: SonarQubePrepare@7
  inputs:
    SonarQube: $(sonarConnectionService)
    scannerMode: 'cli'
    projectKey: $(sonarProjectKey)
    extraProperties: |
      sonar.host.url=$(sonarUrl)
      sonar.sources=.
      sonar.projectBaseDir=$(System.DefaultWorkingDirectory)
      sonar.exclusions=**/google-map.js
  env:
    SONAR_SCANNER_OPTS: "-Xmx2012m"
    NODE_OPTIONS: "--max-old-space-size=3012"

- task: SonarQubeAnalyze@7
  inputs:
    jdkversion: 'JAVA_HOME'
  env:
    NODE_OPTIONS: "--max-old-space-size=3012"
```
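One thing that stands out in the log below: the final line reports `Memory configuration: OS (32089 MB), Node.js (524 MB)`, so `NODE_OPTIONS` does not appear to reach the embedded Node.js runtime. A sketch of passing the heap limit through the analyzer property instead (an assumption on my part: `sonar.javascript.node.maxspace` is the SonarJS property name and should be checked against this analyzer version):

```yaml
- task: SonarQubePrepare@7
  inputs:
    SonarQube: $(sonarConnectionService)
    scannerMode: 'cli'
    projectKey: $(sonarProjectKey)
    extraProperties: |
      sonar.host.url=$(sonarUrl)
      sonar.sources=.
      sonar.projectBaseDir=$(System.DefaultWorkingDirectory)
      sonar.exclusions=**/google-map.js
      # assumption: SonarJS reads this property (in MB) for the embedded Node heap
      sonar.javascript.node.maxspace=3072
```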

Error Logs:

```
15:34:27.894 INFO  Load project repositories
15:34:28.121 INFO  Load project repositories (done) | time=227ms
15:34:28.203 INFO  Indexing files...
15:34:28.203 INFO  Project configuration:
15:34:28.993 INFO  Some of the project files were automatically excluded because they looked like generated code. Enable debug logging to see which files were excluded. You can disable bundle detection by setting sonar.javascript.detectBundles=false
15:34:29.090 INFO  192 files indexed
15:34:29.092 INFO  Quality profile for css: Sonar way
15:34:29.093 INFO  Quality profile for docker: Sonar way
15:34:29.093 INFO  Quality profile for js: Sonar way
15:34:29.093 INFO  Quality profile for php: Sonar way
15:34:29.094 INFO  Quality profile for web: Sonar way
15:34:29.094 INFO  Quality profile for yaml: Sonar way
15:34:29.095 INFO  ------------- Run sensors on module local-project
15:34:29.388 INFO  Load metrics repository
15:34:29.410 INFO  Load metrics repository (done) | time=21ms
15:34:31.321 INFO  Sensor HTML [web]
15:34:32.417 INFO  Sensor HTML [web] (done) | time=1097ms
15:34:32.417 INFO  Sensor JaCoCo XML Report Importer [jacoco]
15:34:32.419 INFO  'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
15:34:32.420 INFO  No report imported, no coverage information will be imported by JaCoCo XML Report Importer
15:34:32.420 INFO  Sensor JaCoCo XML Report Importer [jacoco] (done) | time=4ms
15:34:32.421 INFO  Sensor PHP sensor [php]
15:34:32.521 INFO  Starting PHP symbol indexer
15:34:32.592 INFO  11 source files to be analyzed
15:34:33.086 INFO  11/11 source files have been analyzed
15:34:33.087 INFO  Cached information of global symbols will be used for 0 out of 11 files. Global symbols were recomputed for the remaining files.
15:34:33.192 INFO  Starting PHP rules
15:34:33.193 INFO  11 source files to be analyzed
15:34:34.593 INFO  11/11 source files have been analyzed
15:34:35.589 INFO  Deploy location /home/agent/.sonar/js/node-runtime, tagetRuntime: /home/agent/.sonar/js/node-runtime/node,  version: /home/agent/.sonar/js/node-runtime/version.txt
15:34:40.495 INFO  Using embedded Node.js runtime.
15:34:40.495 INFO  Using Node.js executable: '/home/agent/.sonar/js/node-runtime/node'.
15:34:44.319 INFO  Memory configuration: OS (32089 MB), Node.js (524 MB).
```
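The gap between the OS total (32089 MB) and the tiny Node.js heap (524 MB) seems worth probing from inside the agent pod, since the kernel-reported total can be much larger than the pod's actual cgroup limit. A quick diagnostic sketch (the cgroup paths are common defaults and may differ in your cluster):

```shell
# What the kernel reports; the scanner's "OS (...)" figure comes from here
awk '/MemTotal/ {printf "Kernel MemTotal: %d MB\n", $2/1024}' /proc/meminfo

# What the pod is actually limited to by its cgroup (v2 first, then v1);
# a limit far below the kernel total would explain OOM-style failures
for f in /sys/fs/cgroup/memory.max /sys/fs/cgroup/memory/memory.limit_in_bytes; do
  [ -r "$f" ] && echo "cgroup limit ($f): $(cat "$f")"
done

# Confirm the env vars actually reach this shell
env | grep -E '^(NODE_OPTIONS|SONAR_SCANNER_OPTS)=' \
  || echo "no NODE_OPTIONS/SONAR_SCANNER_OPTS set"
```

Running this both in the pipeline (as a `script` step) and manually in the pod should show whether the two environments really match.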

I run into the same issue when I replace the Azure task above with the script task below (the same Linux command works properly when I run it inside my pod):

```yaml
- script: |
    sonar-scanner -Dsonar.projectKey=$(sonarProjectKey) -Dsonar.host.url=$(sonarUrl) -Dsonar.token=$(sonarToken) -Dsonar.sources=.
  timeoutInMinutes: 5
```

Environment Details:

**SonarQube**: Community Edition 25.1.0.102122

**SonarScanner**: CLI 7.0.2.4839 

**Azure Agent**: Self-hosted Kubernetes pod

Project: PHP/JavaScript project with structure:

```
event-app/
├── js/
│   └── google-map.js (large/minified file)
├── php/
└── scss/
```

I’ve spent a lot of time debugging this problem and haven’t found the root cause. Any help is appreciated.
Thank you in advance.

Hi,

Could you add `sonar.verbose=true` to your `extraProperties` and give us the full log, redacted as necessary, starting from the analysis command itself?
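Dropped into the Prepare task from your pipeline, that would look something like this (everything else unchanged):

```yaml
- task: SonarQubePrepare@7
  inputs:
    SonarQube: $(sonarConnectionService)
    scannerMode: 'cli'
    projectKey: $(sonarProjectKey)
    extraProperties: |
      sonar.host.url=$(sonarUrl)
      sonar.sources=.
      sonar.projectBaseDir=$(System.DefaultWorkingDirectory)
      sonar.exclusions=**/google-map.js
      sonar.verbose=true
```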

 
Thx,
Ann