BitBucket (SonarQube Scanner binaries probably don't exist for your OS (linux))


(James Shinevar) #1

I have an Angular 6 web app that I am trying to analyze in a BitBucket pipeline. I have added the task, and my pipeline looks like this:

# This is a sample build configuration for JavaScript.
# Check our guides at for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: node:10.15.0

pipelines:
  default:
    - step:
        caches:
          - node
        script: # Modify the commands below to build your repository.
          - npm install
          - npm install -g @angular/cli
          - npm install -g sonarqube-scanner
          - npm run-script build
          - sonar-scanner -Dsonar.projectKey=moh_web_d365dataportalangularapp -Dsonar.organization=mohegan -Dsonar.sources=./dist -Dsonar.login=theKeyIsHere

When the scanner is running, I get the following:
    + sonar-scanner -Dsonar.projectKey=moh_web_d365dataportalangularapp -Dsonar.organization=mohegan -Dsonar.sources=./dist -Dsonar.login=theKeyIsHere
    [21:38:02] Starting SonarQube analysis...
    [21:38:02] Getting info from "package.json" file
    [21:38:02] Checking if executable exists: /root/.sonar/native-sonar-scanner/sonar-scanner-
    [21:38:02] Could not find executable in "/root/.sonar/native-sonar-scanner".
    [21:38:02] Proceed with download of the platform binaries for SonarQube Scanner...
    [21:38:02] Creating /root/.sonar/native-sonar-scanner
    [21:38:02] Downloading from
    [21:38:02] (executable will be saved in cache folder: /root/.sonar/native-sonar-scanner)
    INFO: Scanner configuration file: /root/.sonar/native-sonar-scanner/sonar-scanner-
    INFO: Project root configuration file: NONE
    INFO: SonarQube Scanner
    INFO: Java 1.8.0_121 Oracle Corporation (64-bit)
    INFO: Linux 4.14.84-coreos amd64
    INFO: User cache: /root/.sonar/cache
    INFO: SonarQube server 7.6.0
    INFO: Default locale: "en_US", source code encoding: "US-ASCII" (analysis is platform dependent)
    INFO: Publish mode
    INFO: Load global settings
    INFO: Load global settings (done) | time=553ms
    INFO: Server id: BD367519-AWHW8ct9-T_TB3XqouNu
    INFO: User cache: /root/.sonar/cache
    INFO: Load/download plugins
    INFO: Load plugins index
    INFO: Load plugins index (done) | time=119ms
    INFO: Load/download plugins (done) | time=37831ms
    INFO: Loaded core extensions: branch-scanner
    INFO: Process project properties
    INFO: Execute project builders
    INFO: Execute project builders (done) | time=3ms
    INFO: Load project branches
    INFO: Load project branches (done) | time=100ms
    INFO: Load project pull requests
    INFO: Load project pull requests (done) | time=96ms
    INFO: Load branch configuration
    INFO: Load branch configuration (done) | time=635ms
    INFO: Load project repositories
    INFO: Load project repositories (done) | time=147ms
    INFO: Load quality profiles
    INFO: Load quality profiles (done) | time=127ms
    INFO: Load active rules
    INFO: Load active rules (done) | time=2686ms
    INFO: Load metrics repository
    INFO: Load metrics repository (done) | time=102ms
    INFO: Project key: moh_web_d365dataportalangularapp
    INFO: Project base dir: /opt/atlassian/pipelines/agent/build
    INFO: Organization key: mohegan
    INFO: -------------  Scan d365-data-portal
    INFO: Base dir: /opt/atlassian/pipelines/agent/build
    INFO: Working dir: /opt/atlassian/pipelines/agent/build/.scannerwork
    INFO: Source paths: dist
    INFO: Source encoding: US-ASCII, default locale: en_US
    WARN: Property 'sonar.abap.file.suffixes' is not declared as multi-values/property set but was read using 'getStringArray' method. The SonarQube plugin declaring this property should be updated.
    INFO: Index files
    INFO: Excluded sources:
    INFO:   node_modules/**
    INFO:   bower_components/**
    INFO:   jspm_packages/**
    INFO:   typings/**
    INFO:   lib-cov/**
    WARN: Invalid character encountered in file /opt/atlassian/pipelines/agent/build/dist/D365DataPortal/polyfills.js at line 1810 for encoding US-ASCII. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
    WARN: Invalid character encountered in file /opt/atlassian/pipelines/agent/build/dist/D365DataPortal/vendor.js at line 37431 for encoding UTF-8. Please fix file content or configure the encoding to be used using property 'sonar.sourceEncoding'.
    INFO: 11 files indexed
    INFO: 1 file ignored because of inclusion/exclusion patterns
    INFO: Quality profile for js: Sonar way
    INFO: Quality profile for web: Sonar way
    INFO: Sensor SonarJavaXmlFileSensor [java]
    INFO: Sensor SonarJavaXmlFileSensor [java] (done) | time=0ms
    INFO: Sensor HTML [web]
    INFO: Sensor HTML [web] (done) | time=53ms
    INFO: Sensor JaCoCo XML Report Importer [jacoco]
    INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=2ms
    INFO: Sensor SonarJS [javascript]
    INFO: 4 source files to be analyzed
    INFO: 3/4 files analyzed, current file: dist/D365DataPortal/vendor.js
    INFO: 3/4 files analyzed, current file: dist/D365DataPortal/vendor.js
    INFO: 3/4 files analyzed, current file: dist/D365DataPortal/vendor.js
    [21:39:39] ERROR: impossible to download and extract binary: Command failed: /root/.sonar/native-sonar-scanner/sonar-scanner- -Dsonar.projectKey=moh_web_d365dataportalangularapp -Dsonar.organization=mohegan -Dsonar.sources=./dist -Dsonar.login=theKeyIsHere
    [21:39:39]        SonarQube Scanner binaries probably don't exist for your OS (linux).
    [21:39:39]        In such situation, the best solution is to install the standard SonarQube Scanner (requires a JVM).
    [21:39:39]        Check it out at

I am not sure what to do. Can someone help?

(Wouter Admiraal) #2

Hi James,

That’s very strange. Even more so as that specific build of the scanner is heavily used. I just tried analysing a JS project using a very similar pipeline, and it passed:

image: node:10.15.0

pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install -g sonarqube-scanner
          - |
            sonar-scanner \
              -Dsonar.login=$SONAR_TOKEN \
              -Dsonar.projectKey=wouter-admiraal-sonarsource_sonarcloud-test \
              -Dsonar.organization=wouter-admiraal-sonarsource-bitbucket \
              -Dsonar.sources=.

It used the same scanner build, and the analysis was pushed correctly. On the off-chance this was a network failure (corrupt ZIP?), did you try running it again?

FYI, we also have an example JS repo here, but from your config, I don’t see any issues.

(As an aside: did you know you don’t have to pass your login token on the CLI? Check here for the docs)
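As a minimal sketch of that approach — assuming the token has been stored as a secured repository variable named `SONAR_TOKEN` under the repository's Pipelines settings, so it never appears in the YAML or the repo — the script step can reference the variable instead of a hard-coded token:

```yaml
# bitbucket-pipelines.yml fragment (sketch; SONAR_TOKEN is assumed to be
# a secured repository variable, masked in the build logs by Bitbucket)
pipelines:
  default:
    - step:
        script:
          - npm install -g sonarqube-scanner
          - sonar-scanner -Dsonar.projectKey=my_project -Dsonar.organization=my_org -Dsonar.login=$SONAR_TOKEN
```

(`my_project` and `my_org` are placeholders.)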

(James Shinevar) #3

@Wouter_Admiraal Thank you for your response. I was aware that I don’t need to have the key in the config; I was just trying to make sure there were no issues before actually implementing this. I did try running it again, but that was on Friday. I’ll give it another shot.

(James Shinevar) #4

It is still failing. I created a different organization and made it work using Azure DevOps; however, I would rather use BitBucket, since that is where all of our code is stored.

(Wouter Admiraal) #5

Hm, odd.

Are you using BitBucket Cloud or BitBucket Server? If Server, could it be a firewall or reverse proxy messing with the analyser archive that is being downloaded?

(James Shinevar) #6

BitBucket Cloud.

(James Shinevar) #7

We really wanted to have this integrated with BitBucket because we have our code there, but we have decided to integrate it with Azure DevOps. It would be nice if SonarCloud understood that these accounts are not mutually exclusive and allowed us to connect easily to both BitBucket and Azure DevOps.

(Fabrice Bellingard) #8

Hey James,

Looking at the logs, I have the feeling that the analysis is failing due to an out-of-memory error or something similarly unexpected. Why am I saying this? Because the last three INFO logs are all about analyzing the same JS vendor file:

  • Vendor files can be minified but still huge
  • and therefore our analyzer can run into trouble parsing them with the default memory settings

I suggest that you try to exclude those files from the analysis and re-run it to see if this fixes the problem. To exclude those files, you can set the following property for instance:
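A sketch of such an exclusion, using the standard `sonar.exclusions` property on the existing scanner invocation (the glob patterns are illustrative, matching the two bundles flagged in the WARN lines of the log above — adjust them to your dist layout):

```shell
# Sketch: exclude the generated bundles from analysis via sonar.exclusions
sonar-scanner \
  -Dsonar.projectKey=moh_web_d365dataportalangularapp \
  -Dsonar.organization=mohegan \
  -Dsonar.sources=./dist \
  -Dsonar.login=theKeyIsHere \
  -Dsonar.exclusions=**/vendor.js,**/polyfills.js
```

If the large files must stay in the analysis, the scanner's heap can instead be raised via the `SONAR_SCANNER_OPTS` environment variable (for example `SONAR_SCANNER_OPTS="-Xmx2048m"`).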


(James Shinevar) #9

The issue is kind of moot now, since we have decided to use DevOps instead of BitBucket. Why would it work differently between the two products? Does DevOps offer more resources to the build machines?

(Fabrice Bellingard) #10

This might be a possibility; I don’t know.