How can I analyze both python and C code in the same repo?

I’ve got a project with C code, and we’re using CFFI to create Python interfaces to it. We have Ceedling tests for the C code and unittest tests for the Python.

In the bitbucket-pipelines.yml, the sonarcloud step to analyze the C is

- step: &sonarcloud-for-c
        name: Sonar Cloud Analysis
        script:
          - test -f environment.sh && source environment.sh
          - export SONAR_SCANNER_VERSION=4.4.0.2170
          - export SONAR_SCANNER_HOME=$HOME/.sonar/sonar-scanner-$SONAR_SCANNER_VERSION-linux
          - export BW_OUTPUT=$HOME/.sonar/bw-output
          - mkdir -p $BW_OUTPUT
          - curl --create-dirs -sSLo $HOME/.sonar/sonar-scanner.zip https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-$SONAR_SCANNER_VERSION-linux.zip
          - unzip -o $HOME/.sonar/sonar-scanner.zip -d $HOME/.sonar/
          - export PATH=$SONAR_SCANNER_HOME/bin:$PATH
          - export SONAR_SCANNER_OPTS="-server"
          - curl --create-dirs -sSLo $HOME/.sonar/build-wrapper-linux-x86.zip https://sonarcloud.io/static/cpp/build-wrapper-linux-x86.zip
          - unzip -o $HOME/.sonar/build-wrapper-linux-x86.zip -d $HOME/.sonar/
          - export PATH=$HOME/.sonar/build-wrapper-linux-x86:$PATH
          - build-wrapper-linux-x86-64 --out-dir $BW_OUTPUT ./cmake/scripts/cmake_cmd_line_build.sh -w bs1 -d .
          - ceedling gcov:all utils:gcov
          - sonar-scanner -Dsonar.cfamily.build-wrapper-output=$BW_OUTPUT

How can I add python analysis to this? For other repos with python, I see a motif more like

- step: &sonarcloud-for-python
        name: Analyze on SonarCloud
        caches:
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.2
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4

But if I try to have steps for both of these in the same bitbucket-pipelines.yml, the C SonarCloud step passes while the Python one fails its quality gate with 0% coverage, even though the tests do run and report 100% coverage.

Do I need to set up a monorepo with two projects, or can I do the analysis in a single SonarCloud project? I’ve been reading documentation and forum posts for a long time, but I still don’t understand. I just want coverage for both languages to be reflected in SonarCloud.

My sonar-project.properties looks like

sonar.exclusions=lib/**/*
sonar.coverageReportPaths=build/artifacts/gcov/GcovCoverageSonarQube.xml
sonar.python.coverage.reportPaths=coverage.xml

Hey there.

This should be easy as long as the Python files are in a subdirectory from where you start the scan. Where do your Python files sit relative to the directory you’re executing the sonar-scanner from?

The C files live in a repo subdirectory called source, and the python files all live in a repo subdirectory called python_cffi. I believe the sonar steps both run from the repo root.

What should I try? I’ve been experimenting for a while, and it certainly hasn’t been obvious to me. Do I need to set up a monorepo with two projects? Can I do the analysis in a single sonarcloud project?
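
To make the question concrete, here is a rough sketch of what I’d hope a single-project scan from the repo root could look like (the explicit sonar.sources setting is my assumption and isn’t in my current properties file; the other flags mirror what I already have, so treat this as a sketch, not a working config):

# Sketch only: one scanner run from the repo root, indexing both source trees.
- step: &sonarcloud-single-project
    name: Sonar Cloud Analysis (C + Python, sketch)
    script:
      # sonar.sources is an assumption on my part; it isn't set in my sonar-project.properties today.
      # $BW_OUTPUT would come from the same build-wrapper setup as in the C step above.
      - sonar-scanner -Dsonar.sources=source,python_cffi -Dsonar.cfamily.build-wrapper-output=$BW_OUTPUT -Dsonar.python.coverage.reportPaths=coverage.xml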

I’ve configured this as a monorepo and attached two projects to the repo, one for C and one for Python. Helpfully, when I do this setup, the website gives me code snippets to copy and paste into bitbucket-pipelines.yml. I did this. The C analysis still passes, and the Python one still doesn’t. It’s very strange. The full printout in the pipeline terminal is

Unable to find image 'sonarsource/sonarcloud-scan:1.4.0' locally
1.4.0: Pulling from sonarsource/sonarcloud-scan
Digest: sha256:8b3690666e34b17bbab84370e569151742f06f21575fbe05e5c066c160b7c968
Status: Downloaded newer image for sonarsource/sonarcloud-scan:1.4.0
INFO: Scanner configuration file: /opt/sonar-scanner/conf/sonar-scanner.properties
INFO: Project root configuration file: /opt/atlassian/pipelines/agent/build/sonar-project.properties
INFO: SonarScanner 4.6.2.2472
INFO: Java 11.0.3 Oracle Corporation (64-bit)
INFO: Linux 5.13.0-1031-aws amd64
INFO: Bitbucket Cloud Pipelines detected, no host variable set. Defaulting to sonarcloud.io.
INFO: User cache: /root/.sonar/cache
INFO: Scanner configuration file: /opt/sonar-scanner/conf/sonar-scanner.properties
INFO: Project root configuration file: /opt/atlassian/pipelines/agent/build/sonar-project.properties
INFO: Analyzing on SonarQube server 8.0.0.30222
INFO: Default locale: "en", source code encoding: "UTF-8" (analysis is platform dependent)
INFO: Load global settings
INFO: Load global settings (done) | time=598ms
INFO: Server id: 1BD809FA-AWHW8ct9-T_TB3XqouNu
INFO: User cache: /root/.sonar/cache
INFO: Load/download plugins
INFO: Load plugins index
INFO: Load plugins index (done) | time=139ms
INFO: Load/download plugins (done) | time=38649ms
INFO: Loaded core extensions: developer-scanner
INFO: Found an active CI vendor: 'Bitbucket Pipelines'
INFO: No project and organization key detected from Bitbucket Cloud Pipelines because it is a monorepo.
INFO: Load project settings for component key: 'bio-firmware-algorithms-python'
INFO: Load project settings for component key: 'bio-firmware-algorithms-python' (done) | time=110ms
INFO: Process project properties
INFO: Execute project builders
INFO: Execute project builders (done) | time=1ms
INFO: Project key: bio-firmware-algorithms-python
INFO: Base dir: /opt/atlassian/pipelines/agent/build
INFO: Working dir: /opt/atlassian/pipelines/agent/build/.scannerwork
INFO: Load project branches
INFO: Load project branches (done) | time=119ms
INFO: Check ALM binding of project 'bio-firmware-algorithms-python'
INFO: Detected project binding: BOUND
INFO: Check ALM binding of project 'bio-firmware-algorithms-python' (done) | time=104ms
INFO: Load project pull requests
INFO: Load project pull requests (done) | time=107ms
INFO: Load branch configuration
INFO: Detected analysis for pull request '23' targeting 'master'
INFO: Auto-configuring pull request 23
INFO: Load branch configuration (done) | time=285ms
INFO: Load quality profiles
INFO: Load quality profiles (done) | time=172ms
INFO: Load active rules
Build teardown
<1s
Skipping cache upload for failed step
Searching for files matching artifact pattern .bitbucket/pipelines/generated/pipeline/pipes/**
Artifact pattern .bitbucket/pipelines/generated/pipeline/pipes/** matched 1 files with a total size of 7.4 KiB
Compressed files matching artifact pattern .bitbucket/pipelines/generated/pipeline/pipes/** to 2.2 KiB in 0 seconds
Uploading artifact of 2.2 KiB
Successfully uploaded artifact in 0 seconds
Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4
Finished scanning for test reports. Found 0 test report files.
Merged test suites, total number tests is 0, with 0 failures and 0 errors.

It seems from this printout that I should be able to analyze both C and Python using the sonarcloud-scan step that’s failing:

INFO: Indexing files...
INFO: Project configuration:
INFO:   Excluded sources: **/build-wrapper-dump.json, lib/**/*
INFO: 201 files indexed
INFO: 317 files ignored because of inclusion/exclusion patterns
INFO: 0 files ignored because of scm ignore settings
INFO: Quality profile for c: Sonar way
INFO: Quality profile for py: Sonar way
INFO: Quality profile for yaml: Sonar way
INFO: ------------- Run sensors on module bio-firmware-algorithms-python
INFO: Load metrics repository
INFO: Load metrics repository (done) | time=142ms

Okay, so if I download the full log, I get more details:

WARN: No report was found for sonar.python.coverage.reportPaths using pattern coverage.xml
INFO: Sensor Cobertura Sensor for Python coverage [python] (done) | time=34ms
INFO: Sensor PythonXUnitSensor [python]
INFO: Sensor PythonXUnitSensor [python] (done) | time=18ms
INFO: Sensor JaCoCo XML Report Importer [jacoco]
INFO: 'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
INFO: No report imported, no coverage information will be imported by JaCoCo XML Report Importer
INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=4ms
INFO: Sensor CSS Rules [javascript]
INFO: Sensor CSS Rules is restricted to changed files only
INFO: No CSS, PHP, HTML or VueJS files are found in the project. CSS analysis is skipped.
INFO: Sensor CSS Rules [javascript] (done) | time=2ms
INFO: Sensor ThymeLeaf template sensor [securityjavafrontend]
INFO: Sensor ThymeLeaf template sensor [securityjavafrontend] (done) | time=2ms
INFO: Sensor Python HTML templates processing [securitypythonfrontend]
INFO: Sensor Python HTML templates processing [securitypythonfrontend] (done) | time=42ms
INFO: Sensor Serverless configuration file sensor [security]
INFO: 0 Serverless function entries were found in the project
INFO: 0 Serverless function handlers were kept as entrypoints
INFO: Sensor Serverless configuration file sensor [security] (done) | time=9ms
INFO: Sensor AWS SAM template file sensor [security]
INFO: Sensor AWS SAM template file sensor [security] (done) | time=2ms
INFO: Sensor Generic Coverage Report
INFO: Parsing /opt/atlassian/pipelines/agent/build/build/artifacts/gcov/GcovCoverageSonarQube.xml
INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
INFO: Total time: 1:06.088s
INFO: Final Memory: 43M/160M
INFO: ------------------------------------------------------------------------
ERROR: Error during SonarScanner execution
ERROR: Error during parsing of the generic coverage report '/opt/atlassian/pipelines/agent/build/build/artifacts/gcov/GcovCoverageSonarQube.xml'. Look at SonarQube documentation to know the expected XML format.
ERROR: Caused by: /opt/atlassian/pipelines/agent/build/build/artifacts/gcov/GcovCoverageSonarQube.xml (No such file or directory)

The gcov report is only relevant for the C code. I want this (Python) project to be analyzed based on coverage.xml, which the scanner apparently can’t find.
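
One option I’m considering for the Python-side project, as a sketch only: scope it to python_cffi and disable the C/C++/Objective-C suffixes so only Python gets analyzed there. The EXTRA_ARGS variable is my reading of the sonarcloud-scan pipe docs (unverified here), and the shared gcov coverageReportPaths from sonar-project.properties would still need to somehow not apply to this project:

# Sketch: Python-only monorepo project, analyzed with the sonarcloud-scan pipe.
- step: &sonarcloud-for-python-only
    name: Analyze Python on SonarCloud (sketch)
    script:
      - pipe: sonarsource/sonarcloud-scan:1.4.0
        variables:
          # Assumption: EXTRA_ARGS forwards additional -D properties to the scanner.
          # Open question: the shared sonar.coverageReportPaths (gcov) still needs to not apply here.
          EXTRA_ARGS: >-
            -Dsonar.sources=python_cffi
            -Dsonar.python.coverage.reportPaths=coverage.xml
            -Dsonar.c.file.suffixes=-
            -Dsonar.cpp.file.suffixes=-
            -Dsonar.objc.file.suffixes=-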

My step to run coverage looks like

- step: &cffi-unit-tests-and-coverage
        name: CFFI Unit Tests
        caches:
          - pip
        script:
          - cd python_cffi
          - apt install libffi-dev # without this: c/_cffi_backend.c:15:10: fatal error: ffi.h: No such file or directory #include <ffi.h>
          - python3 -m pip install --upgrade pip
          - python3 -m pip install -r requirements.txt
          - coverage run --omit=/usr/local/lib/* -m unittest
          - coverage report
          - coverage xml -o ../coverage.xml
        artifacts:
          - coverage.xml

That step succeeds. During build teardown, I see

Searching for files matching artifact pattern coverage.xml
Artifact pattern coverage.xml matched 1 files with a total size of 6.3 KiB
Compressed files matching artifact pattern coverage.xml to 860 B in 0 seconds
Uploading artifact of 860 B
Successfully uploaded artifact in 0 seconds

Since the sonar-project.properties settings used for the C analysis interfere with the Python analysis, do I need to configure those properties through the website separately for each project attached to my monorepo, and drop the .properties file entirely?
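
To be concrete about the alternative I’m picturing: drop the shared file and pass each project’s settings as -D flags in its own step, something like this for the C side (a sketch; "my-c-project-key" is a placeholder, and the flags just restate what’s in my properties file today):

# Sketch: per-step scanner properties as -D flags instead of a shared sonar-project.properties.
# "my-c-project-key" is a placeholder, not a real key.
- sonar-scanner -Dsonar.projectKey=my-c-project-key -Dsonar.exclusions=lib/**/* -Dsonar.coverageReportPaths=build/artifacts/gcov/GcovCoverageSonarQube.xml -Dsonar.cfamily.build-wrapper-output=$BW_OUTPUT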

It’s still unclear to me whether I actually need a monorepo with the two separate projects, though. I’d prefer to do this as one. Is that possible?

Okay, good news: my steps were inside a parallel block in the bitbucket-pipelines.yml. If I move them out of that, then the last step finds coverage.xml, but the Python analysis is still getting tripped up by the presence of GcovCoverageSonarQube.xml (I’ve put a sketch of the layout change after the log below):

INFO: Python test coverage
INFO: Parsing report '/opt/atlassian/pipelines/agent/build/coverage.xml'
INFO: Sensor Cobertura Sensor for Python coverage [python] (done) | time=97ms
INFO: Sensor PythonXUnitSensor [python]
INFO: Sensor PythonXUnitSensor [python] (done) | time=14ms
INFO: Sensor JaCoCo XML Report Importer [jacoco]
INFO: 'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
INFO: No report imported, no coverage information will be imported by JaCoCo XML Report Importer
INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=4ms
INFO: Sensor CSS Rules [javascript]
INFO: Sensor CSS Rules is restricted to changed files only
INFO: No CSS, PHP, HTML or VueJS files are found in the project. CSS analysis is skipped.
INFO: Sensor CSS Rules [javascript] (done) | time=1ms
INFO: Sensor ThymeLeaf template sensor [securityjavafrontend]
INFO: Sensor ThymeLeaf template sensor [securityjavafrontend] (done) | time=1ms
INFO: Sensor Python HTML templates processing [securitypythonfrontend]
INFO: HTML files are not indexed : you may want to add them in the scanned files of this project to detect Python XSS vulnerabilities
INFO: Sensor Python HTML templates processing [securitypythonfrontend] (done) | time=6ms
INFO: Sensor Serverless configuration file sensor [security]
INFO: 0 Serverless function entries were found in the project
INFO: 0 Serverless function handlers were kept as entrypoints
INFO: Sensor Serverless configuration file sensor [security] (done) | time=4ms
INFO: Sensor AWS SAM template file sensor [security]
INFO: Sensor AWS SAM template file sensor [security] (done) | time=2ms
INFO: Sensor Generic Coverage Report
INFO: Parsing /opt/atlassian/pipelines/agent/build/build/artifacts/gcov/GcovCoverageSonarQube.xml
INFO: Imported coverage data for 21 files
INFO: Sensor Generic Coverage Report (done) | time=49ms
INFO: Sensor CFamily [cpp]
INFO: CFamily plugin version: 6.35.0.50389
ERROR: 

The only way to get an accurate analysis of C/C++/Objective-C files is by using the SonarSource build-wrapper and setting the property "sonar.cfamily.build-wrapper-output" or by using Clang Compilation Database and setting the property "sonar.cfamily.compile-commands". None of these two options were specified.

If you don't want to analyze C/C++/Objective-C files, then prevent them from being analyzed by setting the following properties:

    sonar.c.file.suffixes=-
    sonar.cpp.file.suffixes=-
    sonar.objc.file.suffixes=-


INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
INFO: Total time: 50.635s
INFO: Final Memory: 43M/168M
INFO: ------------------------------------------------------------------------
ERROR: Error during SonarScanner execution
java.lang.UnsupportedOperationException: 

The only way to get an accurate analysis of C/C++/Objective-C files is by using the SonarSource build-wrapper and setting the property "sonar.cfamily.build-wrapper-output" or by using Clang Compilation Database and setting the property "sonar.cfamily.compile-commands". None of these two options were specified.

If you don't want to analyze C/C++/Objective-C files, then prevent them from being analyzed by setting the following properties:

    sonar.c.file.suffixes=-
    sonar.cpp.file.suffixes=-
    sonar.objc.file.suffixes=-


	at com.sonar.cpp.plugin.CFamilySensor.process(CFamilySensor.java:260)
	at com.sonar.cpp.plugin.CFamilySensor.execute(CFamilySensor.java:204)
	at org.sonar.scanner.sensor.AbstractSensorWrapper.analyse(AbstractSensorWrapper.java:62)
	at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:75)
	at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:51)
	at org.sonar.scanner.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:64)
	at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
	at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
	at org.sonar.scanner.scan.ProjectScanContainer.scan(ProjectScanContainer.java:446)
	at org.sonar.scanner.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:442)
	at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:400)
	at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
	at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
	at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:128)
	at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
	at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
	at org.sonar.batch.bootstrapper.Batch.doExecute(Batch.java:58)
	at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:52)
	at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:46)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
	at com.sun.proxy.$Proxy0.execute(Unknown Source)
	at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:189)
	at org.sonarsource.scanner.api.EmbeddedScanner.execute(EmbeddedScanner.java:138)
	at org.sonarsource.scanner.cli.Main.execute(Main.java:112)
	at org.sonarsource.scanner.cli.Main.execute(Main.java:75)
	at org.sonarsource.scanner.cli.Main.main(Main.java:61)
ERROR: 
ERROR: Re-run SonarScanner using the -X switch to enable full debug logging.
✖ SonarCloud analysis failed. (exit code = 1)
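
For completeness, the layout change is just removing the parallel wrapper so the coverage artifact from the test step is downloaded into the workspace of the analysis step that follows it. A sketch (the pipelines/default nesting is illustrative; the step bodies are the ones shown earlier):

pipelines:
  default:
    # Before (broken): parallel siblings don't see each other's artifacts.
    # - parallel:
    #     - step: *cffi-unit-tests-and-coverage
    #     - step: *sonarcloud-for-c
    # After: sequential, so coverage.xml from the first step is present for the second.
    - step: *cffi-unit-tests-and-coverage
    - step: *sonarcloud-for-c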

Okay, I figured it out. It was failing because the SonarCloud-for-C step wasn’t picking up the Python coverage artifact, and the SonarCloud-for-Python step was using the default sonarcloud-scan pipe, which doesn’t work for the C analysis.

But since the C step already downloads and runs sonar-scanner on the command line, and that scanner can in theory handle other languages in the same run, all you have to do is point it at the Python coverage too:

- sonar-scanner -Dsonar.projectKey=bio-firmware-algorithms -Dsonar.organization=bio -Dsonar.exclusions=lib/**/* -Dsonar.cfamily.build-wrapper-output=$BW_OUTPUT -Dsonar.python.coverage.reportPaths=coverage.xml

So no, I do not need a monorepo with multiple projects attached.
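
For anyone who lands here later, the working setup boils down to one command-line scanner step that runs after the Python test step. A sketch assembled from the snippets above (the step name is made up; the omitted script lines are the same scanner/build-wrapper/ceedling setup as in the original C step):

# Sketch: one SonarCloud project, one scanner run covering both languages.
# Must run after the cffi-unit-tests-and-coverage step so its coverage.xml artifact exists here.
- step: &sonarcloud-for-c-and-python
    name: Sonar Cloud Analysis (C + Python)
    script:
      # ... same scanner/build-wrapper download and setup, cmake build under the
      #     build-wrapper, and ceedling gcov:all utils:gcov as in the C step above ...
      - sonar-scanner -Dsonar.projectKey=bio-firmware-algorithms -Dsonar.organization=bio -Dsonar.exclusions=lib/**/* -Dsonar.cfamily.build-wrapper-output=$BW_OUTPUT -Dsonar.python.coverage.reportPaths=coverage.xml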

