How can I get SonarQube to analyse C++ (& C) Header Files?

Good morning,

I am using the commercial SonarQube C++ scanner with build-wrapper-linux-x86-64 to analyse a C++ project. The build usually uses the GNU toolchain (although it also compiles with Clang). Basic scanning seems to work well and reports various issues in the web interface as expected.

What I noticed yesterday is that header files do not always seem to get analysed. I have tried both .h and .hpp extensions to see if that makes a difference. My new C++ header files include a number of C headers, and I expected this to get flagged up.

I am going to try splitting some of the code from one of the files out into a .cpp file to see if that causes any more warnings to be flagged.

I did find one old blog post that mentioned a similar-sounding issue with C++ headers: https://baptiste-wicht.com/posts/2014/10/sonarqube-inspections-for-cpp-projects.html
I do not really want to go through the suggested workaround of listing all the header files by hand if that can be avoided.

Must-share information:

  • which versions are you using (SonarQube, Scanner, Plugin, and any relevant extension): SonarQube Version 7.0 (build 36138); not sure how to check the other parts…
  • what are you trying to achieve: covered above
  • what have you tried so far to achieve this: renamed a few files, double-checked my build is using them

Thomas,

Can you include the exact commands you are using to run your analysis with the build wrapper? It might be worth a peer review; you never know.

You might also want to check your global and project-level sonar.c.file.suffixes and sonar.cpp.file.suffixes settings to make sure they include your header file extensions. By default they should, but it is worth checking.
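For reference, the defaults should look roughly like the lines below. This is a sketch from memory, so do verify the actual values under the language settings in your server's web UI:

```properties
# Expected default suffix settings (verify on your own instance):
sonar.c.file.suffixes=.c,.h
sonar.cpp.file.suffixes=.cc,.cpp,.cxx,.c++,.hh,.hpp,.hxx,.h++
```

If a project-level override has dropped .h or .hpp from these lists, headers with those extensions will not be picked up.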

Colin

Certainly.

I use GitLab CI with several stages. There is a build stage that runs the build with the wrapper:

# Run SonarQube Wrapper Around Standard Debug
build:debian:debug:sonarqube:
  stage: build
  tags:
    - docker
  image: tafthorne/fdev1
  before_script:
    - mkdir sonarqube
    - cd sonarqube
    - wget
        --no-check-certificate
        --no-verbose
        https://<internal-server-URL>/static/cpp/build-wrapper-linux-x86.zip
    - unzip build-wrapper-linux-x86.zip
    - export PATH=$PATH:`pwd`/build-wrapper-linux-x86
    - cd ..
  script:
    - build-wrapper-linux-x86-64 --out-dir build_output make debug
  artifacts:
    name: "${CI_JOB_NAME}_${CI_JOB_STAGE}_${CI_COMMIT_REF_NAME}"
    expire_in: 4 weeks
    paths:
      - "build_output/*"
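One thing worth double-checking in a job like this (an assumption on my part, not something diagnosed above): the build wrapper can only record the compilations it actually observes, so an incremental make that skips already-built files leaves those files out of the analysis input. Forcing a full rebuild avoids that, for example:

```yaml
  script:
    # Force a full rebuild so the wrapper sees every translation unit;
    # files skipped by an incremental build drop out of the analysis.
    - make clean
    - build-wrapper-linux-x86-64 --out-dir build_output make debug
```

In a fresh CI container this is usually already a clean build, but cached workspaces or artifacts can reintroduce the problem.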

After the build stage, comes a test stage that generates coverage reports. These reports are consumed by the scanner stage.

I had two scanner jobs, based on what advice I could find about a C++ project that uses both a long-lived master/development branch and short-lived feature branches. However, I understand that preview mode is no longer recommended, so only the main scanner job is active.

The active scanner job looks like this:

# Run sonar-scanner
test:analysis:sonar-scanner:
  stage: test:analysis
  tags:
    - docker
  image: tafthorne/openjfx-fdev1
  dependencies:
#    - build:cppcheck
    - build:debian:debug:sonarqube
    - test:debian:unit:debug
  before_script:
    - mkdir sonarqube
    - cd sonarqube
    - wget
        --no-verbose
        https://sonarsource.bintray.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-3.1.0.1141-linux.zip
    - unzip sonar-scanner-cli-3.1.0.1141-linux.zip
    - mv `find ./ -maxdepth 1 -type d | tail -1` sonar-scanner
    - export PATH=$PATH:`pwd`/sonar-scanner/bin
    - cd ..
  script:
    - sonar-scanner
        -X
        --define sonar.branch.name=$CI_COMMIT_REF_NAME
        --define sonar.gitlab.commit_sha=$CI_COMMIT_SHA
        --define sonar.gitlab.project_id=$CI_PROJECT_ID
        --define sonar.gitlab.ref_name=$CI_COMMIT_REF_NAME
        --define sonar.gitlab.user_token=$GITLAB_TOKEN
        --define sonar.host.url=$SONARQUBE_URL
        --define sonar.login=$SONARQUBE_TOKEN
        --define sonar.verbose=true

The preview-mode job, which reports issues but does not record results on the server, is not active. In GitLab CI YAML files you put a . at the start of a job name to disable it. For the record, here is what the disabled job looks like:

# Run sonar-scanner in Preview Mode
.test:analysis:sonar-scanner-preview:
  stage: test:analysis
  tags:
    - docker
  image: tafthorne/openjfx-fdev1
  dependencies:
#    - build:cppcheck
    - build:debian:debug:sonarqube
    - test:debian:unit:debug
  before_script:
    - mkdir sonarqube
    - cd sonarqube
    - wget
        --no-verbose
        https://sonarsource.bintray.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-3.1.0.1141-linux.zip
    - unzip sonar-scanner-cli-3.1.0.1141-linux.zip
    - mv `find ./ -maxdepth 1 -type d | tail -1` sonar-scanner
    - export PATH=$PATH:`pwd`/sonar-scanner/bin
    - cd ..
  script:
    - sonar-scanner
        -X
        --define sonar.analysis.mode=preview
        --define sonar.branch.name=$CI_COMMIT_REF_NAME
        --define sonar.gitlab.commit_sha=$CI_COMMIT_SHA
        --define sonar.gitlab.project_id=$CI_PROJECT_ID
        --define sonar.gitlab.ref_name=$CI_COMMIT_REF_NAME
        --define sonar.gitlab.user_token=$GITLAB_TOKEN
        --define sonar.host.url=$SONARQUBE_URL
        --define sonar.login=$SONARQUBE_TOKEN
        --define sonar.verbose=true

I expect the Sonar Project properties stored in my sonar-project.properties file will also be of interest. Here are the parts of the file I can share:

#################################################
# SonarQube Project File for DMCP
#################################################

# must be unique in a given SonarQube instance
sonar.projectKey=DMCP
# this is the name and version displayed in the SonarQube UI. Was mandatory prior to SonarQube 6.1.
sonar.projectName=Data Management and Communications Processor
sonar.projectVersion=0.0.0

# Project Link Data
sonar.projectDescription=Code that will run on the DMCP & DCLTP
sonar.links.homepage=https://<internal-url>
sonar.links.ci=https://<internal-url>
sonar.links.issue=https://<internal-url>
sonar.links.scm=https://<internal-url>

# List of the module identifiers
sonar.modules=\
    ModuleOne,\
    ModuleTwo,\
    IO,\
    Utility

# Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows.
# This property is optional if sonar.modules is set.
sonar.sources=./Sources,./Tests
Utility.sonar.sources=./Sources

# Exclude 3rd Party Code

# path to test source directories (optional)
#sonar.test.inclusions=Tests/

# Encoding of the source code. Default is default system encoding
#sonar.sourceEncoding=UTF-8

# Existing reports
sonar.cfamily.build-wrapper-output=build_output
#sonar.cfamily.cppunit.reportsPath=junit
sonar.cfamily.gcov.reportsPath=.
#sonar.cxx.cppcheck.reportPath=cppcheck-result-1.xml
#sonar.cxx.xunit.reportPath=cpputest_*.xml
sonar.junit.reportPaths=junit

# Plugin Configuration
# sonar-gitlab-plugin
sonar.gitlab.url=https://<internal-url>

I hope that covers most of what people may need to know. I left everything server-side set to defaults, as I would much rather have it documented in the committed files.
A few notes on the above properties and commands:
There are some elements that deal with the GitLab plugin for SonarQube, which is working well, posting notes about smells and errors back to commits and within merge requests.
I have several modules defined, as that seemed to be the way to get code coverage to correctly work out where all the files were. Each module can be compiled separately into a shared library, an executable, or both. Most modules have Sources and Tests directories that contain the sources and tests; the executables are generated from an Application directory that is only sometimes present.
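For what it is worth, a layout like the one described can be expressed with per-module overrides in sonar-project.properties, in the same style as the Utility override above. The module and directory names here are purely illustrative:

```properties
# Hypothetical sketch: a module with an Application directory alongside
# Sources and Tests, overriding the project-wide sonar.sources.
ModuleOne.sonar.sources=./Sources,./Tests,./Application
```

Paths in a module-prefixed property are resolved relative to that module's base directory.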

Hopefully someone here can spot my mistake and get the header file analysis working.

Any feedback or suggestions about the other elements of this C++ SonarQube project setup would also be most welcome. I am quite new to all this and have based what I have on the online documentation I have found. As you can see, I started off using cppcheck and the free sonar.cxx scanner but moved to the cfamily scanner later.

Hello Thomas,

What indicator do you use to detect that headers are not analyzed?

Normally, you should not have to do anything to have your header files analyzed. Headers are not autonomous files; they only make sense in the context of a source file in which they are included, and a source file only makes sense together with its included headers. This leads to two conclusions:

  • What needs to be configured for analysis is source files only. All the (recursively) included files will be analyzed along with the source file.
  • If a header file is not included (even recursively) in any source file, it is effectively dead code, and will not be analyzed.
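The two points above are easy to see with the compiler itself. In this minimal sketch (hypothetical file names), Used.hpp is reachable from Main.cpp while Dead.hpp is included nowhere, so only the former is visible to any tool that follows the compiler:

```shell
# Two headers and one source file (hypothetical names):
mkdir -p header-demo && cd header-demo
cat > Used.hpp <<'EOF'
#pragma once
inline int used_value() { return 42; }
EOF
cat > Dead.hpp <<'EOF'
#pragma once
// Never included by any source file: effectively dead code.
inline int dead_value() { return 0; }
EOF
cat > Main.cpp <<'EOF'
#include "Used.hpp"
#include <cstdio>
int main() { std::printf("%d\n", used_value()); }
EOF
# -MM prints the non-system files this translation unit depends on;
# Used.hpp appears in the output, Dead.hpp does not.
g++ -MM Main.cpp
```

An analyser that piggybacks on the build sees exactly this dependency closure, which is why an uncompiled or never-included header produces no findings.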

Thank you

Hello Loïc,

I have two files that contain full class definitions and declarations. These files were included by a Main.cpp file that was compiled using the wrapper. Neither reported any errors at first. It was only when I split one of them into a .cpp and a .hpp file that any analysis happened, and the .cpp file had some errors marked in it.

I have also seen this with a simple header file that defined some constants and structures that did not obey naming conventions, included C headers instead of their C++ equivalents, and so forth. None of this was flagged within the header file.

Going back to check my code analysis today, there are some smells marked in the header files. The confusing part is that the .cpp file matching the now-split header does not show up, seemingly because it was not pushed to the server even though my application still compiled. Something is obviously very odd in this build system.

My apologies if this turns out to be all down to a poor set of makefiles on my end. Knowing that headers and other code are only evaluated if they are included in a compilation is helpful, though, so thank you for that.