Coverage reports are not generated after sonar scan analysis


Happy New Year!!

I would like to reach out to the community regarding the sonar scan analysis for carma-streets: coverage reports are not generated after analysis. Our project is open source. Could you please help me out here? Thank you.

Please find more details of the Project and Repository.

GitHub repository URL: GitHub - usdot-fhwa-stol/carma-streets: CARMA Streets is a component of CARMA ecosystem, which enables such a coordination among different transportation users. This component provides an interface for CDA participants to interact with the road infrastructure.
Sonar scan properties : carma-streets/ at develop · usdot-fhwa-stol/carma-streets · GitHub
coverage script: carma-streets/ at develop · usdot-fhwa-stol/carma-streets · GitHub

Circle CI build URL:
Output log file:
Sonar scan log file.txt (57.9 KB)


Happy New Year. :smiley:

What you’ve posted seems to be a build log rather than an analysis log. Is your analysis log available?


Hi Ann,

Here is the Sonar scan analysis report:
Sonar scan analysis.txt (3.4 MB).

Thank you,

Hi Sai,

In fact, SonarCloud shows your project having 4.3% coverage, so it is picking up something. That jibes with what I’m seeing in your analysis log, which is a lot of

WARN: File not analysed by Sonar, so ignoring coverage: [file path]

And just a little bit of
DEBUG: 'kafka_clients/test/[file name]' generated metadata as test with charset 'UTF-8'

In fact, your Coverage drilldown only shows a total of 20 files. Does that sound right?


Hey Ann,

I am working with Sai. I think it is 40 files in total; the 20 are just what is considered for new code. We do generate some reported coverage with our sonar scan report. The reason we are concerned is that our line coverage reported by gcovr is substantially higher than that reported by sonar scanner. I do not understand how this can happen, since we pass this report to sonar scanner. An example is our kafka-clients module, which on SonarCloud is reporting 5.5% coverage while the gcovr report is reporting 50% coverage. There are files that the gcovr report recognizes as 80% covered and sonar scanner recognizes as 0%. We are currently passing all the .gcov files to sonar scanner via our properties file. Any ideas why this large discrepancy exists?

Hi @paulbourelly999,

Yes, you’re right. Sorry. Apparently it’s been a longer day than I realized. :roll_eyes:

Can you check your gcovr report and make sure it includes all the files that SonarCloud does? If, for instance, it's only using the 10 files with the most coverage and ignoring 10 without any coverage, that's obviously going to yield a different percentage.


It is including all the same files. Additionally, as I mentioned earlier, some files with >50% coverage from gcovr show 0% in sonar scanner.

                           GCC Code Coverage Report
Directory: .
File                                       Lines    Exec  Cover   Missing
include/kafka_consumer_worker.h               48      14    29%   33,35-36,40,42-43,45-46,48-50,52-53,57,59-60,62-63,66,68-72,88,93-95,101-103,105-107
include/kafka_producer_worker.h               28      21    75%   29,31-33,59-61
src/kafka_client.cpp                          46      25    54%   15,17-19,31,33-35,46,48,50-52,61-62,74-75,78,80-82
src/kafka_consumer_worker.cpp                 92      56    60%   21-22,28-29,36-37,42-43,49-50,55,62-63,73-74,80-81,103-104,135-144,146-148,150,152-154
src/kafka_producer_worker.cpp                 63      44    69%   25-26,32-33,39-40,47-48,58,60,71,75-76,91-92,104-105,148,151
src/main.cpp                                  40       0     0%   3,5-12,14,18-19,21,24,26-27,29,33,35,38,40-46,48,52-53,55,57,59,61,64,67,69-72
TOTAL                                        317     160    50%

Sonar scanner reports both consumer_worker and producer_worker as 0% coverage.

The sonar scan report does seem to include some test directories even though I tried excluding them, but that is not our main issue.


Thanks for your patience with me. Seeing your line-by-line coverage report helped me realize that I'm out of my depth.

I’ve tagged this for more expert attention.


Hello team, are there any other ideas for what we can try to fix this?

Hey Ann,

No one has responded since your last reply. Could you please confirm whether someone is currently looking into this?


Unfortunately, I can only confirm that it’s queued.


Thank you. Any additional information about where we are in the queue, or a timeline until we get a response, would be helpful.

Hi @paulbourelly999 and @SaikrishnaBairamoni,
I’ve had a look at your issue. The problem here is with the gcovr tool.

Let’s consider the kafka_consumer_worker.cpp file. The test coverage for this file is generated by this piece of your script:

cd /home/carma-streets/kafka_clients/build/
ls -a
./kafka_clients_test --gtest_output=xml:../../test_results/
cd /home/carma-streets/kafka_clients/
mkdir coverage
gcovr --exclude=./test --exclude=./build/CMakeFiles -k -r .
mv *.gcov coverage

Here you invoke gcovr to crawl the directory tree, find all *.gcda files generated by kafka_clients_test and generate a coverage summary as well as detailed line-by-line reports dumped as *.gcov files that you move to the coverage directory in the last line.

SonarCloud uses these detailed line-by-line reports (.gcov files) to calculate the test coverage of your code.
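For reference, a .gcov file is plain text with one `count:line_number:source` record per line: a positive count means the line was executed that many times, `#####` means the line is executable but was never hit, and `-` marks non-executable lines. An illustrative excerpt (not taken from your actual report):

```
        -:    0:Source:src/kafka_consumer_worker.cpp
        5:   20:void kafka_consumer_worker::stop()
        5:   21:{
    #####:   22:    _consumer->close();
        -:   23:}
```

A line counts as covered only when its count column is a positive number, which is why a .gcov file full of `#####` markers yields 0% coverage.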

If you take a look at the .gcov file that describes our chosen kafka_consumer_worker.cpp (it’s called #home#carma-streets#kafka_clients#src#kafka_consumer_worker.cpp.gcov), you can observe that no lines are marked as covered. So despite gcovr reporting non-zero coverage, the detailed report does not mark any of the lines as covered. You can view the file by rerunning your CI job with SSH access, or by inspecting it locally if you have a build environment. It is a plain text file.

So SonarCloud correctly displays 0% coverage for the kafka_consumer_worker.cpp file based on the detailed .gcov reports it is provided with.

Now why does gcovr print 60% coverage but produce a contradictory line-by-line breakdown with no lines marked as covered? Let’s run it with the -v flag and filter only the relevant logs:

$ gcovr -v -k -r . 2>/dev/null | grep Running | grep consumer_worker.cpp
Running gcov: 'gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/test/test_kafka_consumer_worker.cpp.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/test' in '/tmp/tmph9itq6wr'
Running gcov: 'gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_lib.dir/src/kafka_consumer_worker.cpp.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_lib.dir/src' in '/tmp/tmph9itq6wr'
Running gcov: 'gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/src/kafka_consumer_worker.cpp.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/src' in '/tmp/tmph9itq6wr'
Running gcov: 'gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients.dir/src/kafka_consumer_worker.cpp.gcno --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients.dir/src' in '/tmp/tmph9itq6wr'

There are actually 4 individual runs of gcov invoked by the gcovr script that generate the #home#carma-streets#kafka_clients#src#kafka_consumer_worker.cpp.gcov report.

Running all of them individually will show you that only one of these runs reports a significant percentage of kafka_consumer_worker.cpp lines executed:

$ gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/src/kafka_consumer_worker.cpp.gcda --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients_test.dir/src
 File '/home/carma-streets/kafka_clients/src/kafka_consumer_worker.cpp'
Lines executed:60.87% of 92
Branches executed:67.63% of 173
Taken at least once:32.95% of 173
Calls executed:43.17% of 183
Creating '#home#carma-streets#kafka_clients#src#kafka_consumer_worker.cpp.gcov'

Another finding is that each subsequent run overwrites the same .gcov file. Unfortunately, the last run of gcov did not report any executed lines of kafka_consumer_worker.cpp, so the corresponding .gcov file came out empty:

$ gcov /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients.dir/src/kafka_consumer_worker.cpp.gcno --branch-counts --branch-probabilities --preserve-paths --object-directory /home/carma-streets/kafka_clients/build/CMakeFiles/kafka_clients.dir/src
File '/home/carma-streets/kafka_clients/src/kafka_consumer_worker.cpp'
Lines executed:0.00% of 92
Branches executed:0.00% of 173
Taken at least once:0.00% of 173
Calls executed:0.00% of 183
Creating '#home#carma-streets#kafka_clients#src#kafka_consumer_worker.cpp.gcov'

So, while gcovr correctly aggregates the execution counts from the multiple runs in its summary table, it carelessly overwrites the detailed .gcov reports without merging them, thus misinforming SonarCloud about the execution statistics.
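The failure mode is plain last-writer-wins on a fixed file name: gcov derives the output name from the source file, so every run touching the same source writes to the same .gcov path. A minimal shell sketch (no gcov involved; the file name and messages are illustrative):

```shell
#!/bin/sh
# Every gcov run over kafka_consumer_worker.cpp data writes to the same
# derived output name, so later runs clobber earlier ones.
out="kafka_consumer_worker.cpp.gcov"
echo "Lines executed:60.87% of 92" > "$out"  # run over the exercised .gcda
echo "Lines executed:0.00% of 92" > "$out"   # later run over an unexercised object dir
cat "$out"                                   # only the last write survives
```

Which run happens to come last is an accident of directory traversal order, which is why the same setup can look fine in one module and empty in another.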

To be able to see correct coverage numbers on SonarCloud, you need to make sure the corresponding .gcov files (#home#carma-streets#kafka_clients#src#kafka_consumer_worker.cpp.gcov in case of kafka_consumer_worker.cpp) accurately represent the execution statistics.

Let me know if you have further questions.

Hello Arseniy,

Thank you so much for your answer. I do have some additional questions. You mention that gcovr does not correctly aggregate the coverage from multiple .gcda files and instead overwrites the results of each. In the example you showed, there are 3 .gcda files along with a .gcno. The first one looks like it is in reference to coverage for the test .cpp file, so that directory can probably be excluded. The second looks like it is related to the C++ library we create, so maybe that can also be excluded. The last one is a .gcno file. It seems unlikely that there is anything particularly unique about our C++ project or code coverage setup, so if what you are saying is true, this would affect almost any C++ project attempting to use gcovr code coverage results. Are you saying the solution to this issue would be to run gcovr against only the one .gcda file and try to exclude the rest? Would this have an impact on the other static metrics that sonar scanner reports?

It looks like you are using gcovr version 3.4 which was released over 4 years ago. Can you upgrade to a more recent version?

If you upgrade to v4.2 or v5.0 you will be able to also try the gcovr --sonarqube output that produces a generic XML coverage report consumable through the sonar.coverageReportPaths config setting.

Alternatively, since you are already using CMake you might find it easier to skip the gcovr step and gather the coverage information using a custom CMake target. See an example.
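A rough sketch of such a CMake setup (the flag handling and target name here are assumptions for illustration, not your actual build files):

```cmake
# Compile and link with gcov instrumentation (GCC/Clang).
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} --coverage")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} --coverage")

# After running the tests, produce a SonarQube-format report in one step.
add_custom_target(coverage
    COMMAND gcovr --sonarqube ${CMAKE_BINARY_DIR}/coverage.xml -r ${CMAKE_SOURCE_DIR}
    WORKING_DIRECTORY ${CMAKE_BINARY_DIR}
    COMMENT "Generating SonarQube coverage report")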

As for your questions:

  1. Unfortunately, I cannot tell you for sure, but it does seem that there are multiple ways to configure coverage reporting and multiple challenges, so your project might well be experiencing one of them.
  2. What other metrics specifically are you referring to? Off the top of my head, changing the source of the coverage statistics affects only that: the coverage statistics. Depending on your SonarCloud configuration, it might also affect the quality gate status.

Hey again, I think we figured out how to upgrade to gcovr 5.0. Is there a special file extension we need for sonar-scanner to pick up the reports? Are there any sonar properties edits we need to make to indicate the file type to look for?

That’s great news!
With gcovr 5 you have two options:

  • use the -k flag as you’ve been doing so far and point sonar.cfamily.gcov.reportsPath to the folder containing the .gcov files, or
  • use the --sonarqube parameter and then point sonar.coverageReportPaths to the output folder containing the generic XML coverage reports in the SonarQube format.

The sonar.cfamily.gcov.reportsPath and sonar.coverageReportPaths properties are mentioned in the SonarCloud docs.
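In sonar-project.properties terms, the two options look roughly like this (the paths are illustrative, mirroring one of your per-module coverage directories):

```
# Option 1: point the CFamily gcov sensor at the folder of .gcov files
sonar.cfamily.gcov.reportsPath=kafka_clients/coverage

# Option 2: feed the generic-format XML produced by `gcovr --sonarqube`
sonar.coverageReportPaths=kafka_clients/coverage/coverage.xml
```

Note that sonar.coverageReportPaths accepts a comma-separated list, so multiple module reports can be passed in one property.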

Please, let me know if any of these options worked for you.

So I have tried with the global sonar.coverageReportPaths property and I get the following log:

14:10:31.591 INFO: Parsing /home/carma-streets/kafka_clients/coverage/coverage.xml
14:10:31.643 INFO: Imported coverage data for 0 files
14:10:31.646 INFO: Coverage data ignored for 12 unknown files, including:
14:10:31.646 INFO: Parsing /home/carma-streets/message_services/coverage/coverage.xml
14:10:31.656 INFO: Imported coverage data for 0 files
14:10:31.657 INFO: Coverage data ignored for 31 unknown files, including:
14:10:31.657 INFO: Parsing /home/carma-streets/intersection_model/coverage/coverage.xml
14:10:31.663 INFO: Imported coverage data for 0 files
14:10:31.663 INFO: Coverage data ignored for 22 unknown files, including:
14:10:31.663 INFO: Parsing /home/carma-streets/scheduling_service/coverage/coverage.xml
14:10:31.671 INFO: Imported coverage data for 0 files
14:10:31.671 INFO: Coverage data ignored for 16 unknown files, including:
14:10:31.671 INFO: Parsing /home/carma-streets/streets_utils/streets_service_base/coverage/coverage.xml
14:10:31.673 INFO: Imported coverage data for 0 files
14:10:31.673 INFO: Coverage data ignored for 13 unknown files, including:
14:10:31.673 INFO: Sensor Generic Coverage Report (done) | time=82ms

I am not sure if this is because our sonar project is divided into modules and it doesn’t know how to map these coverages back to the module-specific source files (carma-streets/ at 115-streets-service-base-lib · usdot-fhwa-stol/carma-streets · GitHub). I also tried to use a module-specific <module-name>.sonar.coverageReportPaths, but that did not work at all. Is there any way I could get sonar scanner to map coverage back to the sources defined by each module in my properties file? Thanks so much for the help so far, by the way.

Also, are you saying that the issue we have with gcovr 3.4 (overwriting .gcov files to zero coverage) should not be an issue in 5.0?