Hello! Is it possible to upload the coverage report from llvm-cov to SonarQube without using the build wrapper? I’m trying to use it with Bazel in Docker, but unfortunately, when I set the flags I found are needed (--spawn_strategy=standalone --genrule_strategy=standalone), the build fails right after configuration. A possible reason is a warning from Bazel about ignoring LD_PRELOAD in the environment, which the current Bazel version seems to emit whenever LD_PRELOAD is set.
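For reference, I’m setting those flags roughly like this (a .bazelrc sketch, illustrative rather than my exact setup):

```
# .bazelrc fragment (illustrative): force local, unsandboxed execution so an
# external tool such as the build wrapper can observe compiler invocations.
build --spawn_strategy=standalone
build --genrule_strategy=standalone
```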
The coverage.txt format is along the lines of:
/company/aggregation/include/aggregation/aggregation.h:
1| |#pragma once
2| |
3| |#include "aggregation/device_context.h"
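As a side note, a report in this layout can in principle be translated into SonarQube’s generic coverage XML (importable via the sonar.coverageReportPaths property). A rough Python sketch, assuming the llvm-cov text layout shown above; everything beyond that layout is illustrative:

```python
import re

# An llvm-cov source line looks like "  12|   3|code" (line | hit count | text);
# an empty count column means the line is not instrumented. A file section is
# introduced by a bare "path:" line, as in the snippet above.
LINE_RE = re.compile(r"^\s*(\d+)\|\s*(\d*)\|")

def convert(report_lines):
    """Convert llvm-cov text report lines to SonarQube generic coverage XML."""
    out = ['<coverage version="1">']
    current = None
    for raw in report_lines:
        if raw.endswith(":") and "|" not in raw:
            if current is not None:
                out.append("  </file>")
            current = raw[:-1]
            out.append('  <file path="%s">' % current)
            continue
        m = LINE_RE.match(raw)
        if m and m.group(2) and current is not None:
            covered = "true" if int(m.group(2)) > 0 else "false"
            out.append('    <lineToCover lineNumber="%s" covered="%s"/>'
                       % (m.group(1), covered))
    if current is not None:
        out.append("  </file>")
    out.append("</coverage>")
    return "\n".join(out)
```

Uninstrumented lines (empty count column) are skipped rather than reported as uncovered, which matches how the header lines in the snippet above should be treated.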
There are some non-absolute paths, though; I’m not sure whether that could be a problem:
Hi @mpaladin. There’s no meaningful error, just that the build failed. The Bazel version I use is 1.2.1. I also tried converting the compile_commands.json with the Python script provided here: https://jira.sonarsource.com/browse/CPP-1428 . Unfortunately, Sonar Scanner still gets lost with the includes.
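For what it’s worth, the kind of extraction I’d expect such a conversion to involve can be sketched like this (illustrative, not the actual CPP-1428 script): pulling the -I include directories out of a standard Clang compilation database so they can be handed to the scanner.

```python
import json
import shlex

def include_dirs(db_path):
    """Collect -I include directories from a compile_commands.json file.

    Assumes the standard Clang compilation-database format, where each entry
    has either an "arguments" list or a "command" string.
    """
    with open(db_path) as fh:
        entries = json.load(fh)
    dirs = []
    for entry in entries:
        args = entry.get("arguments") or shlex.split(entry["command"])
        for i, arg in enumerate(args):
            if arg == "-I" and i + 1 < len(args):
                dirs.append(args[i + 1])        # separate form: -I <dir>
            elif arg.startswith("-I") and len(arg) > 2:
                dirs.append(arg[2:])            # joined form: -I<dir>
    # Preserve first-seen order, drop duplicates.
    seen = set()
    return [d for d in dirs if not (d in seen or seen.add(d))]
```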
@mpaladin I switched to a more verbose log. It seems the build fails on one of our rules that builds librdkafka. Everything is fine without --spawn_strategy=standalone --genrule_strategy=standalone.
ERROR: /tmp/dev/.cache/bazel/_bazel_nexthink/45687a9fc51e3180e3ee6f957bfa2b6a/external/librdkafka/BUILD.bazel:16:1: Executing genrule @librdkafka//:generate-config failed (Exit 1)
cp: cannot create regular file '../../bazel-out/k8-opt/bin/external/librdkafka/config.h': No such file or directory
What do you mean by “Sonar Scanner is still lost with the includes”? What is the symptom?
We don’t support C++20; the analyzer will fail when trying to analyze a file with -std=c++20.
This may be a configuration issue in your build after setting --spawn_strategy=standalone --genrule_strategy=standalone.
I can see that you have a ${DOCKER_SONARQUBE_DIR} environment variable; are you trying to use the docker sonar-scanner image? For your information, it doesn’t support C/C++ analysis: the analysis needs to run in the same environment as the build to properly resolve include files. This may explain why you see something strange when converting the compile_commands.json file. You should use the normal sonar-scanner instead.
in Sonar-Scanner debug log there are issues with includes:
...
08:16:06.781 DEBUG: [/company/aggregation/include/aggregation/module/application_context.h:10]: cannot find include file '#include "somelib/Include/KAssert.h"'
08:16:06.783 DEBUG: [/company/aggregation/include/aggregation/module/application_context.h:11]: cannot find include file '#include "somelib/somehelpers.h"'
08:16:06.784 DEBUG: [/company/aggregation/include/aggregation/module/application_context.h:12]: cannot find include file '#include "some-proto/proto/events.pb.h"'
...
and at the end
08:16:11.583 INFO: Executing post-job 'Final report'
08:16:11.583 WARN: Preprocessor: 874 include directive error(s). This is only relevant if parser creates syntax errors. The preprocessor searches for include files in the with 'sonar.cxx.includeDirectories' defined directories and order.
08:16:11.583 WARN: Source code parser: 182 syntax error(s) detected. Syntax errors could cause invalid software metric values. Root cause are typically missing includes, missing macros or compiler specific extensions.
08:16:11.585 INFO: Analysis total time: 12.373 s
08:16:11.587 INFO: ------------------------------------------------------------------------
08:16:11.587 INFO: EXECUTION SUCCESS
08:16:11.588 INFO: ------------------------------------------------------------------------
08:16:11.588 INFO: Total time: 54.659s
08:16:11.639 INFO: Final Memory: 9M/40M
08:16:11.639 INFO: ------------------------------------------------------------------------
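In case it matters, the property named in that warning would be set along these lines (a hypothetical sonar-project.properties fragment; the key names come from the log above and the paths are placeholders, not my real layout):

```
# sonar-project.properties fragment (illustrative)
sonar.projectKey=aggregation
sonar.sources=.
# Comma-separated list of directories searched when resolving #include directives:
sonar.cxx.includeDirectories=include,bazel-bin/external/somelib/include
```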
Will it fail only when encountering a C++20 feature, or on every file that was compiled with this flag? More importantly for my particular case: will I still be able to get the coverage % in SonarQube, or is it a lost cause until SonarQube supports C++20?
It’s a regular sonar-scanner, not the image. The entire process (both the Bazel build and the scanning) runs inside a development container, though, so I guess that should be fine?
Yes, they are available after the compilation, though scattered in various places around the container. But the Python script output should handle that, I guess? Some lines of build-wrapper-dump.json: