C++ and cmake: Running sonar scanner on compile_commands.json


We’re using both sonar scanner and cppcheck on our C++ code. Sonar scanner requires the build-wrapper to wrap the build and listen to compiler commands, whereas cppcheck can run from a compile_commands.json, which CMake can generate without doing a full build. This means we can run cppcheck in parallel with the build, while sonar scanner must wait for the build to complete.
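For context, this is roughly how the cppcheck side works today (directory names are illustrative, and the CMake flag and cppcheck option are as documented by those tools):

```shell
# Ask CMake to emit compile_commands.json into the build directory;
# the file is generated at configure time, no compilation needed.
cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON

# cppcheck can consume the compilation database directly, so it can
# run while the actual build is still in progress.
cppcheck --project=build/compile_commands.json
```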

We’re now wondering whether sonar scanner could support reading compile_commands.json instead of having to wait for the build to complete. This would bring it on par with cppcheck in this respect.

Details on compile_commands.json can be found here:

All help much appreciated.

Hi @tern-nils,

Running sonar scanner on a compilation database is something we are considering.
We have a ticket for that:
As of today I cannot give you any timeline for this, but at least you can follow the progress.

In the meantime, the only thing I can suggest to minimize the overhead is to use ccache, which can significantly speed up your build.

I hope this helps.

Hi @Geoffray,

Thanks for the response. We would be very happy to see this happen: it would let us parallelize our Jenkins pipelines and reduce build times quite a bit. We already use the cache, fortunately, but not being able to parallelize is a real weakness of SonarQube compared to other tools we’ve used.



Am I right to deduce from ticket https://jira.sonarsource.com/browse/CPP-1428 that the “build-wrapper” will no longer need to rebuild the C++ code if we use the proposed “workaround”:
a Python script which translates the compilation database (i.e. the compile_commands.json file) into the required compilation properties (compiler flags/arguments/include locations)?
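For illustration only, here is a minimal sketch of what such a translation could look like. The output field names ("captures", "cwd", "cmd", "executable") are assumptions about the general shape of build-wrapper output, not the real format; the authoritative converter is the script attached to ticket CPP-1428:

```python
import json
import shlex


def convert(compile_commands):
    """Translate compile_commands.json entries into a build-wrapper-style
    capture list. NOTE: the output field names are assumptions; the
    authoritative converter is the script attached to CPP-1428."""
    captures = []
    for entry in compile_commands:
        # Each entry carries either a single "command" string or an
        # "arguments" list; normalize both to an argv-style token list.
        if "arguments" in entry:
            cmd = list(entry["arguments"])
        else:
            cmd = shlex.split(entry["command"])
        captures.append({
            "cwd": entry["directory"],   # where the compiler was invoked
            "executable": cmd[0],        # the compiler binary
            "cmd": cmd,                  # full command line, tokenized
        })
    return {"version": 0, "captures": captures}


if __name__ == "__main__":
    sample = [{
        "directory": "/work/build",
        "command": "g++ -Iinclude -c ../src/main.cpp -o main.o",
        "file": "../src/main.cpp",
    }]
    print(json.dumps(convert(sample), indent=2))
```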

It would be such a time-saver for our teams:
Our teams of C++ developers (using VSCode and CMake) would be alerted about their bugs shortly after each commit, instead of the next morning!

It would greatly increase the added value of SonarQube in the eyes of both the developers and the managers.


Hi @cyril.og!

You are indeed right. If you can accurately generate the compilation database, convert it and feed it to the analysis, you do not need to build the project inside the build-wrapper.
But please do note that this is a workaround provided to help with non-working cases. It is not thoroughly tested and it is not officially supported.
If an official support comes up, the ticket will be updated accordingly.




Thanks @Geoffray, this is great news for us: having to recompile our whole project under the wrapper really diminishes the benefits of the Sonar C++ scanner.

Our project is huge, so it takes a very long time to compile.
We therefore do incremental builds during the day, and full rebuilds only at night.
=> C/C++ developers have to wait for the next morning to know whether the Sonar scanner detected issues in their code. If it did, they must redo the testing phase :frowning: and then wait another day for the next scan :frowning:

To be honest, we are currently investigating C++ static analyzers other than Sonar, so as to provide feedback to our C++ developers in real time instead of each morning.

But now there is hope we can solve that issue using your “not-officially-supported workaround”: it would be a great improvement for us, not just a “workaround” :wink:


Hi @cyril.og

Thanks for providing these additional details. We really value this information.
2 points:

  • As I previously mentioned in the thread, did you consider using ccache? It lets you perform a full build at a speed close to that of an incremental build, and it works well with our analysis. Additionally, even without ccache, our analysis cache (described there) would be really valuable in your case.
  • For official support of compilation databases, stay tuned and watch the ticket. I cannot make any firm promise yet, but things are moving.
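For anyone following along, hooking ccache into a CMake build is typically a one-line change. The launcher variables below are standard CMake options; the build directory name is illustrative:

```shell
# Route every compiler invocation through ccache so unchanged
# translation units are served from the cache on full rebuilds.
cmake -S . -B build \
  -DCMAKE_C_COMPILER_LAUNCHER=ccache \
  -DCMAKE_CXX_COMPILER_LAUNCHER=ccache
cmake --build build

# Inspect hit rates to confirm the cache is actually being used.
ccache --show-stats
```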

Hi @Geoffray

We are already using ccache, and I agree it helps a lot. But it’s not enough.

I asked for your analysis cache to be enabled as soon as I noticed the warnings about it in the SonarQube reports, and those warnings have since disappeared.

Could you tell me whether there has been any decision on your side to add official support for compilation databases in 2021, or to postpone it until 2022 (or later)?

Depending on the answer, we may postpone a switch from SonarQube to Coverity at company level.

Hi @cyril.og

Just to be sure we are on the same page here:

  • In many situations you should be able to configure the analysis cache so that only the code that has changed since the previous analysis is analyzed again. It can bring a massive boost to analysis speed when properly configured. If misconfigured, there is no warning: you just get cache misses and little or no speed improvement.
  • ccache, when properly configured, will compile each file only once and simply copy the results on subsequent builds. This cuts the biggest share of the build, so the additional potential gain from compilation database support is most likely much smaller. I am therefore not sure what you mean by “it is not enough”. Are there additional steps in your build that are very heavy?
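For reference, enabling the CFamily analysis cache is driven by analysis properties along these lines (property names as documented for SonarCFamily at the time; the cache path is illustrative):

```properties
# sonar-project.properties
sonar.cfamily.cache.enabled=true
sonar.cfamily.cache.path=.sonar/cfamily-cache
```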

From there, I am glad to tell you that we have decided to add support for compilation databases. The implementation work is about to start. I cannot commit to an ETA, but the ticket will be updated, so please follow it.



Hi @Geoffray,

We are on the same page:

  • I assume the analysis cache is properly configured, since there are mostly cache hits in the analysis logs.
  • ccache greatly reduces compilation time when we rebuild everything.
  • There are indeed additional heavy steps in our build, unrelated to compilation or the sonar scanner, that currently prevent us from running a full rebuild at will. That’s why not requiring a full rebuild would help us a lot.

I am glad for the good news, and I fully understand that you cannot commit to an ETA, but maybe you could tell us whether we can hope for support within 3, 6 or 12 months? :slight_smile:

Another question:
We also have some 32-bit C/C++ projects which were using SonarScanner but no longer can, because the latest versions of the build wrapper no longer run on 32-bit OSes (please correct me if I am wrong).
I assume that cross-compiling those projects from a 64-bit OS would solve that issue, but it is not a light and cheap move.
When the compile_commands.json compilation database is supported, will it also solve that issue, since it replaces the build wrapper?

Hi @Geoffray ,

Thanks for the update. This is a very desirable improvement.


Yes, compiling 32-bit code on a 64-bit platform usually works pretty well (in some cases, better than compiling it on the 32-bit platform). When we removed support for the 32-bit build wrapper, we assumed that almost all build machines are 64-bit nowadays.

Using a compilation database would remove the need for the build wrapper, so it would solve that issue. However, the code still needs to be analyzed, and the analysis itself is 64-bit only too. We also highly recommend analyzing on the build machine, because that is the only way to be sure all required files are present in the right place during analysis*. So even with a compilation database, building the code in a 64-bit environment would have some value.

(*) If you analyze on a different machine and make sure everything is identical, then you can still use an old 32-bit version of the build wrapper with a modern version of the analyzer; the output format is compatible.


Hi @JolyLoic,
Thanks a lot for these :sparkles: precious :sparkles: hints. :smiley:

About the last one :

If we check out the source code at the same absolute path on the two machines (the 32-bit one running the build-wrapper, and the 64-bit one running the analyzer), is that enough to “make sure everything is identical”?

Necessary, but not sufficient :slight_smile:
System headers have to be located in the same place on both machines, as do the headers of all external libraries that are directly or indirectly #included in your code.
In other words, if the 64-bit machine is correctly configured for analysis, it is probably also correctly configured for building, and you might as well build your code on it :slight_smile:


Could it be that the newest sonar-scanner version 4.6 is incompatible with the build-wrapper.json generated by this Python script?

We have the same need to use compile_commands.json so that analysis can be decoupled from the build. When I tried the Python tool to convert the compile commands file to build-wrapper-dump.json, it seemed to work correctly, but then the scanner complained for every .cpp file:

WARN: Invalid probe found, skip analysis of files: [/work/HomeDirectories/…/src/…/File.cpp]
The compiler probe ‘stdout’ is expected to contain at least one ‘#define’ directive:

And finally the scanner quit with the error:
ERROR: Error during SonarScanner execution
java.lang.IllegalStateException: The “build-wrapper-dump.json” file was found but 0 C/C++/Objective-C files were analyzed. Please make sure that:

  • you are using the latest version of the build-wrapper and the SonarCFamily analyzer
  • you are correctly invoking the scanner with correct configuration
  • your compiler is supported
  • you are wrapping your build correctly
  • you are wrapping a full/clean build
  • you are providing the path to the correct build-wrapper output directory
  • you are building and analyzing the same source checkout, absolute paths must be identical in build and analysis steps
    at com.sonar.cpp.plugin.CFamilySensor.execute(CFamilySensor.java:357)
    at org.sonar.scanner.sensor.AbstractSensorWrapper.analyse(AbstractSensorWrapper.java:48)

Looks like the issue was resolved a month ago:

I think we only need a new scanner? No need for a new server or a new wrapper?

Is it already released? It looks like the latest version was created a few days before:

When can we expect a new version supporting compile_commands.json, and what’s the property for this?

Hello @KUGA2,

Yes, the issue is resolved! Check the announcement.

No, it is not related to the scanner, and you won’t be needing the wrapper if you are using a compilation database.

All this is already available on SonarCloud.io and will be available with SonarQube 9.0 starting from Developer Edition.

If you are using SQ, you will have to update to 9.0 (planned to be released at the beginning of July).
The instructions on how to use it should be available in the documentation once you update your SQ.
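For readers landing here later: with compilation database support, pointing the scanner at the database is done via an analysis property along these lines (property name as documented for SonarCFamily; the path is illustrative, and details may vary by version, so check your documentation):

```properties
# sonar-project.properties
sonar.cfamily.compile-commands=build/compile_commands.json
```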



Hi @JolyLoic and @Abbas_Sabra,
it is great news that SonarQube 9.0 now officially supports compile_commands.json!

But like many large companies, mine only deploys the LTS versions of SonarQube, after a few months of testing. Thus we are currently using SonarQube 7.9 LTS, and we will soon switch to the latest LTS: SonarQube 8.9 LTS, released in May 2021.

Since you release an LTS every 18 months, I do not expect to get the benefits of SonarQube 9.0 before 2023 :sob:

Meanwhile, will I still be able to use the “not-officially-supported” Python script from https://jira.sonarsource.com/browse/CPP-1428 to generate a build-wrapper.json file from the compile_commands.json?

Or will it no longer work with SonarQube 7.9 LTS and SonarQube 8.9 LTS?

Hi @cyril.og,

I don’t see any reason for that script to stop working.


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.