Missing a temporary "subprocess" executable in a C++ Sonar scan

We are using Sonar in a C++/Go project and are trying to plug the C++ coverage and analysis into GitLab CI.

During the scan we get the following error:

java.lang.IllegalStateException: java.io.IOException: Cannot run program "/builds/ourproject/.scannerwork/.sonartmp/11355457993123622329/subprocess" (in directory "/builds/ourproject/build/src/A/CODE/FOLDER"): error=2, No such file or directory

We are using a sonar-project.properties configuration file like





We use gcovr to produce the build/gcov.xml file, after an lcov coverage run on the project.
The build-wrapper executable, downloaded as a zip from our SonarQube server, is used to produce the build/build_wrapper_output_directory folder in a separate CI job.

linter.out also comes from a separate job and is produced by the revive linter on our Go code.
cover.out comes from a go test coverage command.
We are fairly confident the Go integration works well and is outside the scope of this problem.

We are rather new to the subject, so our integration may be the issue.

Sonar version: SonarQube Developer Edition Version 8.9.1 (build 44547)

error log extract: errorlog.txt (5.0 KB)

Thanks for any help!


Hello @FabAB

sonar.cxx and sonar.cfamily are two different analyzers that don’t play well together. The official one is sonar.cfamily. You can use sonar.cfamily.gcov.reportsPath to point at the *.gcov report.
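In sonar-project.properties that switch would look roughly like this (the build-wrapper output path is the one mentioned earlier in the thread; the gcov report path is illustrative, not taken from the original configuration):

```properties
# official CFamily analyzer properties
sonar.cfamily.build-wrapper-output=build/build_wrapper_output_directory
# directory containing the *.gcov report(s) -- path is an example
sonar.cfamily.gcov.reportsPath=build/gcov-reports
```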

This error means that the directory /builds/ourproject/build/src/A/CODE/FOLDER exists during the build but not during the analysis. It is a requirement that the build and the analysis happen in the same environment. Let me know if that makes sense.


Sorry for the late answer, I was on holiday. Thanks for your reply. It makes sense: the scan is done on an artifact retrieved from the build job, which only stores the build-wrapper-output folder, but I understand from your reply that other files are created as well.

First, I replaced the “cxx.coverage” property with the official “cfamily.gcov” one, thanks for the info.

Secondly, I tried to find a folder named .scannerwork during the compilation phase (which is done using the build wrapper), but no folder of that name seems to be present.
I also tried copying the full build folder from the compilation phase into the artifacts retrieved by the sonar-scan job; that did not work either (same error).

As we run the sonar scan using a different image (sonarsource/sonar-scanner-cli), how can we be strictly in the same environment? Apart from the build folder (the one for cmake+make), what is needed?


I ran into this problem when trying to use different Docker images for the build vs. the scan. I found that they need to share the same base image, that the compiler (gcc) needs to be present on the scanner image, and that similar header files are needed for the gcovr step, which I found is the only way of getting coverage working.

More info about my analysis at Support sonarcloud.io · Issue #6 · lpenz/ghaction-cmake · GitHub

Thanks a lot for this information, this may help with my issue!

Hello @jayvdb,

You are right, this is covered in the documentation:

  • The Build Wrapper collects information about the build including absolute file paths (source files, standard headers, libraries, etc…). Later on, SonarScanner uses this information and needs to access those paths. Whereas this is straightforward while running these 2 steps on the same host, it is worth some consideration when using any sort of containerization. A consequence of this is that C / C++ / Objective-C analysis is NOT supported by SonarScanner CLI Docker image.

Link to the full Doc


@Abbas_Sabra , could you clarify whether the SonarScanner needs the compiler executables to be part of the scanner image?

Which build-wrapper output file contains these absolute file paths, and what format is it in, so that we can inspect it to answer this question and others?

Yes, the “scanner image” should be identical to the image that was used to build the code, right after the build was done:

  • During analysis, even though we are not compiling any file, we still execute the compilers to gather information about their configuration.
  • Many build processes generate source code that is required to understand the rest of the code.
  • All paths have to be identical and point to the same files, including those of 3rd-party libraries.

So I put “scanner image” in quotes because, in fact, there should not be a “scanner image”: there should be a single image used both for the build and for the scan.
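As a sketch of what that means in GitLab CI terms (the image name, build commands, and stage names below are assumptions for illustration, not taken from the thread), both jobs would run on one image and share the build tree through artifacts:

```yaml
# one image for both build and analysis, so all absolute paths resolve the same way
image: our-registry/cpp-build:latest   # hypothetical single build+scan image

stages: [build, analyze]

build:
  stage: build
  script:
    - cmake -S . -B build
    # capture every compiler invocation for the CFamily analyzer
    - build-wrapper-linux-x86-64 --out-dir build/build_wrapper_output_directory cmake --build build
  artifacts:
    paths:
      - build/

analyze:
  stage: analyze
  script:
    # same image as the build job; the build tree is restored from artifacts
    - sonar-scanner
```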

Hope this helps,

I am trying to determine the details of what is needed, because I do have different images for build vs. scan. I am a paying customer and I must have them as separate stages. And I have that part working, so telling me it doesn’t work or isn’t allowed isn’t helpful.

That said, it is helpful to know that the compiler needs to be consistent between both stages; you’ve explained why, and it is reasonable.

I can also understand why all the 3rd party libraries might be needed. I haven’t encountered this, but maybe my scanner results are less comprehensive as a result. I’ll investigate that a bit.
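One quick way to investigate that is a small script like the sketch below. It assumes the build wrapper writes its capture to a JSON file (commonly named build-wrapper-dump.json) with a top-level "captures" array whose entries carry "cwd" and "executable" fields; the file name and schema are assumptions here, so inspect your own dump first. The script lists absolute paths recorded in the dump that do not exist in the environment where it runs:

```python
import json
import os
import sys


def missing_paths(dump_path):
    """Return absolute paths recorded in a build-wrapper dump that do not
    exist in the current (scan) environment."""
    with open(dump_path) as fh:
        dump = json.load(fh)
    paths = set()
    for capture in dump.get("captures", []):
        for key in ("cwd", "executable"):
            value = capture.get(key, "")
            # keep only absolute paths, as those are what the scanner resolves
            if value and os.path.isabs(value):
                paths.add(value)
    return sorted(p for p in paths if not os.path.exists(p))


if __name__ == "__main__":
    dump_file = sys.argv[1] if len(sys.argv) > 1 else "build-wrapper-dump.json"
    for path in missing_paths(dump_file):
        print("missing:", path)
```

Running it inside the scan container on the dump produced during the build would surface any build-time path that the scanner cannot reach.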

Hello @jayvdb,

We recommend using the same image because we need to access all the files that were accessed during the build, and we do that via absolute paths: standard libraries, generated files, compilers.

I understand that you are using different images and that it is working; it works if you craft it carefully. We just don’t recommend it because it is harder to maintain. For example, different compiler versions between the images might go unnoticed and silently impact the quality of the analysis.
Also, we don’t guarantee that we will not need to access more things after an update, so it is less future-proof.

Anyway, if you decide to go with different images, please run the scanner in verbose mode and make sure there are no warnings about the analysis environment, such as files not being found. That will ensure the quality of the analysis is not impacted by the use of different images.
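For reference, verbose mode can be enabled either by passing -X on the sonar-scanner command line or via the standard analysis property:

```properties
# enable debug output for the analysis
sonar.verbose=true
```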



This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.