PR analysis always scans every file in compile_commands.json -- due to cache miss

We want PR builds in our Jenkins-based CI to scan only the files changed in the pull request. I implemented this to some degree by following the Pull Request analysis docs: it notices changed files, such as a .cpp source file changed in my feature branch, and scans them. My problem is that even when we don't change any source files, it still scans every file it finds in the compile_commands.json. Our code is C++, and as I understand it, SonarQube needs to read that compile_commands.json to do a proper analysis.

This is the code I used to initiate the PR analysis; from what I can see it works up until it reads the compile_commands.json.

            -Dsonar.pullrequest.key=${env.CHANGE_ID}
            -Dsonar.pullrequest.branch=${env.CHANGE_BRANCH}
            -Dsonar.pullrequest.base=${env.CHANGE_TARGET}
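
For context, this is roughly how that call sits in our multibranch Jenkinsfile. This is only a sketch: the stage name and the withSonarQubeEnv configuration name ('MySonarServer') are placeholders, not our exact script.

    // Sketch only: stage name and 'MySonarServer' (the SonarQube server
    // configuration in Jenkins) are placeholders.
    stage('SonarQube PR analysis') {
        when { changeRequest() }                 // run only for pull request builds
        steps {
            withSonarQubeEnv('MySonarServer') {
                sh """
                    sonar-scanner \
                        -Dsonar.pullrequest.key=${env.CHANGE_ID} \
                        -Dsonar.pullrequest.branch=${env.CHANGE_BRANCH} \
                        -Dsonar.pullrequest.base=${env.CHANGE_TARGET}
                """
            }
        }
    }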

The debug output I get is something like this for every file it finds in the compile_commands.json.
    DEBUG Cache miss for …/AlphaMaskCreator.cpp: cache is empty

One thing we are working on is adding versioning, but I don't think that alone is enough.
I have also read about using a persistent .scannerwork directory, but I'm not sure exactly what to do there; maybe store the .scannerwork from the main branch in Artifactory and pull it in on all PRs?
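
By "versioning" I just mean passing a project version to the same scanner call; a one-line sketch, where RELEASE_VERSION is a made-up pipeline parameter rather than something we already have:

            -Dsonar.projectVersion=${params.RELEASE_VERSION}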

Thank you very much for any help!

Also, I'll add the sonar-project.properties:

sonar.projectName=@REPONAME@
sonar.projectKey=com.blah.blah:@REPONAME@
sonar.sourceEncoding=UTF-8
sonar.language=c++
sonar.cfamily.gcov.reportsPath=./build
sonar.cfamily.compile-commands=build/compile_commands.json
sonar.sources=.
sonar.tests=.
sonar.analysis.cache.enabled=true
sonar.test.inclusions=**/tests/*.cpp
sonar.coverageReportPaths=out/coverage/sonar-coverage.xml
sonar.exclusions=**/tests/**,**/data/**,build/**,out/**,logs/**,doxygen/**,apps/**/*_example/**,apps/VPM/**,utilities/algo/OnlineAlignmentBenchmarkTool*/**,**/PlaybackUsingCli/**,**/resources/**,**/Dockerfile,**/*.py,**/*.java,**/*.yuv,**/*.txt,**/*.vsdx
sonar.cfamily.excludedIncludes=/tmp/conan*/**,.conan/**,**/.conan/**,**/conan/data/**,**/opencv2/**,**/opencv4/**,**/boost/**

Has your target branch been analyzed on SonarQube recently? From the docs:

  1. Before an analysis, the SonarScanner downloads from the server the corresponding cache:
  • For a branch analysis: the cache of the branch being analyzed.
  • For a pull request analysis: the cache of the target branch.
  • Or, as a fallback, the cache of the main branch.

    Branches that are not scanned for more than seven consecutive days are considered inactive, and the server automatically deletes their cached data to free space in the database.

If so, and you’re still stuck, I’d suggest you share the DEBUG log output of an analysis with 100% cache misses (sonar-scanner -X). That should reveal if any cache is being downloaded at all, and potentially why/why not.
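
For illustration, keeping that target-branch cache alive can be as simple as making sure a plain branch analysis of main runs on every push to it (or at least within the seven-day window). A minimal sketch, assuming a multibranch Jenkinsfile and a placeholder server configuration name:

    stage('SonarQube branch analysis') {
        when { branch 'main' }                   // the PR target branch
        steps {
            withSonarQubeEnv('MySonarServer') {  // placeholder configuration name
                // A regular (non-PR) analysis populates the branch cache that
                // later pull request analyses download from the server.
                sh 'sonar-scanner -Dsonar.branch.name=main'
            }
        }
    }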

Thanks Colin, I will add the -X flag. Right now it's only set to verbose=true, so we don't see the cache download details in the output. We recently implemented versioning and are verifying we have scans on all relevant branches. I'll get back with more information soon. Thanks.
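
For anyone following along, the difference is only in how the scanner is invoked; a minimal sketch of the two variants:

    // What we had so far: analysis-level verbosity only
    sh 'sonar-scanner -Dsonar.verbose=true'

    // What Colin suggested: full DEBUG output, which should show whether a cache
    // is downloaded from the server at all
    sh 'sonar-scanner -X'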

Just to conclude: this is now working, and our 20-minute scan is now down to about 1-2 minutes. We implemented versioning and scanning on the main branch, and used the settings I showed above. The main issue was that the cache referenced via the compile_commands.json file was not pointing to a consistent location on the nodes. Once we established a consistent path, so that each Docker container running on a node had access to the same cache, the analysis could see that no changes had occurred. We also verified that with those PR settings only changed source files are scanned. So having the PR target (main) branch scanned on each push, plus the fix for compile_commands.json and those flags, got it working. The added debug output helped. Thanks Colin for your support.
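
In case it helps anyone else, the essence of the fix was pinning every build to the same absolute paths, so the entries in compile_commands.json match between runs and the downloaded cache can actually be hit. A rough sketch of the idea in declarative pipeline terms; the image name and workspace path are placeholders, not our real values:

    agent {
        docker {
            image 'our-cpp-build-image:latest'     // placeholder build image
            customWorkspace '/workspace/my-repo'   // identical path on every node/container
        }
    }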
