How does New Code actually get calculated?

Currently we use SonarQube Version 10.2.1 (build 78527).

It worked perfectly for our Swift projects until we discovered some odd coverage percentages on merge requests. How does the calculation actually work, and what is considered new code? For example, we don’t have a project version defined anywhere.

Our global settings seem to be this:

Previous version
Any code that has changed since the previous version is considered new code.
Recommended for projects following regular versions or releases.

And when we open a merge request and then push another change, is the “version” the last commit, or something else?
Our coverage can drop from 80% to 1% right after a new commit inside the merge request.

Do we have to change the settings to:

Reference branch
Choose a branch as the baseline for the new code.
Recommended for projects using feature branches.
The branch you select as the reference branch will need its own new code definition to prevent it from using itself as a reference.

to get better results for merge requests?

We use SonarQube decoration; that’s why we see those percentages drop sometimes when committing new changes.

Hey there.

For merge requests – the changed code is defined by the data collected from the SCM, and it should be all the code changed in the merge request compared to the target branch.

  • If you do any sort of shallow cloning – this data can get messed up.
  • In what environment are you running the merge request analysis? What CI / DevOps platform?
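To make the SCM point concrete, here is a self-contained sketch (temporary repo, illustrative names only) showing how a shallow clone hides the history that blame, and therefore the new-code calculation, depends on:

```shell
# Self-contained demonstration: a shallow clone hides history from git,
# which is exactly the data SonarQube uses for changed-code detection.
# All paths are temporary; nothing here touches a real project.
set -e
tmp=$(mktemp -d)

# Build a small upstream repo with two commits.
git init -q "$tmp/upstream"
cd "$tmp/upstream"
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "first"
echo "let x = 1" > File.swift
git add File.swift
git -c user.email=ci@example.com -c user.name=ci commit -q -m "second"

# Shallow-clone with depth 1, the kind of clone many CI runners make by default.
git clone -q --depth 1 "file://$tmp/upstream" "$tmp/clone"
cd "$tmp/clone"
git rev-parse --is-shallow-repository    # prints "true"

# Converting to a full clone restores the history the scanner needs.
git fetch -q --unshallow
git rev-parse --is-shallow-repository    # prints "false"
```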

Thanks for the reply!

We use GitLab and fastlane, so for the SonarQube analysis we use fastlane’s slather and sonar actions.

After you mentioned shallow cloning, I found this warning:

WARN: Shallow clone detected, no blame information will be provided. You can convert to non-shallow with 'git fetch --unshallow'.

And for SCM it prints a lot, but I guess this is the most important line:

INFO: SCM Publisher 0/68 source files have been analyzed (done) | time=2ms

This merge request, for example, includes 8 commits, 7 of which were made after opening it, which means the pipeline, and with it the SonarQube analysis, ran 8 times.

Furthermore, at the bottom of the SonarQube analysis job is this:

INFO: More about the report processing at …

After looking at that report for the first commit and the last one, I found this:

"warnings": [
      "Could not find ref 'develop' in refs/heads, refs/remotes, refs/remotes/upstream or refs/remotes/origin. You may see unexpected issues and changes. Please make sure to fetch this ref before pull request analysis and refer to \u003Ca href=\"\" rel=\"noopener noreferrer\" target=\"_blank\"\u003Ethe documentation\u003C/a\u003E.",
      "Shallow clone detected during the analysis. Some files will miss SCM information. This will affect features like auto-assignment of issues. Please configure your build to disable shallow clone.",
      "Missing blame information for 68 files. This may lead to some features not working correctly. Please check the analysis logs and refer to \u003Ca href=\"\" rel=\"noopener noreferrer\" target=\"_blank\"\u003Ethe documentation\u003C/a\u003E."
]

And the only difference between the two is the number in “Missing blame information for X files”.
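The first warning can be reproduced and fixed in isolation. A self-contained sketch (the branch name “develop” is taken from the warning; everything else is temporary and illustrative):

```shell
# Self-contained demonstration: if the clone does not contain the merge
# request's target branch, the scanner cannot diff against it; fetching
# the ref under the expected name fixes it.
set -e
tmp=$(mktemp -d)

# Upstream repo with a default branch and a develop branch.
git init -q "$tmp/upstream"
cd "$tmp/upstream"
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "base"
git branch develop

# CI often clones only the merge request branch.
git clone -q --single-branch "file://$tmp/upstream" "$tmp/clone"
cd "$tmp/clone"
git show-ref --verify -q refs/remotes/origin/develop || echo "develop ref missing"

# Fetch the target branch under the name the scanner looks for.
git fetch -q origin develop:refs/remotes/origin/develop
git show-ref --verify -q refs/remotes/origin/develop && echo "develop ref present"
```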

I also found this at the beginning of every single job:

Fetching changes with git depth set to 20...
Initialized empty Git repository in /Users/gitlab-runner/UMT_3R5p/1/XXX/
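That “depth set to 20” comes from GitLab’s default `GIT_DEPTH` shallow fetch. A hypothetical `.gitlab-ci.yml` fragment that would disable it (the variable names are GitLab’s own; applying them only to the analysis job is an assumption to adapt to your pipeline):

```yaml
# Hypothetical .gitlab-ci.yml fragment:
variables:
  GIT_DEPTH: "0"        # 0 = full clone; GitLab defaults to a shallow fetch
  GIT_STRATEGY: fetch   # reuse the existing working copy and fetch updates
```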

Hey there.

I’m not familiar with fastlane at all – but you need to make sure a full git clone is happening (git fetch --unshallow).

What options do you have to modify the git clone behavior in fastlane?

Sorry for the late reply, Germany had a holiday :smiley:

Some of our pipeline jobs have a git fetch script command running before doing anything else.

I will add the --unshallow parameter and do some testing. If this fixes the problem, I will come back and mark your comment as the solution!


It seems that adding --unshallow in a script didn’t help, but changing the Git strategy in the CI/CD settings on GitLab to git fetch rather than git clone at least fixed the warning here.
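For anyone landing here later, a quick self-contained sanity check (temporary repo; file name is illustrative) for the state SonarQube needs, i.e. a non-shallow checkout where blame resolves for every file:

```shell
# Self-contained check: in a full (non-shallow) repository, blame resolves,
# which is what clears the "Missing blame information" warning.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
echo "let x = 1" > File.swift
git add File.swift
git -c user.email=ci@example.com -c user.name=ci commit -q -m "add file"

test "$(git rev-parse --is-shallow-repository)" = "false" && echo "not shallow"
git blame --porcelain File.swift > /dev/null && echo "blame ok"
```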

There were a lot of other things going wrong in our project, so a lot had to be done on our side; we hope it’s now fully fixed.

