Quality gate fails unexpectedly for old code

I have set up our workflow to include Sonar as follows:

  • Changes are made on a feature branch.

  • A pull request is used to merge them into the development branch, which is covered by a Sonar quality gate.

  • To make a release, we create a pull request from the development branch to the master branch.

  • I have also added the same Sonar quality gate to master.

The issue that is puzzling me is that I can make a single feature change and it will pass the quality gate for merging onto development, but then fail the quality gate for merging onto master, when in fact there is no further change.

The new code definition is set to 30 days.
And yet the quality gate fails due to too many code smells, for issues that Sonar itself dates to "3 years ago". The last commit to the files was also a few months ago.

The behaviour I want from this setup is that a merge from development to master should only fail the quality gate in the unlikely event that a problem is caused by the combination of two or more changes, accumulated on the development branch, being applied to master. That is clearly not the case here.

ALM used: Azure DevOps
CI system used: Azure DevOps
Languages (in this case): Go

I think I have observed this problem for other projects I have onboarded, for the first analysis of the master branch, which may be significant. However, this is not the first analysis for this project's master branch.
My workaround has typically been to fix the quality issues even though they are not new. However, I now have to onboard a large number of projects, so instead I find I have to make the quality gate optional in Azure DevOps and basically ignore the failure. This defeats the objective of having it.

Hi,

Can you provide some details?

First, are we talking purely about new issues raised on old code, or are there things like coverage and duplications in the mix?

And if it's about issues, can you give some concrete examples?

Thx,
Ann

In this case it was code smells raised on old code.
Our quality gate requires an A rating for new code but got a C because 86 code smells were found. All of these issues relate to code from "3 years ago" according to Sonar.
The breakdown by rule is:

(Go) String literals should not be duplicated 69
(Go) Track uses of "TODO" tags 13
(Go) Cognitive Complexity of functions should not be too high 2
(Go) Functions should not have identical implementations 1
(Go) Track uses of "FIXME" tags 1

Interestingly, under the creation date filter under "new code" I see a histogram:

47 issues - 2019
36 issues - 2020
0 issues - 2021
3 issues - up to Sept 2022

So the question is: why are issues from 2019 and 2020 being counted as new code?

Under the admin menu the new code definition is clearly set to 30 days.

Hi,

Thanks for the details. Sometimes changes in new code can cause issues to be raised on old code.

As an example, a new 'null pointer dereference' may be raised on old code if I delete (or invalidate) the null-check before the dereference. Similarly, if I add a string literal use, that could easily cause a new "duplicated string literal" issue on the (old) first use of the literal. It would be a similar story for "functions should not have identical implementations".
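To make the duplicated-literal case concrete, here is a minimal Go sketch (the package and function names are hypothetical, and it assumes the rule's default threshold of three occurrences):

    package payments

    import "log"

    // Both of these uses were committed years ago; blame dates them to 2019.
    func chargeCard() {
        log.Println("payment failed") // the issue is anchored here, on the old first use
    }

    func refundCard() {
        log.Println("payment failed")
    }

    // This function is the only genuinely new code. Its third occurrence of the
    // literal crosses the threshold, so an issue is raised -- but it is reported
    // against the first use above, whose blame date is 2019.
    func voidCharge() {
        log.Println("payment failed")
    }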

Same for Cognitive Complexity; that rule raises an issue on the (presumably old) method declaration rather than on the new code that bumped the method over the limit.
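The same pattern in a hedged Go sketch (a hypothetical function; the exact limit depends on how the rule is configured):

    package handlers

    // A years-old function already sitting near the Cognitive Complexity limit.
    // The issue is raised on this declaration line, whose blame date is old.
    func route(kind string, retries int, dryRun bool) string {
        if kind == "a" {
            if retries > 0 {
                return "retry-a"
            }
            return "a"
        }
        if kind == "b" {
            if retries > 0 {
                return "retry-b"
            }
            return "b"
        }
        // Only this branch is new, but it pushes the whole function over the
        // configured limit, so the issue lands on the (old) declaration above.
        if dryRun {
            return "noop"
        }
        return "unknown"
    }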

The "TODO" and "FIXME" issues are harder to explain away, though. Would you mind sharing a screenshot of one of these issues?

And maybe also a screenshot showing your failing QG conditions and one of your QG details?

Ann



Here is a link to the project itself if you are able to see it:

https://sonarcloud.io/project/issues?resolved=false&sinceLeakPeriod=true&types=CODE_SMELL&pullRequest=213671&id=km-eng-br-rtmvodapi-go&open=AYNdJdmvUqn1Vf8T9iJb

Hi,

Thanks for the screenshots and the link. Unfortunately, I'm not able to access the project, but that's probably fine.

What I'm not understanding is that in the context of a Pull Request, you're seeing issues marked on non-new code. Yes, I know that's what you said to start with. :slight_smile: I didn't understand that the context was the pull request; I thought you were seeing the behavior after merge.

Can you share the analysis log for this PR? I suspect something's going wrong retrieving the blame data, which is what's used to identify "new" lines.

Ann

sonarlog40.zip (586.4 KB)

Hi,

Thanks for the log. Nothing's jumping out at me, so I'm going to flag this for more expert attention.

Ann

Here is the git blame for the smell I showed in the screenshot:

3bf93102 (Bruce S O Adams 2019-07-16 17:44:52 +0000 921) // @todo blah redacted blah
3bf93102 (Bruce S O Adams 2019-07-16 17:44:52 +0000 922) /*

So you can see it is 3 years old / 2019 and not "new code" in that sense.


I am having a possibly related issue with another project, but for this one I notice that Azure DevOps runs:

git remote add origin somerepo/somewhere
git config gc.auto 0
git config --get-all http.somerepo/somewhere.extraheader
git config --get-all http.extraheader
git config --get-regexp .*extraheader
git config --get-all http.proxy
git config http.version HTTP/1.1
git --config-env=http.extraheader=env_var_http.extraheader fetch --force --tags --prune --prune-tags --progress --no-recurse-submodules origin --depth=1 +0eca4ea1d6dc319026d0ea0c29ae083589aac994:refs/remotes/origin/0eca4ea1d6dc319026d0ea0c29ae083589aac994
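# note the --depth=1 shallow fetch on the line above; the fetch for the
# other project, shown further down, has no depth flag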

And Sonar includes warnings for the analysis:

Could not find ref 'develop' in refs/heads, refs/remotes/upstream or refs/remotes/origin. You may see unexpected issues and changes. Please make sure to fetch this ref before pull request analysis.

Shallow clone detected during the analysis. Some files will miss SCM information. This will affect features like auto-assignment of issues. Please configure your build to disable shallow clone.

For this case I have not yet configured the Sonar project (due to a permissions issue). That includes setting it up to talk to Azure DevOps, which presumably allows it to grab the extra git information it needs.
For the failing project we have been discussing, the git step is:

git remote add origin somerepo/somewhere
git config gc.auto 0
git config --get-all http.https://somerepo/somewhere.extraheader
git config --get-all http.extraheader
git config --get-regexp .*extraheader
git config --get-all http.proxy
git config http.version HTTP/1.1
git --config-env=http.extraheader=env_var_http.extraheader fetch --force --tags --prune --prune-tags --progress --no-recurse-submodules origin

As discussed previously, Sonar seems to report the correct blame information, but perhaps it gets that separately from the analysis of a particular build?

This new project does not have the correct blame info yet and seems to think the entire codebase is new. I am hoping it will magically correct itself once I have configured the project correctly.

I now have the permissions issue resolved, but the new code definition still isn't behaving correctly.

For one project I have 9.8K lines of new code for a pull request where the main changes I've made are to alter the build to perform Sonar analysis. If I run the Sonar analysis manually, it says, much more plausibly, that I've changed 157 lines (I altered some code to improve other quality gate metrics).

I can't see a way to fix this. Please help.

The shallow clone part of the issue is Microsoft's fault; see steps.checkout definition | Microsoft Learn:

New pipelines created after the September 2022 Azure DevOps sprint 209 update have Shallow fetch enabled by default and configured with a depth of 1. Previously the default was not to shallow fetch.

This caught me out because I started onboarding some projects before the change and some afterwards.

The solution to that is in the azure-pipelines.yml:

    steps:
      - checkout: self
        fetchDepth: 0

The default fetch depth was previously zero (a full clone) and has been changed to 1 (shallow).
I'm not clear how this could have been done without retroactively affecting existing pipelines, as existing projects will not have this entry. Perhaps Azure has hidden defaults for projects?

This leaves me with the wrong blame information issue I initially described. I will attempt adding fetchDepth to that project to see if it helps.


Hi Bruce,

Thanks for keeping us in the loop. I can verify that this shallow clone is preventing us from delivering the right blame information.
Please let me know your further findings on whether applying the fetch depth flag helped.

Thanks
Csaba

The newer project provokes a shallow clone warning from Sonar, whereas the other didn't.
Are you able to discern from your side whether there are cases where an analysis could fail to produce such a warning?

Unfortunately I can't easily restart the merge that demonstrated it, as it has already been completed.
It might be possible if I create a clone of the project, undo the change, and try to reapply it. That requires a bit of jumping through hoops and time which I may or may not be able to justify.

The default for Azure was a deep clone when the project was first onboarded, but it is not impossible that it became shallow for the pull request. I would expect the Sonar analysis to have detected that, however.

Thanks @KantarBruceAdams for getting back to us and raising awareness of this change. We have created a follow-up task to update the guidelines on our end and be specific about this setting.

