Running SonarQube 6.7.5.38563 (LTS) and attempting to use Quality Gates to flag issues on new code only, rather than forcing teams to immediately go back and fix older issues (old codebase, lots of issues).
Running SonarQube analysis from TFS build.
What I see happen: I create a new project in Sonar from a baseline build of our develop branch. Later, a branch is taken from develop and built; it succeeds and passes the quality gate. Later still, a change to that same branch is built, and the quality gate FAILS with dozens of issues in code that hasn't changed in months or years. I thought that a quality gate scoped to the leak period would let us ignore OLD issues and focus on NEW issues, but that doesn't seem to be happening, and I don't understand how those old issues are getting flagged as new issues.
The documentation on the Quality gate itself says:
It focuses on keeping new code clean, rather than spending a lot of effort remediating old code. Out of the box, it's already set as the default profile.
So, issues appear under the leak period as brand new, as if they were introduced in the last build, and the date shows "9 minutes ago". But that code is not new; in fact, the git history on the file shows it hasn't been modified in over a year. (Sonar will even list the name of the person who made the change, and it's somebody who hasn't worked here in 9 months.) It's as if Sonar has forgotten ever seeing that file before, and now it wants to treat it as a new file. I've changed the leak period to a specific date instead of "previous_version", and that has helped a little with a couple of projects, but not with all of them. I'd like to point to some bad branching practices by some team members as the root cause, but it doesn't always make sense, and I have no explanation for why files that haven't changed in over a year are getting flagged in a build of a branch that was created a couple of days ago.
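For reference, a leak period of a fixed date can also be set per project from the command line via the settings web API rather than through the UI; the host, token, date, and project key below are placeholders:

```shell
# Set the leak period for one project to a fixed date instead of
# "previous_version". Requires admin rights; all values are placeholders.
curl -u "$ADMIN_TOKEN:" -X POST "http://localhost:9000/api/settings/set" \
  --data-urlencode "key=sonar.leak.period" \
  --data-urlencode "value=2018-06-01" \
  --data-urlencode "component=my-project-key"
```

Setting it at the project level keeps an experiment on one troublesome project from affecting the server-wide default.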
I reset the quality profile after the very first build to our custom profile (which really just has a couple of minor adjustments). But I will see "good" builds after the adjustment, only to see bad builds later, with bugs that are "new" again, for rules that were not changed.
You changed profiles after the first analysis. Does your custom profile include rules that aren’t in the default profile? And if so, do these old-new issues come from those rules?
First, can you double check your Activity page and make sure you only have one Quality Profile event - the one where you switched to your custom profile? In 6.7.*, issues are backdated on first analysis and in some other cases, but not generally. If a second analysis ran (for instance) with an empty profile and closed all issues, a subsequent analysis that re-found those issues would open them as new. BTW, that's fixed by 7.4. Now we resurrect the old issue (and thus the old creation date).
Second, is SCM data available to all analyses? Your screenshots cut off the left margin, so it’s not clear if you’ve got blame data (and line change timestamps) for the file. The fact that the issue is unassigned in both screenshots leans me toward some data being missing.
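One way to check is to make the SCM provider explicit in the "Prepare analysis" (begin) step and turn on verbose logging, so the analysis log shows whether blame data was actually loaded. A sketch, with the project key as a placeholder; note that a shallow or partial checkout on the build agent can still leave blame empty even with the provider pinned:

```shell
# Begin step with the SCM provider pinned to git and verbose logging
# enabled; the log will then say whether blame was collected per file.
SonarScanner.MSBuild.exe begin /k:"my-project-key" /d:sonar.scm.provider=git /d:sonar.verbose=true
```

Missing blame is consistent with the unassigned issues: without line-change authors, Sonar has nobody to assign the "new" issue to and no timestamp to backdate it with.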
So this just happened on a project, and I don't understand it. You can see a couple of things: in the build at 5:08 PM the number of bugs goes down by nearly 100, and then later those ~100 bugs are returned as new bugs. Also at 5:08 it says it's stopping using the JavaScript profile. I don't know what causes that; it was not a manual change, and we don't have a custom JavaScript profile. Sonar just seems to decide to stop using that profile for that build?
Also, none of the bugs that are lost and found are JavaScript bugs; they are all in the C# code. So I don't even understand why those bugs are lost on that build; stopping using a JavaScript profile should not matter for this project.
EDIT: so, a little digging into this instance. On that build at 5:08, the compile step failed. We are building a C# project, and the Sonar task is set to integrate with MSBuild. The MSBuild fails… how is that handled?
In general, the first place to go for some of these answers is your analysis log. But your Activity page screenshot is interesting.
The 10:27a analysis stopped using the Sonar way C# profile and started using the Alliant way profile. Whereas your 5:08p analysis simply stopped using the Sonar way JavaScript profile without starting to use some other profile. That means that no JavaScript code was detected during that analysis. If files for the language are found, some profile will be used. The use of no profile means no files. And then at 6:23p the JS files showed back up. You should really check your build/analysis logs to find out what’s going on there. Unless you were adjusting exclusions around that time (I assume you weren’t) to exclude and then re-include JS files, there’s something going on on your CI side.
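As a quick way to dig into that, the captured build/analysis log can be searched for file indexing, profile, and exclusion messages; a sketch, where "analysis.log" is a placeholder for wherever the step output was saved:

```shell
# Pull the lines that show which files were indexed, which quality
# profiles were used, and whether any exclusions fired for that run.
grep -iE "files indexed|quality profile|exclu" analysis.log
```

If the JS files vanish from the "files indexed" lines on the bad run, the problem is on the checkout/CI side rather than in Sonar.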
Regarding the analysis where your C# compile failed, I would have expected that to halt the process altogether, without proceeding to analysis. Perhaps your CI logs will be revealing…?
I have the Run Analysis step ALWAYS running (even after a compile failure). I've found that if I don't have it that way (and the Prepare step runs without a Run Analysis at the end), it leaves Sonar in a bad state and future analyses won't be possible without restarting the service. So that is why it's set up that way.
I can change the build to not run the Run Analysis step after a failure; I will see whether my "Previous analysis has not completed" issue returns or not. Yes, clean for each build.
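For what it's worth, one way to reconcile the two needs (not leaving the server mid-analysis, but also not analyzing a failed compile) is to make the end step conditional on the build result. A command-line sketch of the task ordering, with all names as placeholders:

```shell
# begin -> build -> end, with "end" skipped when compilation fails, so a
# broken build is never analyzed and reported. All names are placeholders.
SonarScanner.MSBuild.exe begin /k:"my-project-key" /v:"1.0"
if MSBuild.exe MySolution.sln /t:Rebuild; then
  SonarScanner.MSBuild.exe end
else
  echo "Compile failed; skipping the Run Analysis (end) step"
  exit 1
fi
```

Running the end step unconditionally means a build with zero compiled C# projects can still submit an analysis, which would close every C# issue and then "re-find" them as new on the next good build.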
This isn’t the only way old bugs seem to return… here is an activity shot of a build where 3 bugs decided to show up as new again, for no explained reason.
That last 6:22 build that goes red failed because of 3 "new" bugs. Those files had not changed since November or December, but the bugs show as new as of that build.
Let's focus on those three bugs. Please go to your project's Issues page and look at Closed issues. You'll need to expand the Resolution facet and deselect Unresolved, then expand Status and choose Closed.
Now use the other facets to find those three issues, and look at their change logs to see if you can identify the close reasons:
It will be interesting to see whether we have this same “Line removed” reason. Unfortunately, that’s the reason that’s recorded whether it’s the line or the file that’s removed, but either way, it will give us some clue as to why the issue was removed.
Where I’m going with this is that I suspect some remaining irregularities in your analysis that may be affecting which files are analyzed. If for some reason files are being excluded intermittently, that would explain why their issues are closed and then re-opened when they’re reincluded.
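If the UI facets are awkward, the same closed-issue list can be pulled with the issues web API; host, token, and project key below are placeholders:

```shell
# Fetch closed issues for the project; take each "issues[].key" from the
# response and inspect its history via api/issues/changelog to see the
# recorded close reason. All identifiers are placeholders.
curl -u "$TOKEN:" "http://localhost:9000/api/issues/search?componentKeys=my-project-key&statuses=CLOSED&ps=100"
```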
Just to follow up here: I upgraded Sonar to 7.5, since it was indicated that it handles old issues better, and indeed we've found that to be true. Analyses on this version have not given us any instances of old issues getting flagged as new issues. Thanks!