Leak Period Baselined with Custom Date considering old issues along with new

SonarQube Version: 6.7
Sonar Scanner:
Plugin: sonar-javascript-plugin-

Problem Statement: the Leak Period baseline with a custom date is reporting old and new issues together.

We have set up SonarQube with the above-mentioned version and set the New Code Period to 2019-04-01. We completed an analysis of the project and set it as the baseline for subsequent analyses. During this analysis, file A.js reported 2 issues and file B.js reported none.
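For reference, this is roughly how the baseline was set (a sketch only - the server URL, token, and project key below are placeholders; in SonarQube 6.x the relevant setting key is `sonar.leak.period`, set through the standard `api/settings/set` endpoint):

```shell
# Placeholder values throughout - adjust to your own instance.
curl -u YOUR_ADMIN_TOKEN: \
  -X POST "https://sonarqube.example.com/api/settings/set" \
  -d "key=sonar.leak.period" \
  -d "value=2019-04-01" \
  -d "component=my_project_key"
```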

As per my understanding, old issues should not be reported during subsequent analyses. Below are my observations from the next analysis, after the baseline date:

  • File B.js changed and introduced an issue, which was reported as a new issue in the next run
  • File A.js also changed, with new lines of code; after the analysis it reported 3 issues (2 old issues and 1 new issue)

As per my understanding, file A.js should report only 1 issue, but it is showing 3.

I need help resolving the above issue.

Venkat Thota

Hi Venkat,

Old issues are never going to just go away. Until you fix or dismiss them (or remove the rule from your profile) they’re still going to show up in SonarQube. The crux is whether they’re reported as new issues or old issues. I doubt that question is terribly relevant to your scenario tho.

On a side note, SonarQube 7.9.1, the latest version and current LTS, has far more accurate issue backdating than you’ll find in 6.7, so you probably want to upgrade at your earliest convenience. But TBH, I don’t think even that upgrade will have much impact on this situation.


Hi Ann,

Thanks a lot for the quick reply.
I agree that the old issues will not go away and need to be addressed.

Regarding “the crux is whether they’re reported as new issues or old issues. I doubt that question is terribly relevant to your scenario tho” … in this case they are getting reported as “new” issues.

This is relevant to our scenario because we have only recently started adopting SonarQube, and fixing all the legacy issues would be difficult at this point. For now, our focus is not to introduce any new issues. We have decided on a cut-off date for the old code base (e.g. 01-May-2019), and the expectation is that any issues reported on new code will be fixed. In this case, any new lines introduced into the legacy code are reported along with the old issues.

Hi Venkat,

How about a screenshot of a new/old issue that includes both the issue date and the blame data? E.g.


I have attached sample data here for your quick reference:

  1. List of issues reported in the Old and New periods, where the issue counts are the same for a file.

  2. Upon drill-down, refer to the method, which did not go through any changes during the last change:

This issue was reported during the baseline period (e.g. 01/May/2019).

  3. The method underwent a change around July 26th, 2019.

As per my understanding, No. 2 should be reported as an old issue and No. 3 as from last month. Here, both show as if they underwent changes. So, per my observation, if a file undergoes any change, the analysis re-reports both the old and new issues in that file as new.

Hi Venkat,

The ‘last month’ you’re circling in your screenshots is an approximation. You can click on it to get the issue’s exact creation date. However, your screenshot does I think demonstrate what’s going on. You chose to get blame data from the one line that’s marked new in the screenshot (yay!) and it shows a change date of July 26, which is probably the date you’ll see if you click on “last month” to get the issue change log.

What you’re dealing with is a Cognitive Complexity issue. That rule raises its issues at method level based on what happens inside the method. It looks like on July 26 just enough complexity was added inside the method to bump the whole method above the Cognitive Complexity threshold set in the rule. And that’s why you have a new issue raised on apparently old code. Does this make sense?
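In case it helps, here’s a hypothetical sketch (invented names, not your code) of how the score accumulates: each control-flow structure adds 1, plus 1 for each level of nesting it sits under.

```javascript
// Hypothetical example of Cognitive Complexity accounting.
function classify(items) {
  const handled = [];
  const deferred = [];
  for (const item of items) {       // +1 (loop at top level)
    if (item.active) {              // +2 (if, at nesting depth 1)
      if (item.priority > 3) {      // +3 (if, at nesting depth 2)
        handled.push(item.id);
      } else {                      // +1 (else)
        deferred.push(item.id);
      }
    }
  }
  return { handled, deferred };     // total = 7, under the default
}                                   // function threshold of 15
```

Add a few more nested branches inside that loop later and the whole function can cross the threshold, which raises one issue on the function - not on the new lines alone.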


Hi Ann,

Thanks for the message.

I have done further analysis by setting up a SonarQube environment on my machine with the same configuration. I did a first run on 6th Sep and made it the baseline, then ran the next analysis on 9th Sep after changing code. Below are my observations for two rules:

  1. Rule “Remove unused function parameters”: during the first run on 6th Sep, a couple of issues were reported. Before the second run on 9th Sep, both issues showed a reported time of “3 days ago”. I changed the code to introduce new issues and ran the analysis again. The new issues were reported with times like “7 minutes ago”.


The above new issues showed up on the dashboard under the “New Code” widget. Working as expected.

  2. Rule “Cognitive Complexity…”: during the first run on 6th Sep, three issues were reported. Before the next analysis on 9th Sep, the timeline displayed “3 days ago”. I made some changes to increase the complexity and reran the analysis on 9th Sep a couple of times. The timeline still shows “3 days ago”; additional details are displayed on cursor hover.


Below are my expectations in the above case:

  • In the “New Code” widget there is no increase in the count; it should have reported 1 new issue
  • The new timeline “10 mins ago” should have been displayed rather than “3 days ago”

Am I missing anything here?


Hi Venkat,

I appreciate your dedication to working through this. :slightly_smiling_face:

In fact, the Cognitive Complexity issues are also working as designed (if not as expected :wink:). The issue in question was originally raised 3 days ago. Then you added additional complexity - you made the existing, already-noted problem worse. The issue change log reflects that with the increase in effort.

This is how threshold-based rules work in general. The first time you cross the threshold, an issue is raised. As long as you stay over the threshold, that same issue remains open, and the effort and probably the message are adjusted if there are changes. To do the opposite - close the old issue and open a new one because how far over the threshold you are has changed - would be contrary to most people’s expectations.

For instance, let’s take your issue with a Cognitive Complexity of 26 and say that it’s ‘old’ - out of the New Code period. Let’s say I have to make changes to the method for my current work ticket. I don’t have time to do a complete fix, but I do make it better and reduce the Cognitive Complexity to 18. If what you’re expecting happened, my CC=26 issue would close and I would be “punished” for my improvement with a new CC=18 issue being opened in the leak period. Instead, SonarQube notes that the situation has improved by updating the effort and message, and keeps the issue out of the New Code Period.
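As a hypothetical sketch of that kind of improvement (invented names, not your code): extracting nested logic into a helper and using guard clauses are common ways to pull a function back under - or at least closer to - the threshold, because nesting penalties restart at zero inside the new function.

```javascript
// Hypothetical refactoring sketch: each function stays simple, so no
// single function accumulates a large Cognitive Complexity score.
function validateOrder(order) {
  if (!order) return false;                   // +1 (guard clause, no nesting)
  if (order.lines.length === 0) return false; // +1
  return order.lines.every(isValidLine);      // no branching here
}

function isValidLine(line) {
  return line.qty > 0 && line.price >= 0;     // +1 (the && sequence)
}
```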

Make sense?


Thanks for the immediate reply :slightly_smiling_face: and I really appreciate your focus on helping members. I agree with you that this is a working-as-designed case :slightly_smiling_face:

The issue I am facing is that we want to focus on not introducing new issues in the new code, and on fixing the issues reported there. That is the reason the baseline is defined (a specific date set for the project), and we planned to focus on any issue reported after it. In the above case, if the increased cognitive complexity of the modified method is not reported under “New Code”, it will be missed, as we have a large code base where it is difficult to identify the updates.

Hi Venkat,

As I was writing my earlier reply, I realized that in the case where the complexity increases on an already-open issue you probably do want a new issue. But that’s just not the way it works.



Thanks for providing the detailed information; it helped me understand the concepts.