Issues being reported for old code (outside the New Code threshold)

Our scan on the main branch is showing issues for code that has not been touched in a long time, which is causing the quality gate to fail. No configuration was changed, and previous scans were working correctly.

We use a “new code” threshold of 30 days.

Today the latest scan on the main branch started showing many issues in files that have not been touched for many months.
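
In case it helps anyone reproduce this, the affected issues and their creation dates can be pulled from the web API. A rough sketch (the token and project key are placeholders; the 30-day window mirrors our new code threshold):

```python
# Rough sketch: list unresolved issues SonarCloud reports as created within the
# last 30 days (our new code window), along with the file each one lands in.
# SONAR_TOKEN and PROJECT_KEY are placeholders.
import datetime
import requests

SONAR_TOKEN = "your-sonarcloud-token"
PROJECT_KEY = "your-org_your-project"

created_after = (datetime.date.today() - datetime.timedelta(days=30)).isoformat()

resp = requests.get(
    "https://sonarcloud.io/api/issues/search",
    params={
        "componentKeys": PROJECT_KEY,
        "resolved": "false",
        "createdAfter": created_after,
        "ps": 100,  # page size; paginate with the "p" parameter if needed
    },
    auth=(SONAR_TOKEN, ""),  # token as username, empty password
)
resp.raise_for_status()

for issue in resp.json()["issues"]:
    print(issue["creationDate"], issue["component"], issue["rule"])
```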

Here is an example of a file that has not been touched recently; the git commit activity shows that its last changes were a long time ago.
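
For reference, this is roughly how the last-change date can be confirmed from a local clone (the file path is a placeholder):

```python
# Rough sketch: print the last commit date for one of the flagged files, to show
# it falls well outside the 30-day window. Assumes a full (non-shallow) clone,
# otherwise git only sees history back to the shallow boundary.
import subprocess

path = "src/legacy/old_module.py"  # placeholder path to a flagged file

result = subprocess.run(
    ["git", "log", "-1", "--format=%cI", "--", path],  # %cI = committer date, ISO 8601
    capture_output=True,
    text=True,
    check=True,
)
print(f"{path} last changed on {result.stdout.strip()}")
```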

And even in the SonarCloud UI the code in that file is not shown as “new” (it is not highlighted yellow, per the documentation on “new code”).

I do see a warning about the shallow clone on the scan results. But we have always been doing a shallow clone and have not had issues on previous scans.
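
If the shallow clone does turn out to matter, I assume the fix would be something along these lines, run before the scanner step (a sketch only; the git commands are standard, the CI wiring is ours to sort out):

```python
# Rough sketch: make sure the checkout has full history before the scan, so
# SonarCloud has complete commit data for blame/backdating. Equivalent to
# fetch-depth: 0 with actions/checkout, or GIT_DEPTH: 0 on GitLab CI.
import subprocess

def is_shallow() -> bool:
    out = subprocess.run(
        ["git", "rev-parse", "--is-shallow-repository"],
        capture_output=True,
        text=True,
        check=True,
    )
    return out.stdout.strip() == "true"

if is_shallow():
    # Fetch the rest of the history, converting the shallow clone into a full one.
    subprocess.run(["git", "fetch", "--unshallow"], check=True)
```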

Does anyone know what I might be missing in the configuration, or whether something was updated in SonarCloud that could be causing issues with “new code” detection?

Hey @caleb.chenoweth

We have a hunch about what’s going on, and it relates to a change we made the other day.

SonarCloud is usually quite good at recognizing new files and making sure the issues raised on them get backdated properly. But when a file has been indexed by SonarCloud without being analyzed, this doesn’t work: if we later introduce a change that expands which of those previously indexed files get analyzed, the issues raised on them aren’t backdated.

This obviously isn’t ideal, and there’s no reasonable workaround to recommend right now. The same problem comes in other flavors as well: when a first analysis is run without Node.js installed, JS/TS/CSS files are indexed but not analyzed, and once that is fixed, those files aren’t considered new and their issues aren’t backdated.

I’ve flagged this internally to see what, if anything, should be done next… and a lot of us are about to be on holiday, so bear with us. Thanks for raising it here.
