Inconsistent results between runs

We’re using SonarCloud with Azure DevOps build pipelines, and we’re seeing inconsistent analysis results between builds.

Possibly the most telling symptom is that the lines of code count jumps down and up like a mad thing.

The most noticeable pipeline has had several builds this month:
December 3rd, 10:08 - 301,968 lines of code; 1,783 classes
December 3rd, 12:48 - 301,968 lines of code; 1,783 classes
December 4th, 12:00 - 301,968 lines of code; 1,783 classes
December 8th, 16:29 - 218,772 lines of code; 1,290 classes
December 9th, 9:36 - 213,167 lines of code; 1,240 classes
December 11th, 12:00 - 302,042 lines of code; 1,784 classes
December 14th, 16:20 - 213,167 lines of code; 1,240 classes
December 15th, 10:30 - 302,039 lines of code; 1,784 classes
December 15th, 15:32 - 302,004 lines of code; 1,784 classes
December 15th, 16:02 - 213,128 lines of code; 1,240 classes

Now whilst I expect a little build-on-build variance, suddenly losing about 30% of the code base looks like a massive outlier, especially as it came back pretty quickly (and reviewing the changesets confirms there was no mass refactoring going on here).

Also, reviewing the logs shows nothing obvious to the untrained eye.

It does appear to affect multiple of our pipelines. Because the code keeps vanishing and coming back, we were seeing security hotspots constantly being flagged as new, with all the old comments having been destroyed on the new version of an old hotspot. This seems like it may have been mitigated by the new security hotspot workflow, but it might just be that the vanishing sections don’t have any hotspots any more :wink:
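For anyone wanting to pull the same history outside the UI, here is a rough sketch against the SonarCloud Web API (YOUR_PROJECT_KEY is a placeholder, and the token is assumed to be a user token with browse access to the project):

```powershell
# Rough sketch: fetch the ncloc/classes history for the project from the Web API.
# YOUR_PROJECT_KEY is a placeholder; the token must belong to a user who can browse the project.
$token = "<user-token>"
$auth  = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($token):"))
Invoke-RestMethod -Headers @{ Authorization = "Basic $auth" } `
  -Uri "https://sonarcloud.io/api/measures/search_history?component=YOUR_PROJECT_KEY&metrics=ncloc,classes&ps=100" |
  Select-Object -ExpandProperty measures
```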


Hi @RowlandShaw, could you please provide us with more information:

  1. Is this statistical data related to a single branch? Please provide more details about it (master/main, short-lived/long-lived).
  2. How is the new code period configured for the project?
  3. Is the project public, so we can have a look at the pipeline and/or the results at SonarCloud?
  4. Can you generate similar statistics, but for a date range before October this year?

  1. This is the main branch of one of our products, so not for short-lived things like PR builds.
  2. Not sure I understand what you’re asking here.
  3. This is not a public project.
  4. We noticed the issue as far back as May, when we started with SonarCloud on this project, but the spurious results seem to have been smoothed out of the graphs, so it’s slightly harder to spot :frowning:

You can get the new code configuration at https://sonarcloud.io/project/new_code?id=YOUR_PROJECT_KEY (replace YOUR_PROJECT_KEY with your project key).

Could you please provide this information?

You are not authorized to access this page. Please log in with more privileges and try again.

It looks like you used the wrong project key, since I assume you have access to your project at SC. To make sure, just open the dashboard of your project; the URL should look similar to this:

https://sonarcloud.io/dashboard?id=organization_project

Then copy the id value (in the example above it is organization_project) and use it in the link I provided previously (the project key is composed of the organization name plus the project name).

That’s what I’d done, only to be confronted with the permissions issue.

OK, so can you open the project at SC, go to “Administration”, then “New Code”, and get the new code configuration? We need this in order to better understand the differences reported between runs. If you are not an administrator, could you please ask whoever has this permission in your organization to do it?


Digging a bit further, I compared the build logs of a “good” build with a “bad” one (i.e. one that covered everything vs. one that missed parts), and it looks like, for some reason, the Roslyn analyser isn’t being used for every project within the solution, so blocks looking like:

SonarQubeCategoriseProject:
Sonar: (MyProject.csproj) Categorizing project as test or product code…
Sonar: (MyProject.csproj) Project categorized. SonarQubeTestProject=False
CreateProjectSpecificDirs:
Creating directory “D:\a_work\101.sonarqube\conf\6”.
SonarQubeImportBeforeInfo:
Sonar: (MscUk.Disbursements) SonarQube.Integration.ImportBefore.targets was loaded

are missing from some (not all) projects in the solution. This means the call to csc.exe (or vbc.exe) for those projects doesn’t include the SonarCloud Roslyn analysers, which would account for the difference. There doesn’t seem to be a pattern, i.e. it’s not everything up to point x in the build, nor is it always the same set of projects that gets missed.
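To work out which projects were skipped on a given build, something along these lines can be used to compare the two sets of log lines (a sketch only; msbuild.log is a placeholder for a locally saved copy of the build log, and the exact log format may vary between MSBuild versions):

```powershell
# Sketch: compare the projects MSBuild built against the projects the Sonar
# targets categorised; anything only in the first set never got the analysers.
$log = Get-Content msbuild.log   # placeholder: a locally saved copy of the build log

# every *.csproj / *.vbproj mentioned in quotes in the log
$built = $log | Select-String '"([^"]+\.(cs|vb)proj)"' |
  ForEach-Object { Split-Path -Leaf $_.Matches[0].Groups[1].Value } | Sort-Object -Unique

# every project that hit the SonarQubeCategoriseProject step
$categorised = $log | Select-String 'Sonar: \(([^)]+)\) Project categorized' |
  ForEach-Object { $_.Matches[0].Groups[1].Value } | Sort-Object -Unique

Compare-Object $built $categorised
```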

So the big question now is: why does the SonarQubeCategoriseProject step intermittently not run for some projects within a solution? My gut feeling is that there’s a race condition “somewhere”, but we have tried disabling the parallel build to see if that would help, to no avail.
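(By disabling the parallel build I mean something along the lines of forcing MSBuild onto a single node; a sketch, with MySolution.sln as a placeholder:)

```powershell
# Sketch of the non-parallel test: limit MSBuild to a single build node.
msbuild MySolution.sln /m:1 /t:Rebuild
```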

Further info: we’d previously noticed this with solutions using TFVC source control within DevOps, but I’ve just personally seen one of our projects that uses git within DevOps “fall” from ~108k lines of code to ~8k lines of code, even though the output artefacts are comparable (i.e. everything built, but the SonarCloud integration only considered 1 of the 54 projects within the solution).

I’ve consulted some colleagues in our overseas offices (who use different build agent pools), and they’re seeing similar behaviour too, so this doesn’t feel like something peculiar to how our build agents are configured, but rather a fault within the SonarCloud product.

The problem appeared roughly 2-3 months ago for all of our SonarCloud projects. They use different pipelines and different build agents; analysis is done using MSBuild and the code is under git.


@RowlandShaw, could you provide the analysis IDs for both of those analyses, i.e. the one at ~108k LOC and the one that fell to ~8k?

Where do I see that @Alexandre_Holzhey?

Go to your project at SonarCloud, then Administration -> Background Tasks.

Good (108k): AXbeJAI8qlLCx7YFcQ7q
Not good (8k): AXbi0xd6WAbs_8zp9jH2

| Task | ID | Submitter | Submitted | Started | Finished | Duration |
| --- | --- | --- | --- | --- | --- | --- |
| Haulage/69239 [Project Analysis] | AXb2p4vzzxFcwH3zv3IN | ovhq-sonarcloud57619 | January 12, 2021 1:52:24 PM | 1:52:24 PM | 1:52:26 PM | 2.646s |
| Haulage/69085 [Project Analysis] | AXbyVxCd3LYM3ydKL3o3 | ovhq-sonarcloud57619 | January 11, 2021 5:46:00 PM | 5:46:01 PM | 5:46:03 PM | 2.200s |
| Haulage [Project Analysis] | AXbxbLKoD7p96JBYmAeO | ovhq-sonarcloud57619 | 1:30:01 PM | 1:30:01 PM | 1:30:31 PM | 30s |
| Haulage/69014 [Project Analysis] | AXbxTPP4SIYXPDHJumzP | ovhq-sonarcloud57619 | 12:55:21 PM | 12:55:21 PM | 12:55:23 PM | 2.393s |
| Haulage/68967 [Project Analysis] | AXbw2qHojEo5PlRO32lD | ovhq-sonarcloud57619 | 10:50:28 AM | 10:50:28 AM | 10:50:33 AM | 4.493s |
| Haulage [Project Analysis] | AXbi0xd6WAbs_8zp9jH2 | ovhq-sonarcloud57619 | January 8, 2021 5:27:33 PM | 5:27:33 PM | 5:27:39 PM | 5.605s |
| Haulage/68828 [Project Analysis] | AXbiydKRSIYXPDHJuff1 | ovhq-sonarcloud57619 | 5:17:26 PM | 5:17:26 PM | 5:17:28 PM | 1.975s |
| Haulage [Project Analysis] | AXbittymD7p96JBYl47d | ovhq-sonarcloud57619 | 4:56:43 PM | 4:56:43 PM | 4:56:55 PM | 11s |
| Haulage/66443 [Project Analysis] | AXbiqrL2jEo5PlRO3vWm | ovhq-sonarcloud57619 | 4:43:26 PM | 4:43:26 PM | 4:43:29 PM | 2.370s |
| Haulage/66441 [Project Analysis] | AXbinR-hD7p96JBYl4rq | ovhq-sonarcloud57619 | 4:28:36 PM | 4:28:37 PM | 4:28:39 PM | 1.980s |
| Haulage/68828 [Project Analysis] | AXbidjNCqOHY_ryos65V | ovhq-sonarcloud57619 | 3:46:05 PM | 3:46:06 PM | 3:46:08 PM | 2.346s |
| Haulage/68773 [Project Analysis] | AXbiXmKFWAbs_8zp9h4S | ovhq-sonarcloud57619 | 3:20:05 PM | 3:20:05 PM | 3:20:07 PM | 2.534s |
| Haulage [Project Analysis] | AXbiU671qOHY_ryos6ml | ovhq-sonarcloud57619 | 3:08:24 PM | 3:08:24 PM | 3:08:45 PM | 21s |
| Haulage/68773 [Project Analysis] | AXbiNRxQWAbs_8zp9hh3 | ovhq-sonarcloud57619 | 2:35:00 PM | 2:35:00 PM | 2:35:02 PM | 2.12s |
| Haulage [Project Analysis] | AXbiHtRQqOHY_ryos6Lh | ovhq-sonarcloud57619 | 2:10:40 PM | 2:10:40 PM | 2:11:00 PM | 20s |
| Haulage/68779 [Project Analysis] | AXbiBrlmpO7BhKmbwCsP | ovhq-sonarcloud57619 | 1:44:20 PM | 1:44:20 PM | 1:44:22 PM | 1.831s |
| Haulage/68773 [Project Analysis] | AXbh-C9ED7p96JBYl3SS | ovhq-sonarcloud57619 | 1:28:27 PM | 1:28:27 PM | 1:28:30 PM | 2.545s |
| Haulage/68769 [Project Analysis] | AXbh6UTrpO7BhKmbwCeb | ovhq-sonarcloud57619 | 1:12:09 PM | 1:12:10 PM | 1:12:12 PM | 2.167s |
| Haulage/68569 [Project Analysis] | AXbhunC5SIYXPDHJudEf | ovhq-sonarcloud57619 | 12:21:01 PM | 12:21:01 PM | 12:21:03 PM | 2.306s |
| Haulage/68635 [Project Analysis] | AXbhgeFcf_sbMWRx1XVC | ovhq-sonarcloud57619 | 11:19:14 AM | 11:19:14 AM | 11:19:16 AM | 2.361s |
| Haulage [Project Analysis] | AXbhgNTVf_sbMWRx1XUT | ovhq-sonarcloud57619 | 11:18:05 AM | 11:18:05 AM | 11:18:28 AM | 22s |
| Haulage/68727 [Project Analysis] | AXbhRAtSbEg9RCJd6hft | ovhq-sonarcloud57619 | 10:11:41 AM | 10:11:41 AM | 10:11:43 AM | 1.884s |
| Haulage [Project Analysis] | AXbeJAI8qlLCx7YFcQ7q | ovhq-sonarcloud57619 | January 7, 2021 7:37:50 PM | 7:37:50 PM | 7:38:13 PM | 22s |

Thanks a lot for the information. I can see here that these analyses do indeed show a big difference in NLOC, in the file types, and also in the size of the report sent to SonarCloud. That leads me to ask you for the scanner logs for both of these analyses; you should have them in the pipeline logs in AzDo. Could you please send both to us? Tell me if they need to be handled privately.

I would prefer to send those privately, @Alexandre_Holzhey.

Aren’t there other companies impacted by this issue?

@RowlandShaw and @ghaouim: have you set any RunAnalyzers / RunAnalyzersDuringBuild property in your MSBuild command line or in any config file?
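If it is not obvious whether such a property is set anywhere, one quick way to check could be to search the repository for it (a PowerShell sketch, run from the solution root):

```powershell
# Sketch: list every project/props/targets file that mentions RunAnalyzers
# (this also matches RunAnalyzersDuringBuild).
Get-ChildItem -Recurse -Include *.csproj,*.vbproj,*.props,*.targets |
  Select-String -Pattern 'RunAnalyzers'
```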