Code Coverage mismatch between Azure DevOps and SonarCloud

We are currently encountering an issue where the code coverage percentages do not match between Azure DevOps and SonarCloud for our repositories.

We are using Azure DevOps as our CI/CD tool, integrating with SonarCloud for code analysis and code coverage. The issue applies to codebases on .NET Core 3.1 and .NET 6, and probably applies to .NET Framework 4.7.2 as well.

In our builds, we are using the following command to collect code coverage results: dotnet test --collect "Code Coverage". This seems to use the built-in Microsoft method for collecting code coverage and automatically generates a report for consumption by Azure DevOps.

The “Publish test results and code coverage” option is also checked, which seems to automatically enrich the command with the --logger trx --results-directory $(Agent.TempDirectory) options.
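
For context, the test step is essentially the stock DotNetCoreCLI task; the sketch below shows roughly how I understand it ends up running (the projects glob is illustrative, not our exact configuration):

    # Rough sketch of our test step (illustrative, not the exact pipeline YAML)
    - task: DotNetCoreCLI@2
      displayName: 'dotnet test'
      inputs:
        command: 'test'
        projects: '**/*Tests.csproj'          # illustrative glob
        arguments: '--collect "Code Coverage"'
        # "Publish test results and code coverage" is checked, which effectively appends:
        #   --logger trx --results-directory $(Agent.TempDirectory)
        publishTestResults: true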

We are using the standard “prepare” and “run code analysis” steps in our pipeline, with the dotnet test command occurring between those steps. We do not have any additional steps that specifically generate the code coverage report or manually publish it to a specialized directory.

With this configuration in place, whenever a build executes, Azure DevOps shows a Code Coverage tab with a summary number that is entirely different from what SonarCloud provides. From what I can tell, Azure DevOps is scanning every DLL produced by the build (not just the projects that are part of the solution in the main codebase), while SonarCloud seems to account only for code in the actual solution and automatically ignores “test” projects (which is what I expected).

This discrepancy raises a few questions:

  • How can I have Azure DevOps report the same results as SonarCloud within the Azure DevOps portal? Is this actually possible?
  • Is there a different code coverage configuration we should be using instead in order to achieve consistency? It seems that coverlet might be the better option, but I’m not sure whether it will resolve the discrepancy between Azure DevOps’ and SonarCloud’s interpretations of the results (see the sketch after this list).
  • Is there a recommended Visual Studio plugin to see code coverage results in a similar fashion to what SonarCloud provides? We don’t have VS Enterprise for developers (so Microsoft’s default plugin is not an option here), and I’ve been evaluating a tool called Fine Code Coverage, which seems to support a majority of the different code coverage collection methods. However, I am unable to get the results of the Microsoft default calculation (i.e. --collect "Code Coverage") to match what SonarCloud provides.
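
If coverlet does turn out to be the better route, I’m assuming the switch would look roughly like the sketch below (untested on my side; it assumes the coverlet.collector package is referenced by the test projects, and the report path pattern is just a guess):

    # Collect coverage with coverlet in OpenCover format instead of the built-in collector
    dotnet test --collect "XPlat Code Coverage" --results-directory $(Agent.TempDirectory) -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=opencover

    # Then point SonarCloud at the OpenCover report via the "prepare" step's additional properties
    sonar.cs.opencover.reportsPaths=$(Agent.TempDirectory)/**/coverage.opencover.xml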

Hey there.

Just to provide a Sonar perspective, it sounds like the results you’re getting on SonarCloud are the ones you expect, rather than the ones Azure DevOps reports.

So this is probably a better question for the Azure DevOps or Visual Studio community.

But another user may have faced this and found a solution, so fingers crossed somebody chimes in!

I agree with your response, and it’s possible that my issue might be better suited for the Azure DevOps or VS community. However, I was curious what rules/filters SonarCloud applies to produce the results I am seeing, if we are not providing any of our own.

For example, are you automatically generating a “runsettings” file with pre-defined filters, or something else? The reason I ask is that it gives me a starting point to at least match our IDE to SonarCloud. Otherwise, I’d start looking at different coverage collection providers entirely (e.g. coverlet), which might produce different results. When those reports reach SonarCloud, I’d then need to explain why they differ from what SonarCloud does by default.

My overall goal is to provide a seamless/turn-key experience for our developers, so I decided to treat SonarCloud’s “interpretation” of the code coverage results as the “source of truth”. Hopefully this makes sense in terms of what I am trying to achieve.

I’m happy to lift the curtain a little bit.

SonarCloud relies on the coverage reports sent during analysis to determine which lines are covered and which lines are not.

The exception is when no information is provided for an entire file, in which case SonarCloud tries to determine which lines are “executable” and thus could have been covered by test code.

It does this because, in multi-language projects, it’s not accurate to say that a project that has 100% C# Code Coverage but 0% JavaScript Code Coverage has 100% code coverage just because only a C# coverage report has been provided. Equally, if there’s no coverage reported for a C# project in a solution… it’s not fair to ignore that code either.

You can set Code Coverage Exclusions to exclude files from code coverage analysis in SonarCloud. Here, it’s necessary to duplicate anything configured in a .runsettings file that may be preventing some files from being reported in the code coverage report at all.
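
To make that concrete, here’s a purely illustrative example (the assembly name and path patterns are made up): if a .runsettings file keeps an assembly out of the Microsoft collector’s report entirely, like this:

    <!-- Illustrative .runsettings: the hypothetical "MyCompany.Legacy" assembly never appears in the coverage report -->
    <RunSettings>
      <DataCollectionRunSettings>
        <DataCollectors>
          <DataCollector friendlyName="Code Coverage">
            <Configuration>
              <CodeCoverage>
                <ModulePaths>
                  <Exclude>
                    <ModulePath>.*\\MyCompany\.Legacy\.dll$</ModulePath>
                  </Exclude>
                </ModulePaths>
              </CodeCoverage>
            </Configuration>
          </DataCollector>
        </DataCollectors>
      </DataCollectionRunSettings>
    </RunSettings>

then the same files should be listed in SonarCloud’s Code Coverage Exclusions (or the equivalent analysis parameter) so their executable lines aren’t counted as uncovered:

    # Illustrative mirror of the .runsettings exclusion above
    sonar.coverage.exclusions=**/MyCompany.Legacy/**/*.cs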

I also face the same issue with Azure pipeline code coverage and SonarCloud code coverage.
I am using dotnet; SonarCloud shows 68% but the Azure pipeline shows 75%. These are my analysis properties:

    sonar.cs.opencover.reportsPaths="/agent/_work/_temp/**/coverage.opencover.xml"
    sonar.coverage.jacoco.xmlReportPaths="/agent/_work/_temp/**/coverage.opencover.xml"
    # Restrict the analysis to the C# language
    sonar.language=cs
    sonar.cs.analyzer.projectOutPaths=**/bin/**,**/obj/**
    sonar.verbose=true
    sonar.exclusions=**/*.xml,**/Migrations/**
    sonar.inclusions=**/*.cs
    sonar.pullrequest.provider="Azure DevOps Services"