SonarCloud Analysis in Azure DevOps with Tests Run Across Multiple Agents

  • CI system used: Azure DevOps
  • Languages of the repository: C#

I’m working on enhancements to my CI pipeline. Currently the pipeline runs as a single job in roughly the following order:

  • Restore NuGet packages
  • Prepare SonarCloud analysis
  • Build solution
  • Run unit tests
  • Run acceptance tests
  • Run SonarCloud analysis
  • Publish artifacts

To speed this up I’ve separated this out into a few different stages and have parallelized the tests across multiple agents:

Build Test Artifacts (single agent)

  • Restore NuGet packages
  • Build test projects
  • Publish test artifacts

Run Unit Tests (multiple agents)

  • Download unit test artifacts
  • Run unit tests

Run Acceptance Tests (multiple agents)

  • Download acceptance test artifacts
  • Run acceptance tests

Build and Publish Application (single agent)

  • Restore NuGet packages
  • Build solution
  • Publish artifacts
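
The staged layout above could be sketched in pipeline YAML roughly as follows (stage, job, artifact, and solution names are illustrative, not taken from the real pipeline):

```yaml
stages:
  - stage: BuildTestArtifacts
    jobs:
      - job: Build
        steps:
          - task: NuGetCommand@2
            inputs:
              command: restore
          - script: dotnet build Tests.sln --no-restore
          - publish: $(Build.ArtifactStagingDirectory)
            artifact: test-artifacts

  - stage: UnitTests
    dependsOn: BuildTestArtifacts
    jobs:
      - job: Run
        strategy:
          parallel: 4                 # fan out across four agents
        steps:
          - download: current
            artifact: test-artifacts
          - script: dotnet vstest MyUnitTests.dll   # this agent's share of the tests

  - stage: AcceptanceTests
    dependsOn: BuildTestArtifacts
    jobs: []                          # same shape as UnitTests

  - stage: BuildAndPublish
    dependsOn:
      - UnitTests
      - AcceptanceTests
    jobs: []                          # restore, build solution, publish artifacts
```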

My question is, at what points should I now run the SonarCloudPrepare task and the SonarCloudAnalyze task?

In the documentation it states:

The analysis is always split into two parts in your build process; the begin step and the end step. In between, you perform the actual build and your tests. To enable coverage reporting, you need to make the following changes:

  • In the scanner begin step, add the appropriate parameter to specify the location of the coverage report file that will be produced.
  • Just after the build step but before the scanner end step, ensure that your test step produces the coverage report file.
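
On a single agent, that guidance maps onto the SonarCloud extension tasks in roughly this order (a sketch; the service connection, organization, project key, and coverage path values are placeholders):

```yaml
steps:
  - task: SonarCloudPrepare@1          # scanner "begin" step
    inputs:
      SonarCloud: 'my-service-connection'   # placeholder values
      organization: 'my-org'
      scannerMode: 'MSBuild'
      projectKey: 'my-project'
      extraProperties: |
        sonar.cs.vscoveragexml.reportsPaths=$(Agent.TempDirectory)/**/*.coveragexml
  - script: dotnet build MySolution.sln     # the actual build
  - task: VSTest@2                     # test step that produces the coverage report
    inputs:
      codeCoverageEnabled: true
  - task: SonarCloudAnalyze@1          # scanner "end" step
```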

But now that the build and tests are performed on separate agents, I don’t think I can adhere to this.

Hi Adam,

Unless something has changed recently, SonarCloud doesn’t support a scan split across multiple build agents. I had a similar issue: my company has a Visual Studio solution with multiple projects in it. I tried setting it up as a parallel run and it just didn’t work; looking at the forums later, I found it isn’t supported.

I originally ran the SonarCloud scan in one go, and this was too much for the Azure DevOps pipeline: it exceeded the available resources and ran past the allowed time. I started using a monorepo setup instead, where you have one repository containing lots of subprojects.

Monorepo support & SonarCloud (sonarsource.com)

While scanning the entire repository may take longer if you do it all at once (you’ll have to run multiple pipelines), only scanning the projects that changed results in smaller scan times, and PR gates become a lot quicker.

To make sure you don’t go over your line count, set up each subproject with project inclusions so you aren’t pulling in the scan results for each dependency project.
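
For example, each subproject’s prepare step can restrict analysis to that project’s own sources with the standard `sonar.inclusions`/`sonar.exclusions` parameters (the project keys and paths here are illustrative):

```yaml
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'my-service-connection'   # placeholder
    organization: 'my-org'
    scannerMode: 'MSBuild'
    projectKey: 'my-org_project-a'        # one SonarCloud project per subproject
    extraProperties: |
      sonar.inclusions=src/ProjectA/**
      sonar.exclusions=src/ProjectA/bin/**,src/ProjectA/obj/**
```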

Hi Adam, welcome to the community!

I think your pipeline should work as is.
The begin/end steps should wrap a build stage and be provided with a path to the coverage files.

If you build test artifacts first, run tests and collect coverage in parallel stages, then build again in the last stage and pass in the path to the coverage artifacts from all previous test stages (fan-out, fan-in), you should be OK.
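
As a sketch, assuming each test stage publishes its coverage files as a pipeline artifact named `coverage-*`, the last stage could look something like this (all names and paths are illustrative):

```yaml
- stage: BuildAndPublish
  dependsOn:
    - UnitTests
    - AcceptanceTests
  jobs:
    - job: Build
      steps:
        - task: DownloadPipelineArtifact@2   # fan-in: gather coverage from all test stages
          inputs:
            patterns: 'coverage-*/**'
            path: '$(Build.SourcesDirectory)/coverage'
        - task: SonarCloudPrepare@1
          inputs:
            SonarCloud: 'my-service-connection'   # placeholder
            organization: 'my-org'
            scannerMode: 'MSBuild'
            projectKey: 'my-project'
            extraProperties: |
              sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/coverage/**/*.xml
        - script: dotnet build MySolution.sln
        - task: SonarCloudAnalyze@1
        - publish: '$(Build.ArtifactStagingDirectory)'
          artifact: app
```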

You can use the standard .NET scanner coverage parameters to indicate a path to the coverage files. For example:

extraProperties: |
  sonar.cs.vscoveragexml.reportsPaths=folder/*.xml

or

sonar.cs.opencover.reportsPaths="$(Build.SourcesDirectory)/coverage/**/*.xml"

You can find the test coverage properties in the documentation.
The AZDO extension has some logic built in to automatically detect test results, but if you run the tests in an unexpected way, you can use these properties to point the scanner at your reports.

Hope this helps.

Denis.


Hi Denis,

Thanks very much for the detailed answer. That should really help. I’ll give it a shot this afternoon. Much appreciated.

Thanks,
Adam

Hi Denis,

Has support for multiple agents (different machines) been added? I would be interested if it has.

Looking at this post in 2021 it isn’t supported.
Multi job Azure DevOps Pipeline - SonarCloud - Sonar Community (sonarsource.com)

Kind regards

Andy Brady

Hi @andybrady,

You are right, there might be a problem here that @Adam_Kimberley might run into, unfortunately.

I thought his pipeline could work because there is only one Prepare/Build/Publish chunk, the last one.
Given that, I thought the first build just for tests followed by multiple parallel jobs for coverage could work, provided the coverage artifacts are downloaded in the last stage and their local path is given to the task.

However, one comment in the thread you linked to makes me think there may be an issue if the paths differ across agents, which could create problems with coverage.
I will ask the team here and see what they think.

Hi again,

I just confirmed with the team that we do in fact rely on local file paths both for analysis and coverage, so this trick will not work.

I will file that in the “interesting and annoying problem to look at” pile, though.

Sorry @Adam_Kimberley, I am afraid you are stuck with a serialized pipeline.

Thanks for the update. I didn’t get a chance to try your suggestion so wasn’t able to confirm whether it’d be an issue or not, but sounds like it’s a no-go. Thanks for putting this on the pile-to-look-at. Unfortunately, our team’s requirement for a faster pipeline currently trumps our requirement for SonarCloud, so in this case, we’ll probably just have to drop our use of SonarCloud for now - which is a real shame! Will stay tuned to see whether this gets fixed in the future. Thanks again for your help.

@Adam_Kimberley

There are ways around this. You can use a self-hosted machine, which can have more power assigned to it. From a cost perspective, I would recommend buying a server/workstation and installing the Azure DevOps agent locally. Something with 16 cores and 16–24 GB of RAM should keep run times at a reasonable level.

To set up a self-hosted agent, see the article below:
How to Setup a Self-Hosted Agent for an Azure DevOps Pipeline – andrewhalil.com.
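
Once the agent is registered, pointing a pipeline (or an individual job) at it is just a matter of naming the pool (the pool name here is illustrative):

```yaml
pool:
  name: SelfHostedPool   # the agent pool your self-hosted agent was registered in
```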

The second option is to set up your SonarCloud project as a monorepo and scan individual projects. This works out very well for us; we only scan the bits that change rather than the entire solution. It does take some thinking, but it works even on the small Microsoft-hosted machines.

@denis.troller If there is a feature we could request for future development, could we have support for parallel runs on Azure DevOps? The agents Microsoft provides are limited in power, but the potential to scale out is huge. This would probably be applicable to most cloud CI providers.

@andybrady Yes, I agree this would be a potential gain. We have recorded the need and will definitely look at it in the future.
