Azure Pipelines with multiple jobs splitting up build/test and SonarQube analysis

Is there any advice for Azure DevOps users who are attempting to parallelize a pipeline using jobs/stages? For example, if I’m using OWASP Dependency-Check alongside SonarQube, I’d like to run the build/test of the solution in parallel with the dependency-check processing, then fan in to perform the SonarQube analysis. Dependency-Check publishes its small result files to Azure Artifacts easily, but SonarQube preparation and the subsequent build/test scatter files all over the workspace; those would have to be published as artifacts and then re-downloaded into a relatively clean workspace for the “finalization” job. The .sonarqube directory is nicely contained, but there are also:

  • RoslynCA.json files stored deep in the build output of C# projects
  • Test execution result files (e.g. *.trx)
  • Code coverage result files (e.g. SonarQube.xml, OpenCover output, etc.)

I’m sure there are others too.
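For reference, the fan-out/fan-in shape I have in mind looks roughly like this (job names, steps, and commands are all illustrative, not a working pipeline):

```yaml
# Hypothetical azure-pipelines.yml topology: build/test and dependency
# check run in parallel, then a final job fans in for SonarQube analysis.
jobs:
- job: BuildAndTest
  steps:
  - script: dotnet build MySolution.sln      # placeholder solution name
  - script: dotnet test MySolution.sln

- job: DependencyCheck
  steps:
  - script: ./run-dependency-check.sh        # placeholder wrapper script

- job: SonarAnalysis
  dependsOn:                                 # fan-in: waits on both jobs
  - BuildAndTest
  - DependencyCheck
  steps:
  - script: echo "somehow reassemble the workspace and run the analysis here"
```

The sticking point is that last job: reassembling the scattered analysis inputs into a workspace that looks like the one the build ran in.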

The ones that scare me the most from the above list are the RoslynCA.json files, which live in folders that wouldn’t even exist in the final job’s “clean”, un-built folder structure.

You could upload everything from build/test to artifacts, but that is way too much data.

Note that I’m using Azure DevOps self-hosted agents, in case that helps my cause at all.

Any advice would be appreciated.


Hi @Ross_Beehler

Sorry for the late reply.

While uploading the files needed for analysis at a given point in time is a good idea, this is, as you may know, not currently supported.

Because the analysis relies closely on the file paths recorded in the various coverage and Roslyn JSON files, there is currently no real way to achieve what you want other than performing a final build/analysis with all of those files in the same place.
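So for now, the reliable shape is a single job that keeps the prepare, build, test, and analyze steps in one workspace. A minimal sketch, assuming the SonarQube Azure DevOps extension tasks and the MSBuild scanner mode (service connection name, project key, and solution name are placeholders):

```yaml
# Single-job pipeline: everything SonarQube needs stays in one workspace.
jobs:
- job: BuildTestAnalyze
  steps:
  - task: SonarQubePrepare@5
    inputs:
      SonarQube: 'MySonarQubeConnection'   # placeholder service connection
      scannerMode: 'MSBuild'
      projectKey: 'my-project'             # placeholder project key
  - script: dotnet build MySolution.sln    # placeholder solution name
  - script: dotnet test MySolution.sln --logger trx
  - task: SonarQubeAnalyze@5               # picks up .sonarqube, RoslynCA.json,
                                           # *.trx, coverage, etc. in place
  - task: SonarQubePublish@5
    inputs:
      pollingTimeoutSec: '300'
```

The Dependency-Check job can still run in parallel; it is only the SonarQube-related steps that need to share a workspace.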

This is something we have in mind, but there’s no ETA to give you on that.