Is there any advice for Azure DevOps users who are trying to parallelize a pipeline using jobs/stages? For example, with OWASP Dependency Check and SonarQube, I'd like to run the build/test of the solution in parallel with the dependency check processing, then fan in to perform the SonarQube analysis. Dependency check is good about publishing its simple results to Azure Artifacts, but the SonarQube preparation and subsequent build/test scatter files all over the workspace; those would have to be published as artifacts and then re-downloaded into a relatively clean workspace for the "finalization" job. There is the nicely contained .sonarqube directory, for example, but then there are the:
- RoslynCA.json files stored deep in the build output for C# projects
- Test execution result files (e.g. *.trx)
- Code coverage result files (e.g. SonarQube.xml, OpenCover output, etc.)
I’m sure there are others too.
The ones that worry me most from that list are the RoslynCA.json files: they live deep in the build output, in folders that wouldn't exist in the final job's "clean", un-built folder structure.
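To make the problem concrete, here is roughly the kind of "gather" step I'd have to bolt onto the end of the build/test job (the patterns and the `sonar-state` artifact name are just my guesses at what would need to travel between jobs):

```yaml
# Collect only the SonarQube-relevant files, preserving their relative
# paths so they could be re-created at the same locations in the
# finalization job. CopyFiles@2 keeps the folder structure by default.
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: |
      .sonarqube/**
      **/RoslynCA.json
      **/*.trx
      **/*.opencover.xml
    TargetFolder: '$(Build.ArtifactStagingDirectory)/sonar-state'

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)/sonar-state'
    artifact: 'sonar-state'
```

Even if something like this works, I'm not confident the patterns above actually capture everything the SonarQube end step expects to find.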
You could upload everything from the build/test job as an artifact, but that is far too much data.
Note that I’m using Azure DevOps self-hosted agents, in case that helps my cause at all.
Any advice would be appreciated.
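For reference, the fan-out/fan-in shape I have in mind looks roughly like this (job and artifact names are placeholders of mine):

```yaml
jobs:
- job: BuildAndTest        # SonarQube prepare + build + test;
  steps: []                # would publish the 'sonar-state' artifact here

- job: DependencyCheck     # OWASP Dependency Check, runs in parallel
  steps: []

- job: SonarFinalize
  dependsOn:               # fan-in: waits for both parallel jobs
  - BuildAndTest
  - DependencyCheck
  steps:
  - download: current      # re-hydrate the published state into this
    artifact: sonar-state  # job's otherwise clean workspace
```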