AzDO SonarQubeAnalyze on pipeline with multiple jobs

Hi everyone,

I’m looking for guidance on a CI/CD setup question around SonarQube analysis, where tests and coverage generation are split across multiple jobs.

In a simple pipeline, a typical sequence is:

  1. SonarScanner begin (prepare)

  2. Build

  3. Run tests (collect coverage, outputs `.coverage` files)

  4. SonarScanner end (analyze/publish)
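For context, a minimal single-job Azure Pipelines sketch of this sequence might look like the following. The service connection name, project key, and task major versions are placeholders, and this assumes the SonarQube extension tasks and a .NET project built with the Scanner for MSBuild mode:

```yaml
# Single-job sketch: begin -> build -> test -> end, all in one agent workspace.
# Service connection, project key, and task versions are illustrative.
jobs:
  - job: BuildTestAnalyze
    steps:
      - task: SonarQubePrepare@5            # SonarScanner "begin" (prepare)
        inputs:
          SonarQube: 'MySonarQubeConnection' # placeholder
          scannerMode: 'MSBuild'
          projectKey: 'my-project-key'       # placeholder
      - task: DotNetCoreCLI@2                # build
        inputs:
          command: 'build'
      - task: VSTest@2                       # run tests
        inputs:
          codeCoverageEnabled: true          # emits .coverage files
      - task: SonarQubeAnalyze@5             # SonarScanner "end" (analyze)
      - task: SonarQubePublish@5             # publish quality gate result
        inputs:
          pollingTimeoutSec: '300'
```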

This works well when the build and tests run in the same job/agent workspace, but our test suites are distributed across multiple jobs. If we ran the SQ begin/end workflow in every job, each analysis would only contain the results of that single job, and the project on the server would reflect whichever job published last, leaving our coverage incomplete.

Our current workaround is a separate job that waits for all the test-suite jobs to complete, then downloads their test results, including the `.coverage` files produced by the VSTest tasks. The `.coverage` files are merged into a single `.coveragexml` using

dotnet-coverage merge -o <.coveragexml-output-path> -f xml <input-.coverage-files>

which is then passed to SonarQubeAnalyze.
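A sketch of that merge job in Azure Pipelines YAML could look like the following. The job names, artifact layout, and the `sonar.cs.vscoveragexml.reportsPaths` wiring are assumptions for illustration; adapt them to how your test jobs actually publish their `.coverage` files:

```yaml
# Merge-and-analyze job sketch; job names, paths, and artifact layout
# are hypothetical placeholders.
  - job: SonarAnalysis
    dependsOn: [TestSuiteA, TestSuiteB]      # hypothetical test jobs
    steps:
      - task: SonarQubePrepare@5
        inputs:
          SonarQube: 'MySonarQubeConnection' # placeholder
          scannerMode: 'MSBuild'
          projectKey: 'my-project-key'       # placeholder
          extraProperties: |
            sonar.cs.vscoveragexml.reportsPaths=$(Agent.TempDirectory)/merged.coveragexml
      - task: DownloadPipelineArtifact@2     # fetch .coverage files from test jobs
        inputs:
          path: '$(Agent.TempDirectory)/coverage'
      - script: |
          dotnet tool install --global dotnet-coverage
          dotnet-coverage merge -o $(Agent.TempDirectory)/merged.coveragexml -f xml $(Agent.TempDirectory)/coverage/**/*.coverage
        displayName: 'Merge .coverage files into one .coveragexml'
      - task: SonarQubeAnalyze@5             # picks up the merged report
      - task: SonarQubePublish@5
```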

Is there a way to run the prepare/begin and end/analyze steps so that they support multi-job test execution, avoiding the manual merging of `.coverage` results?

I think this topic: Provide test results from multiple steps discusses a similar issue to what I’m working on now, but since it’s 6 years old, I’d like to ask whether there have been any usability improvements on the SQ side since then?

Thanks!

Bruno.

Hi Bruno,

This is not a topic we’ve worked on in the last 6 years, and I’m not aware of any concrete plans to do so in the future. You’ll have to stick to your current methods.

 
Ann