Multi-job Azure DevOps Pipeline

Hello! I have a multi job AzDo build pipeline that’s roughly set up in the following way:

jobs:
- job: SonarCloudPrepareAnalysis
  displayName: 'SonarCloud Prepare Analysis'
  steps:
  - task: SonarSource.sonarcloud.14d9cde6-c1da-4d55-aa01-2965cd301255.SonarCloudPrepare@1
    displayName: 'Prepare analysis on SonarCloud'
    inputs:
      SonarCloud: [private repo, redacted]
      organization: [private repo, redacted]
      projectKey: [private repo, redacted]
      projectName: [private repo, redacted]
- job: BuildAndRunBackendTests
- job: BuildAndRunFrontEndTests
- job: DockerBuild
- job: SonarCloudAnalysis
  displayName: 'SonarCloud Analysis'
  steps:
  - task: SonarSource.sonarcloud.ce096e50-6155-4de8-8800-4221aaeed4a1.SonarCloudAnalyze@1
    displayName: 'Run Code Analysis'

  - task: SonarSource.sonarcloud.38b27399-a642-40af-bb7d-9971f69712e8.SonarCloudPublish@1
    displayName: 'Publish Quality Gate Result'
When running the above, I'm currently getting the error "The 'Prepare Analysis Configuration' task was not executed prior to this task", presumably because my SonarCloudPrepareAnalysis, build, and Run Code Analysis steps all live in different jobs, and thus run on different agents.

Is there any way to implement SonarCloud in a multi-job pipeline that doesn't involve publishing a large chunk of artifacts? If not, exactly which artifacts are required for the Run Code Analysis task?

I noticed that a potentially similar issue was already posted here, for reference.

Thank you very much in advance, I’m excited to start playing around with some of the tools you guys have available!

Hi @tjackson, welcome to the community.

Unfortunately, that is not something we currently support, for quite a few reasons:

  • Integrating with the .NET / MSBuild world: the Prepare Analysis Configuration task installs some build targets. Even if that doesn't apply to your project, it's something to consider.
  • We rely on file paths to compute some metrics (e.g. coverage), and these are hard to reconstruct on a user's system if the location where the files were analyzed differs from where they sat on the filesystem originally.
  • We also use environment variables to communicate between the two tasks, and we don't expect them to be passed across multiple jobs, as that can affect their values.

That makes it quite challenging for us, though I'm not saying we will never support it in the future.
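For reference, the supported layout keeps the whole analysis cycle inside one job, so Prepare, the build, Analyze, and Publish all run on the same agent. A minimal sketch, assuming a .NET solution built with the VSBuild task; the service connection, organization, and project values are placeholders:

```yaml
jobs:
- job: BuildAndAnalyze
  displayName: 'Build and SonarCloud analysis'
  steps:
  # Must run BEFORE the build, on the same agent: it installs the
  # MSBuild targets and sets the environment variables the later
  # SonarCloud tasks read.
  - task: SonarCloudPrepare@1
    inputs:
      SonarCloud: 'my-sonarcloud-connection'  # placeholder service connection
      organization: 'my-org'                  # placeholder
      scannerMode: 'MSBuild'
      projectKey: 'my-project'                # placeholder
      projectName: 'My Project'               # placeholder
  - task: VSBuild@1
    inputs:
      solution: '**/*.sln'
  - task: SonarCloudAnalyze@1                 # scanner end step
  - task: SonarCloudPublish@1                 # waits for the quality gate result
    inputs:
      pollingTimeoutSec: '300'
```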

Hope that clarifies things.

Thanks !

Hi @mickaelcaro !

I understand. What would the proper workaround be in my case? From what I can glean, I have two options:

  1. Add the Prepare task to every job that contains an MSBuild step, publish the artifacts, and aggregate them at the end of the pipeline for analysis
  2. Add both the Prepare and Analysis hooks to each job that contains an MSBuild step

Is there a best practice here? The post I linked in the OP implies that #1 may be difficult, bordering on impossible at present; with #2, I'm concerned that results will get overwritten.
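To make sure I understand option #2, here's roughly what I have in mind: each build job becomes fully self-contained, carrying its own Prepare, Analyze, and Publish steps. This is only a sketch; the job name, build task, and SonarCloud input values are illustrative, and the per-job projectKey is my guess at how to keep one job's results from overwriting another's:

```yaml
jobs:
- job: BuildAndRunBackendTests
  displayName: 'Backend build, tests, and analysis'
  steps:
  - task: SonarCloudPrepare@1
    inputs:
      SonarCloud: 'my-sonarcloud-connection'  # placeholder service connection
      organization: 'my-org'                  # placeholder
      scannerMode: 'MSBuild'
      projectKey: 'my-project-backend'        # distinct key per job (assumption)
      projectName: 'My Project (backend)'
  - task: VSBuild@1
    inputs:
      solution: 'backend/**/*.sln'            # illustrative path
  - task: SonarCloudAnalyze@1
  - task: SonarCloudPublish@1
# ...repeated with its own key for the frontend job, etc.
```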

Thanks!
Tyler