Target a specific Quality Profile from Azure DevOps

Hi,

We are currently using SonarCloud from Azure DevOps in a couple of pipelines. We have it wired up to our pull request pipeline and also to a nightly pipeline that runs against our main branch.

Due to the length of time the taint analysis takes in our pull request pipeline, we are looking to disable the following rules in that pipeline and only have them run overnight in our nightly pipeline against the main branch.

(S2076, S2078, S2083, S2091, S2631, S3649, S5131, S5135, S5144, S5145, S5146, S5147, S5334, S5883, S6096, S6173, S6287, S6399, S6547, S6549, S6639, S6641, S6680, S6776, S7044)

We have configured a separate Quality Profile with those rules disabled; however, I can’t figure out whether there’s any way to configure the SonarCloudPrepare@3 task to pass in an extra property that would either:

  1. Run that specific Quality Profile, or alternatively
  2. Ignore those specific rules from the pipeline task

Failing that, I’m thinking we could potentially use the API to control which Quality Profile is active for the project in SonarCloud: during the day we’d make the profile without those rules active, then just before the evening run kicks off we’d switch back to the one with the complete ruleset.

Regards,

Hey there.

It’s not possible to specify a Quality Profile from the analysis itself. This configuration lives on SonarQube Cloud.

You can use POST api/qualityprofiles/add_project to do this programmatically.
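
For reference, here is a rough sketch of what that could look like as a step at the start of the pull request pipeline (the nightly pipeline would point qualityProfile at the full profile instead). The organization key, project key and profile name below are placeholders, and SonarToken is an assumed secret pipeline variable; double-check the parameter names against the Web API documentation before relying on this:

  # Hypothetical step: point the project at the reduced Quality Profile before the PR scan.
  # All values below are placeholders; SonarToken is an assumed secret pipeline variable.
  - bash: |
      curl --fail --silent --show-error -X POST \
        -u "${SONAR_TOKEN}:" \
        "https://sonarcloud.io/api/qualityprofiles/add_project" \
        --data-urlencode "organization=foo" \
        --data-urlencode "language=cs" \
        --data-urlencode "project=foo_sonar-scanning-someconsoleapp" \
        --data-urlencode "qualityProfile=PR profile without taint rules"
    displayName: 'Activate reduced Quality Profile'
    env:
      SONAR_TOKEN: $(SonarToken)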

That said, it’s sad to hear that the performance isn’t up to snuff! I’d like to know what language(s) you’re analyzing, and just how slow we’re talking here (5 minutes? an hour?).

Maybe there’s something else to investigate here.

Hi Colin,

Thanks for getting back to me. It’s a C# dotnet analysis that seems to take a long time. We’ve found that with those taint analysis rules enabled, our PR scans take a minimum of an hour, but will occasionally jump up to 3 hours for a week or two at a time.

For example, recently on Feb 6th all our SonarCloudAnalyze@3 tasks in the pull request pipeline jumped up to the 3+ hour mark, and then on Feb 12th they dropped back down to the 60 minute mark, which seems to be their average. I’m not sure if those dates coincide with any updates to SonarCloud, but it’s not the first time the analysis has jumped up to unusable durations for a week or so. If we remove those rules we get an analysis time of around 3 or 4 minutes. My instinct is that it’s potentially a memory issue that’s slowing it down. During the day we run our pull request pipelines on Microsoft-hosted agents (2-core CPU, 7 GB of RAM), but in the evenings, for our run on main, we can spin up a more powerful VM agent and run the scan once a night there.

Regards

Hey @chrsdy

Thanks for the additional details!

Nothing should have changed with these rules since early January.

Memory could be a factor, but not guaranteed.
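
If memory does turn out to be the bottleneck, one experiment (not a confirmed fix) would be to give the scanner’s JVM more heap in the analyze step, assuming your scanner version still reads the SONAR_SCANNER_OPTS environment variable:

  # Experiment only: raise the scanner JVM heap for the END step.
  # Assumes the embedded scanner reads SONAR_SCANNER_OPTS; verify for your version.
  - task: SonarCloudAnalyze@3
    env:
      SONAR_SCANNER_OPTS: '-Xmx3072m'  # arbitrary example value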

I can ask the team to take a look at your logs. Ideally, you would share:

  • Logs from a PR build
  • Logs from an analysis at the 60 minute mark
  • Logs from an analysis at the 3 hour+ mark

Share the Scanner for .NET verbose logs

  • Add /d:"sonar.verbose=true" to the…
    • SonarScanner.MSBuild.exe or dotnet sonarscanner begin command to get more detailed logs
      • For example: SonarScanner.MSBuild.exe begin /k:"MyProject" /d:"sonar.verbose=true"
    • “SonarQubePrepare” or “SonarCloudPrepare” task’s extraProperties argument if you are using Azure DevOps
      • For example:
        - task: SonarCloudPrepare@3
          inputs:
            SonarCloud: 'sonarcloud'
            organization: 'foo'
            scannerMode: 'dotnet'
            projectKey: 'foo_sonar-scanning-someconsoleapp'
            projectName: 'sonar-scanning-someconsoleapp'
            extraProperties: |
              sonar.verbose=true
        
  • The important logs are in the END step (i.e. SonarQubeAnalyze / SonarCloudAnalyze / “Run Code Analysis”)

I can open up a private message if you want to share them privately.

Thanks Colin,

If you could open up a private message I’d be happy to share them privately with you.

Unfortunately the 3+ hour analysis isn’t an issue right now, as it resolved itself on Feb 12th and we didn’t have verbose logging enabled at the time. However, I have some verbose logs from recent pull requests and non-verbose logs from a 3 hour run that I can share.

Regards
Chris

Hey @chrsdy,

Thank you for sharing the logs with us. The behavior indeed looks unexpected. I’ll also contact you in a private message to ask for some additional information that will hopefully help us debug this further.

Best,
Karim.