Failed to Publish

Template for a good new topic, formatted with Markdown:

  • ALM: Git on Azure DevOps Server (on-premise)
  • CI system used: Azure DevOps Server (on-premise)
  • Scanner command used when applicable (private details masked)
  • Languages of the repository: C#, XAML
  • Error observed (wrap logs/code around with triple quotes ``` for proper formatting)
  • Steps to reproduce: Run a build
  • Potential workaround: None

This is the error I get:

```
[ERROR] SonarQube Server: Error while executing task Publish: Task failed with status FAILED, Error message: There was an issue whilst processing the report. Please contact support with the Project Analysis ID: AZkDENqNV8ArEFXL4Bjb.
```

Commands:

```yaml
  - task: SonarQubePrepare@7
    displayName: 'SonarQube: Prepare analysis'
    inputs:
      SonarQube: 'SonarCloud'
      organization: '<snip>'
      scannerMode: 'dotnet'
      projectKey: '<snip>'
      projectName: '<snip>'
      projectVersion: '$(AssemblyVersion)'
      extraProperties: |
        sonar.host.url=https://sonarcloud.io
        sonar.verbose=true
        sonar.dotnet.excludeGeneratedCode=true
        sonar.exclusions=**/bin/**,**/obj/**,**/*.Designer.cs,**/*.generated.cs,**/AssemblyInfo.cs,**/CompositionAssemblyInfo.cs,**/CompositionReferences.cs,**/MicrosoftLoggerExtensions.cs,**/microsoft.net.test.sdk/**,**/*.xaml,**/*.xaml.cs,BL/Composition/Impl/Composition.cs,Common/**,Config/**,HAL/**,Integration/**,**/*.cpp,**/*.c,**/*.h,**/*.hpp,**/*.cc,**/*.cxx,**/*.hxx,**/*.xml,**/*.png,**/*.json,**/*.ttf,**/*.exe,**/*.dll,**/*.jpg,**/*.nhax,**/*.zip,**/*.ico,**/*.html,**/*.sln,**/*.pbin,**/*.ps1,**/*.dat,**/*.txt,**/*.yml,**/*.sqlite,**/*.dotsettings,**/*.props,**/*.md,**/*.PREVENT_GIT_LFS_ICO,**/*.xsd,**/*.nuspec,**/*.editorconfig,**/*.nupkg,**/*.targets,**/*.datasource,**/*.ruleset,**/*.xlsx,**/*.cd,**/*.bin,**/*.tt,**/*.docx,**/*.resx,**/*.sequencediagram,**/*.wsdl,**/*.runsettings,**/*.log4net,**/*.ttinclude,**/*.xsl,**/*.bat,**/*.layout,**/*.manifest,**/*.settings,**/*.psd1,**/*.svcinfo,**/*.url,**/*.cmd,**/*.crt,**/*.gitattributes,**/*.key,**/*.py,**/*.svcmap,**/*.vssettings,**/*.psm1,**/*.csprojtest,**/*.playlist,**/*.sample,**/*.gitkeep,**/*._
        sonar.test.exclusions=**/*Test*/**,**/*Tests*/**,**/*Test.cs,**/*Tests.cs,**/ComponentTests/**,**/UnitTests/**,**/IntegrationTests/**,**/RegressionTests/**,**/ComponentTest/**,**/UnitTest/**,**/IntegrationTest/**,**/RegressionTest/**
        # NB: **/SourceGeneratedDocuments/* being excluded does not impact the analysis
        sonar.scanner.scanAll=false
        sonar.scm.use.blame.algorithm=GIT_FILES_BLAME

  - task: MSBuild@1
    displayName: Build Phoenix
    inputs:
      solution: Phoenix.sln
      msbuildVersion: latest
      platform: '$(BuildPlatform)'
      configuration: '$(BuildConfiguration)'
      msbuildArguments: >-
        /p:ReleaseName=$(Wdh.ReleaseName)
        /p:ResultsFolder=$(Wdh.BinariesDirectory)
        /p:EnableNuget=False
        /p:SkipCopyUnchangedFiles=true
        /p:CopyRetryCount=500
      maximumCpuCount: true

  - task: VSTest@2
    displayName: 'Test'
    inputs:
      testAssemblyVer2: |
        s\**\bin\$(BuildConfiguration)\*.*(Unit|Component|Integration|Regression)Test.dll
        b\**\Bin.Test\**\*.*(Unit|Component|Integration|Regression)Test.dll
      searchFolder: '$(Agent.BuildDirectory)'
      testFiltercriteria: 'TestCategory!=LocalRunTest'
      runInParallel: true
      runTestsInIsolation: true
      codeCoverageEnabled: true
      publishRunAttachments: false
      failOnMinTestsNotRun: true
      runSettingsFile: $(Build.SourcesDirectory)\Build\BuildDefinitions\CodeCoverage.runSettings
      rerunFailedTests: false
    continueOnError: true

  - task: SonarQubeAnalyze@7
    displayName: 'SonarQube: Run analysis'

  - task: SonarQubePublish@7
    displayName: 'SonarQube: Publish result'
    inputs:
      pollingTimeoutSec: 3600
    condition: succeededOrFailed()
```

Hi @TimBarnett,

Welcome to the community.

I’ve checked the logs for this analysis ID and found a timeout exception while running one of the report processing steps. Did you try running the analysis again? It is usually a temporary error.

Cheers.

Javier

Hi Javier,

Thanks for the welcome and the response.

This build has been running and failing daily.

The only time we have had success is when we have sent a much smaller part of our solution to SonarQube Cloud.

Kind regards,

Tim

Hi @TimBarnett ,

Did you try to run the analysis in your CI instead of using the AutoScan?

Cheers,

Javier

We used to run the analysis with an on-premise SonarQube server.
What do I need to check to run the analysis in CI instead of AutoScan?

Hi @TimBarnett ,

Forget what I said. The analysis is failing when processing the report. The code is correctly analyzed in your CI.

Let me investigate a bit more.

Cheers.

Javier

Hi again @TimBarnett ,

Might you provide some more failing analysis IDs so I can gather more logs?

Thanks

Hi Javier,

Thanks for your help with this.
Here are some IDs for analysis of this solution - all from this month.
ID: AZlVdB5sgecrf-4eTzna
ID: AZlQTpDqZy_gRLDwdM7T
ID: AZlMW6mL8voOwITNb5SI
ID: AZlLKEy9WnJmXhdvAlE9
ID: AZk7tAf_6lDL3rbLdlQi
ID: AZk2kOvq04-vfYmfjBEg
ID: AZky4DHsPEkdXX_veURX
ID: AZkxaeqMbenILqaHiQ34
ID: AZksQ2-5W9-pdjsMiZqM
ID: AZknG6nS8ItxgxHtZ-AW
ID: AZkXrDuQLKMTLWSBECVv
ID: AZkSgDuNIbqXLiHX_8RA
ID: AZkNXKLUsbp22lAfldrw
ID: AZkINE1qND9go_QMXZn2

ID: AZkDENqNV8ArEFXL4Bjb

Thanks and kind regards,
Tim

Hi Tim,

I see this log trace:

```
Detect file moves | reportFiles=8356 | dbFiles=7290 | addedFiles=8352 | removedFiles=7286 | directMatches=0 | status=FAILED | time=600000ms
```

Is it possible that there really were that many file moves? It seems we are hitting a timeout after 10 minutes.

Cheers,

Javier
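As a side note (not something suggested in the thread), a quick local check with plain git can show whether recent history really contains large batches of renames. The commit count here is illustrative; run it in a clone of the repository:

```shell
# Count detected renames (R status) per commit over the last 50 commits.
# -M turns on git's rename detection; --diff-filter=R keeps rename entries only.
git log -50 -M --diff-filter=R --name-status --pretty=format:'COMMIT %h' |
  awk '/^COMMIT/ {c=$2} /^R/ {n[c]++} END {for (c in n) print c, n[c]}'
```

Commits with no renames produce no output line, so a long result with large counts would support the "many file moves" hypothesis.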

Hi Javier,

I’m not expecting that many files to be moved on a daily basis. Do I need to take steps to restrict the amount of history being sent to SonarQube?

Thanks,
Tim

Hi @TimBarnett ,

It seems that at some point there were a lot of file moves. Might you check it out?

I’m not sure how to skip that history so that a new analysis can run without considering such a large number of changes.

@Colin , might you help with this?

Cheers,

Javier

Hi,

I have restricted the fetch depth to 30 and I am still unable to publish to SonarQube Cloud.

The largest numbers of files changed across those thirty commits are 132 and 43.
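For reference, the shallow fetch described above is configured on the pipeline's checkout step; a minimal sketch (values illustrative):

```yaml
steps:
  - checkout: self
    fetchDepth: 30   # only fetch the last 30 commits for each build
```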

The latest build status (since reducing the depth to 30) is:
Task failed with status FAILED, Error message: There was an issue whilst processing the report. Please contact support with the Project Analysis ID: AZlbmX1tyk94AgGMyegI.

Thanks,

Tim

Hi @TimBarnett ,

I’m trying to get more help, but it seems there is no easy workaround at the moment.

Cheers,

Javier

Hey there @TimBarnett

Would it be possible for you to temporarily point analysis at a new project key and compare the results with your existing project? The goal would be to see whether the file paths are radically different, or whether more files are suddenly being included.

Hi @Colin,

I have a temporary project to use now, but I am trying to configure the build from a YAML file on a different branch (so I don’t have to wait for our full build and approval flow).

Even though I am trying to get it to use the master branch, SonarCloud shows that the master branch has not been analyzed.

Hi @Colin,

I now have it working on a temporary project.

What information can I give you that would be beneficial?

Or should I just switch to a new project (not a temporary one) in SonarCloud?

Take a look in the Code tab and compare the two projects. Do you see a change in the directory structure / file paths to get to a specific file?

For the temporary project, I see 600K lines of code, and I see four folders.

Those four folders are at the same level in the normal (not temporary) project, but there are other files and folders there too (as there should be for the full solution).

The main project only shows lines of code in the folder that was not excluded from analysis (from when I was working around the lines-of-code issue by getting only some of the code into SonarCloud).

Hi @TimBarnett

We are investigating what could cause so many “moved” files (this might be a configuration change on your side, or an unexpected change in the Scanner for .NET), and we are also discussing what we could do to avoid timing out the analysis when there are so many moved files.

This is not an easy topic, so in the meantime, if you don’t have much issue history (comments, accepted issues, …) or are fine with losing it, it would be simpler to restart from a clean SonarQube Cloud project.