SonarCloud analysis gets stuck at 12/13 files

Hi @SoyRonyVargas,

Thanks for this update. It helps. We’re still figuring out the root cause, but this helps us home in.

 
Ann

Thank you very much, I’m keeping an eye on it.


Hi all,

We are currently unable to reproduce this, and need more input. (See my earlier request for thread dumps.) In light of that, and since there’s a workaround:

Our current plan is to deploy an update on Monday that adds copious logging (via sonar.verbose=true) around this part of the process, and then ask you all to re-post your logs once we’ve done that.

 
Ann


Hi Ann,

We have been facing the exact same issue since yesterday evening, but only found this thread now.
Attached is a thread dump from one of our machines; I hope it helps in finding the root cause.
sonar-scanner_thread_dump.txt (38.0 KB)

Thanks for the workaround, worked like a charm.

Best Regards,
Tim.

We encountered exactly the same issue. However, after adding the parameter -Dsonar.scm.use.blame.algorithm=GIT_FILES_BLAME, it started working for us.
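For anyone running the scanner from the command line, the workaround can also be passed directly as an analysis property; a sketch (the project key below is a placeholder, and the property can equally go in sonar-project.properties):

```shell
# Pass the alternate blame algorithm as an analysis property.
# "my-project" is a placeholder project key.
sonar-scanner \
  -Dsonar.projectKey=my-project \
  -Dsonar.scm.use.blame.algorithm=GIT_FILES_BLAME
```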


@ganncamp

Hello, good morning. We’re still experiencing the same issue today: the file scan gets stuck.

13:54:30.985 INFO  12/13 source files have been analyzed

Hi @SoyRonyVargas,

Can you try the workaround?

 
Ann

@ganncamp

Yes, but does that have any different impact compared to not including it?

Hi,

As you might guess from the parameter name, it uses a different blame algorithm than the default. I’m not briefed on the low-level differences. The idea was to unblock users.

 
Ann

Hello Ann,

We are also having the “analysis gets stuck at files” issue.
The workaround with sonar.scm.use.blame.algorithm=GIT_FILES_BLAME is working for us.

Attached is our log file.
sonarclouddump.txt (109.9 KB)


Hi! Confirmed workaround:

```yaml
- name: SonarCloud Scan
  uses: SonarSource/sonarqube-scan-action@v5.2.0
  env:
    SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
  with:
    args: >
      -Dsonar.verbose=true
      -Dsonar.scm.use.blame.algorithm=GIT_FILES_BLAME
```

+1

We have also started to be affected by this new bug: builds hang on the final file during analysis.

We’re using Azure DevOps.


For Azure DevOps, I can confirm the temporary workaround got us back up and running: adding sonar.scm.use.blame.algorithm=GIT_FILES_BLAME to the SonarCloudPrepare task. I’ve included the full example in case it helps anyone else.

```yaml
- task: SonarCloudPrepare@3
  displayName: 'Prepare SonarCloud'
  inputs:
    SonarCloud: '$(tfsSonarCloudServiceConnectionName)'
    organization: '$(sonarCloudOrganisation)'
    scannerMode: 'dotnet'
    projectKey: '$(projectName)'
    projectName: '$(projectName)'
    projectVersion: '$(Build.BuildNumber)'
    extraProperties: |
      sonar.exclusions=$(sonarCloudExclusions)
      sonar.test.exclusions=$(sonarCloudTestExclusions)
      sonar.cs.opencover.reportsPaths=$(Agent.TempDirectory)/**/coverage.opencover.xml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
      sonar.projectBaseDir=$(System.DefaultWorkingDirectory)/$(projectName)
      sonar.scm.use.blame.algorithm=GIT_FILES_BLAME
```

It appears sonar.scm.use.blame.algorithm has two possible values:

  • GIT_FILES_BLAME (uses JGit)
  • GIT_NATIVE_BLAME (uses git)

See Scan on git blobless clone failed - #11 by Julien_HENRY
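As I understand the linked thread (hedged; this is my reading, not official documentation), GIT_NATIVE_BLAME shells out to the git executable, while GIT_FILES_BLAME computes blame in-process with JGit, a pure-Java implementation, so it doesn’t depend on a git binary on the build agent. Roughly (the file path is a placeholder):

```shell
# GIT_NATIVE_BLAME: the scanner shells out to the git CLI,
# roughly equivalent to running this per file:
git blame --porcelain -- src/Main.java

# GIT_FILES_BLAME: blame is computed in-process via JGit (pure Java),
# so no git executable is required on the agent.
```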

Hi all,

We’ve deployed a new version to production (sorry for the delay!). Could you please briefly:

  • remove -Dsonar.scm.use.blame.algorithm=GIT_FILES_BLAME
  • add -Dsonar.scanner.scm.echoAll=true

on your analysis command line and post the resulting log?

Note that this backs out the workaround. So the problem comes back. But (hopefully) it gives us a better chance of understanding the problem and fixing it long-term.

 
Thx!
Ann

Hi Ann,

The new production version seems to have fixed it for us. We don’t have to use the workaround anymore.

Thanks!


Hi @ywong,

Thanks for the update!

Are you saying that once you add the extra logging, the problem goes away? Or is everything fine without the workaround and without the extra logging?

 
Thx,
Ann

It’s working again without the workaround and without the extra logging, so no configuration of

-Dsonar.scm.use.blame.algorithm=GIT_FILES_BLAME
-Dsonar.scanner.scm.echoAll=true

Hi. We are currently experiencing this exact issue. We are using Sonar via the Azure DevOps Pipelines task extension. Have the changes made to the production version affected the CI task? Wondering if we need to implement the workaround or wait until it’s resolved. Thanks.

Hi @martinC,

Can you add sonar.scanner.scm.echoAll=true to your analysis properties and post the log here, please?

 
Thx,
Ann

Same here, with SonarCloud Enterprise in Azure DevOps. However, here’s something strange that might help pinpoint where the issue is. We have a default branch (develop) with a scheduled pipeline that runs a SonarScan (among others) each night. Since last night (14 hours ago), it fails on the SonarCloud Analyze step, emitting a log line every second:

INFO 1/2 source file have been analyzed

The pipeline has a 60-minute timeout, so it is eventually killed.
This behavior is still happening.

However, if I run a branch pipeline, the SonarCloud Analyze step runs just fine in about 1 min 30 s.

Other differences:
The develop branch has a number of issues detected in SonarCloud, while the branch has none (no new issues).