Missing Blame info

Hi,

I have set up a project on SonarCloud to scan a git repo hosted on Azure DevOps. The build pipeline and scan work fine, but I don’t see any blame info when inspecting the code in SonarCloud (blame info is available in Azure DevOps and locally when running the `git blame` command). Here’s the log from the build:

INFO: SCM Publisher SCM provider for this project is: git
INFO: SCM Publisher 14 source files to be analyzed
INFO: Blaming files using native implementation
INFO: Blaming files using native implementation (done) | time=541ms
INFO: SCM Publisher 0/14 source files have been analyzed (done) | time=1088ms
WARN: Missing blame information for the following files:
WARN:   * (list of 14 files)
WARN: This may lead to missing/broken features in SonarCloud
INFO: CPD Executor 31 files had no CPD blocks
INFO: CPD Executor Calculating CPD for 117 files
INFO: CPD Executor CPD calculation finished (done) | time=117ms
INFO: SCM writing changed lines
INFO: SCM writing changed lines (done) | time=259ms
INFO: Analysis report generated in 2675ms, dir size=418 KB
INFO: Analysis report compressed in 213ms, zip size=182 KB
INFO: Analysis report uploaded in 174ms
INFO: ANALYSIS SUCCESSFUL, you can find the results at: ...

This is the first project I have set up in a while, and probably the first one using the new “onboarding” process. I have other git projects/repos in the same subscription that I set up over a year ago, and they show the blame info correctly.

Hey @alexvaccaro

Can you bump your log level up? In Azure DevOps, this is most easily done by setting the sonar.verbose=true analysis parameter.
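
For example, in the SonarCloudPrepare task that could look something like this (the service connection name and project key below are placeholders):

- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'MyServiceConnection'
    organization: 'myorg'
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'
    extraProperties: |
      # Raise the scanner log level to DEBUG
      sonar.verbose=true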

Thanks @Colin, see the attached log:
sonar_blame_build.log (3.0 MB)

Hello @alexvaccaro,

Two issues are known to commonly cause missing blame information when the native Git implementation is used:

  • uncommitted changes on the file
  • shallow clone

Could you verify whether you’re in either case?

Could you change the build script to run a `git status` command on the reported files, then capture and share the output?
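
A step along these lines should do it (a minimal sketch; the display name is arbitrary):

- task: CmdLine@2
  displayName: 'Capture git state'
  inputs:
    script: 'git status'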

Cheers!

Hi @sns-seb, the `git status` run just before the SonarCloud prepare task shows:

HEAD detached at a2cc911
nothing to commit, working tree clean

Apparently a detached HEAD is to be expected on Azure DevOps Pipelines; see Detached Head while building pipeline on Azure DevOps - Stack Overflow.

Perhaps I have not understood what you mean by “uncommitted changes”, but if the pipeline is building from a repo hosted on Azure DevOps, how would it be possible to have uncommitted changes? By definition, you cannot sync or push uncommitted changes to the remote.

@Colin, did you spot anything in the log file I shared?

Hello @alexvaccaro,

I looked at the log file you shared, which led me to the questions I asked in the previous post.

`git status` won’t show whether the clone is shallow. You can try the methods suggested here to find out.
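
For example, either of these should tell (the first needs Git 2.15 or newer, the second a Unix-like shell):

git rev-parse --is-shallow-repository                            # prints "true" in a shallow clone
test -f "$(git rev-parse --git-dir)/shallow" && echo "shallow"   # checks for the shallow marker file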

Any process could run after the clone and modify files, which would lead them to appear as having uncommitted changes locally.
However, the `git status` output you shared seems to exclude this possibility.

Looking at the logs and the list of files after WARN: Missing blame information for the following files:, can you identify a pattern? The content of that list seems to change (it is not the same in your initial post and in the log file you shared). Maybe that could lead you to something the files have in common.

cheers,

Hi @sns-seb,

OK, I understand. My pipeline doesn’t perform any additional git operations or changes to the files; it is a straight build from code with added SonarCloud integration.

The warning lists the files that were changed by the commit that triggered the pipeline.
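
For what it’s worth, that list should match the output of something like this, run against the commit being built:

git diff-tree --no-commit-id --name-only -r HEAD   # files touched by the commit at HEAD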

Here’s the YAML pipeline script, if that helps:

pool:
  name: Azure Pipelines
  vmImage: 'windows-latest'
  demands:
  - msbuild
  - visualstudio

variables:
  BuildPlatform: 'any cpu'
  BuildConfiguration: 'release'
  SolutionName: MySolution.sln

steps:

- task: NuGetAuthenticate@1
  inputs:
    nuGetServiceConnections: 'Production NuGet Feed'

- task: NuGetCommand@2
  displayName: 'Restore NuGet packages'
  inputs:
    restoreSolution: $(SolutionName)
    feedsToUse: config

- task: CmdLine@2
  inputs:
    script: 'git status'
    
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'SonarCloud_myorg2'
    organization: 'myorg'
    scannerMode: 'MSBuild'
    projectKey: 'myorg_Portal_MySolution.Services'
    projectName: 'Web'
    extraProperties: |
      # Additional properties that will be passed to the scanner, 
      # Put one key=value per line, example:
      # sonar.exclusions=**/*.bin
      sonar.verbose=true

- task: VSBuild@1
  displayName: 'Build Solution'
  inputs:
    solution: $(SolutionName)
    vsVersion: latest
    platform: '$(BuildPlatform)'
    configuration: '$(BuildConfiguration)'

- task: SonarCloudAnalyze@1

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)\ArtifactFiles\Acme.MySolution'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(Build.SourcesDirectory)\ArtifactFiles\MySolution.Service.zip'
    replaceExistingArchive: true

- task: CopyPublishBuildArtifacts@1
  displayName: 'Copy Publish Artifact'
  inputs:
    CopyRoot: '$(Build.SourcesDirectory)\ArtifactFiles'
    Contents: MySolution.Service.zip
    ArtifactName: MySolution.Service
    ArtifactType: Container

I have also checked for a shallow clone with the command `git rev-parse --is-shallow-repository`, and the result is false.

Hello @alexvaccaro,

You provided useful information by ruling out some likely causes.

It appears we are lacking details in the logs (even in DEBUG) to figure out the problem.
Thanks to all the details you provided, we will try to reproduce the problem and get further information.

I’m personally off from tonight until September. Someone else will take over this thread and will keep you informed of progress.

cheers,

Hi,

We are still investigating this issue. So far, it seems to happen only on Windows agents, and not on Linux agents.
I’ll keep you posted.

HTH,
Claire

Hi,

There is an issue on our end, specific to Windows agents. A fix is being validated right now and should be deployed this week. I’ll keep you posted here when the fix is available, so you can confirm that everything is back on track.

Thanks for your patience,
Claire

Hi,

A fix has been deployed today. Could you please confirm whether the blame information is now working as expected?

Claire

Hi @Claire_Villard, yes, I can see the blame info now.
Many thanks to you and your team for fixing this issue.
