SonarCloud analysis registration not always cleaned up properly

Hi Duncan,

Thank you for the explanation. I’ll try adding a task that runs regardless of the job’s result, so the .sonarqube folder is always removed at the end of the job performing the analysis, and see whether this prevents the SonarQube analysis from leaking into subsequent jobs. I’ll get back to you with the results.
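Something like the following is what I have in mind — a rough sketch only, assuming a YAML pipeline on a Windows agent (our real pipeline may use slightly different task syntax):

```yaml
# Sketch: cleanup step that runs regardless of the job result.
- task: PowerShell@2
  displayName: Remove .sonarqube folder
  condition: always()   # run even if earlier steps failed
  inputs:
    targetType: inline
    script: |
      $sonarDir = Join-Path '$(Agent.BuildDirectory)' '.sonarqube'
      if (Test-Path $sonarDir) {
        Remove-Item -Recurse -Force $sonarDir
      }
```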

So, I’ve tried removing the folder, but I cannot, because apparently the files in this folder are still in use while the job is running.

So it seems that some process is still running and locking the files in this folder after the code analysis is done. What are our options here?

  • Cleaning the workspace is not really an option: we can only clean the workspace at the start of a pipeline, so cleaning the workspace of the job that runs the analysis does nothing for subsequent jobs that don’t use SonarCloud at all.
  • Adding a script step that removes the .sonarqube folder at the start of the pipeline only helps if I add it to ALL pipelines we have, which is not a viable option.

It is starting to become quite a hassle to get this integration to work properly. I don’t think it should be our responsibility to clean up files that the SonarCloud tasks leave behind after a build job.

I’ve now changed the cleanup task to remove only the $(Agent.BuildDirectory)\.sonarqube\bin\targets folder, which doesn’t cause “file in use” issues. Interestingly, in the very first run this folder still existed, even though the analysis task ran and finished correctly.

I’ll merge this change and see if the leakage stays away.
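For the record, the narrowed-down step now looks roughly like this (same assumptions as the earlier sketch):

```yaml
# Sketch: only remove the targets subfolder, which is not locked.
- task: PowerShell@2
  displayName: Remove .sonarqube targets folder
  condition: always()
  inputs:
    targetType: inline
    script: |
      $targetsDir = Join-Path '$(Agent.BuildDirectory)' '.sonarqube\bin\targets'
      if (Test-Path $targetsDir) {
        Remove-Item -Recurse -Force $targetsDir
      }
```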

@PaulVrugt

It is expected that the folder exists even if the analysis task runs to completion. It is only the contents of the folder that are removed, not the folder itself.

So, I’ve tested it, and I had results very quickly. An agent ran a job that uses the SonarCloud integration, including the step that removes the targets folder ($(Agent.BuildDirectory)\.sonarqube\bin\targets). This job ran without problems; all was green.

Then the same agent ran a subsequent job that doesn’t use the SonarCloud integration. The SonarCloud analysis indeed did not fire, but the build did start spitting out warnings like:
Sonar: (xxx.csproj) The analysis targets file not found: s:\a\1\.sonarqube\bin\targets\SonarQube.Integration.targets

So there is still something left over from the previous job. I think there is something seriously wrong with the MSBuild SonarCloud integration. If you are not going to fix this, I’m now thinking I could switch to a CLI integration instead of the MSBuild integration for the .NET analysis. However, I think this will require me to manually specify all projects that are to be scanned, since I cannot use a solution file as I do with MSBuild. Is this a viable alternative for analysing a .NET solution and avoiding these leaks?

@PaulVrugt

As we already mentioned, the SonarQube.Integration.ImportBefore.targets will now look for SonarQube.Integration.targets and will not find it, since you have successfully deleted it. It will log the line you mentioned and will not interact with your build any further.

It’s an MSBuild message, not a warning. As @Caba_Sagi said, the presence of the “stub” targets will have no other effect on non-analysis builds apart from the message.

The Scanner for .NET is effectively a wrapper around the Scanner CLI, so you could use the Scanner CLI directly if you replicate everything the Scanner for .NET does, i.e. (see the rough sketch after this list):

  • create a ruleset/.editorconfig and SonarLint.xml file based on the current Quality Profile for the project
  • tweak the build so those generated files are used
  • make sure the Sonar C#/VB.NET analyzers are executed in the build step (they are written as Roslyn analyzers, which means they need to be executed by the compiler)
  • create a sonar-project.properties file that specifies the source and test files to be analysed, and the paths to the Roslyn issue log files and any code coverage files
  • execute the Scanner CLI
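As a very rough illustration of the last two bullets only: the property names are the standard Scanner CLI ones, but the project key, folder layout, report paths and the SONAR_TOKEN variable below are placeholders, and the Quality Profile / analyzer wiring from the earlier bullets is entirely omitted.

```yaml
# Illustrative sketch: write a minimal sonar-project.properties,
# then invoke the Scanner CLI directly. Placeholder values only;
# sonar.host.url / sonar.organization etc. are also omitted here.
- powershell: |
    @"
    sonar.projectKey=my-org_my-project
    sonar.sources=src
    sonar.tests=tests
    sonar.cs.roslyn.reportFilePaths=**/*.RoslynCA.json
    sonar.cs.vscoveragexml.reportsPaths=**/*.coveragexml
    "@ | Set-Content sonar-project.properties
  displayName: Write sonar-project.properties
# SONAR_TOKEN is an assumed pipeline secret variable.
- powershell: sonar-scanner "-Dsonar.token=$(SONAR_TOKEN)"
  displayName: Run Scanner CLI
```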