Second Sonar quality gate check is over 1000% slower than the first for no apparent reason

The second and later Sonar quality checks for a branch or PR are, for us, more than 1000% slower than the first build, even if the code/branch stayed the same. The first check currently usually takes less than 1 minute, and the later ones take more than 10-20 minutes. To get the original speed back, we can manually delete the branch/PR, and then the next build is fast again. Forcefully changing the branch on each build also speeds it up, but then there is no real advantage to buying a Developer Edition when you can’t properly use the branch/PR feature.

We use Sonar Developer Edition (9.8.0.63668) and a reference branch for new code detection. The affected project is a C# project, and the problem is getting worse and worse over time.

The extra time is being spent in the “Detect file moves” step. According to the log, it also detects 4 file additions there in a case where there was just one commit since the reference branch, with just a single file changed and no file added.

Since it occurs only on the second and later builds and not on the first, it is obviously a bug. It makes us dislike Sonar more and more, since it has existed for a very long time, it has gotten worse and worse over time, and it currently slows down our reviewing and merging process.

I have prepared log files with exact times and the relevant entries, but I don’t want to post those in a public place like this.

Please contact me privately if you need them.

Hi,

Welcome to the community!

Can you characterize your project in terms of size?

  • How many LoC?
  • How many files?
  • How many LOC in the largest file?
  • How many LOC in the average file?

 
Thx,
Ann

Hello Ann,

thanks for your reply. Regarding your questions:

35k LoC
1k files
6k LOC in largest file
35 LOC in average file

As mentioned, the interesting part is that the performance of the Sonar quality check is good enough on the first quality gate check of a branch/PR, so the project size cannot be the issue. The issue is that Sonar seems to do some unnecessary extra work on the second and later quality checks of the same branch/PR, and that work takes a lot of time.

Hi,

Thanks for these details. Can you try excluding that 6k LOC file? (Not forever, but as a test?) Because actually, I think this is about size. You’ve already narrowed this down to the file move detection step. I think that algorithm is choking on the large files.

And you’re not seeing the slowdown in the second analysis because there’s nothing to compare moves against.

 
Ann

To clarify: the slowdown is absent only in the first analysis; the second and further analyses have the slowdown.

I still find it strange, because it is not supposed to compare with the last analysis but with the reference branch, which hasn’t changed (based on our setting for new code detection).

So it might even be a very simple fix for your developers, as they would just need to remove that unnecessary call when new code detection is based on a reference branch and not on the previous analysis.

The file cannot easily be excluded, as it is an IDE-generated file with translation-text-based properties that get referenced from almost everywhere. I also think it is very common for C# projects to have such files.

Hi,

Let’s say you rename A.cs to B.cs. Should all its old issues show up as new? Probably not. That’s what file move detection is about. It’s not something we can just turn off.

Obviously, you can’t not build it. But we always advise excluding libraries and generated files from analysis. After all, if an issue is raised in this file, what are you going to do about it? Rewrite the generator? Probably not. I suggest you set a file exclusion and retry.
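For example, something like this at the scanner’s begin step (a sketch only; the project key and the *.Designer.cs pattern are placeholders, adjust them to match your actual generated file):

SonarScanner.MSBuild.exe begin /k:"YourProjectKey" /d:sonar.exclusions="**/*.Designer.cs"

The **/ prefix makes the pattern match at any directory depth.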

 
Ann

Yes, I understand that you need to keep track of which issues were marked as resolved. However, this is also true for issues marked as resolved in the reference branch. Since this is fast for the first branch/PR build, it should also be fast for the second build. It should not make a difference whether this information is taken from the reference branch (first build) or from the previous build (second build).
=> So I still think there is some kind of bug causing this

Thanks for the proposal and exclusion link.

I tried defining more precisely what needs to be analyzed with a sonar-project.properties file, which was mentioned in the file exclusion article you linked. The properties file was rejected by the build with the error “sonar-project.properties files are not understood by the SonarScanner for MSBuild. Remove those files from the following folders:”
So I searched and found that apparently I am supposed to use a SonarQube.Analysis.xml file instead. I then tried such a file with content like this (but with about 5-20 more directory entries):

<SonarQubeAnalysisProperties xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://www.sonarsource.com/msbuild/integration/2015/1">
  <Property Name="sonar.sources">path/to/directory1/,path/to/directory2/</Property>
  <Property Name="sonar.tests">path/to/directory4/,path/to/directory5/,path/to/directory6/</Property>
</SonarQubeAnalysisProperties>

That way I tried to avoid the directory where the large file is located, and also some build directories. It may have sped up the build by about 10%, from about 15 times slower to 13-14 times slower. Too early to tell yet, though.

So thanks for the idea, but it is not the solution yet.

Hi,

This is not at all about that. File move detection ensures that when you rename a file or move it to a different directory, we don’t close all the issues at the old location and open brand-new ones at the new location.

Using SonarScanner for .NET, you shouldn’t manually set sonar.sources or sonar.tests. As I stated above, you should set an exclusion that’s narrowly targeted to exclude your generated file(s).

IMO setting an exclusion via the UI is the best way, but it can also be done via analysis properties.
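For example, in the same SonarQube.Analysis.xml format you already used, an exclusion would look something like this (the *.Designer.cs pattern is just an illustration, point it at your actual generated file):

<SonarQubeAnalysisProperties xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://www.sonarsource.com/msbuild/integration/2015/1">
  <Property Name="sonar.exclusions">**/*.Designer.cs</Property>
</SonarQubeAnalysisProperties>

That keeps the file in your build and compilation but leaves it out of analysis, including the file move detection step.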

 
HTH,
Ann