Performance issues in .NET 8 project

Hi there

We have encountered performance issues since updating our build-pipeline images.
Before, a PR build took about 15 minutes; now it takes up to 40-45 minutes. If we disable SonarQube, the build is as fast as before (less than 10 minutes).

Environment:
Build running in a Bitbucket pipeline.
Sonar version:

SonarScanner for MSBuild 9.0
Using the .NET Core version of the Scanner for MSBuild
Pre-processing started.
Preparing working directories...
07:04:04.034  Updating build integration targets...
07:04:05.227  Using SonarCloud.
07:04:05.561  Fetching analysis configuration settings...
07:04:06.836  Provisioning analyzer assemblies for cs...
07:04:06.836  Installing required Roslyn analyzers...
07:04:06.837  Processing plugin: csharpenterprise version 10.1.0.102936
07:04:07.304  Processing plugin: vbnetenterprise version 10.1.0.102936
07:04:07.479  Processing plugin: securitycsharpfrontend version 10.8.0.33361

Using this guide: https://community.sonarsource.com/t/the-sonar-guide-for-investigating-the-performance-of-net-analysis/47279 we found some rules that use a lot of time in the build process:

Project 1. 
5563.793   99   SonarAnalyzer.CSharp, Version=10.1.0.0, Culture=neutral, PublicKeyToken=null
3885.882   69      SonarAnalyzer.Rules.CSharp.VariableUnused (S1481)
1599.255   28      SonarAnalyzer.Rules.CSharp.UnnecessaryUsings (S1128)
14.679   <1      SonarAnalyzer.Rules.CSharp.DisposableNotDisposed (S2930)
13.739   <1      SonarAnalyzer.Rules.CSharp.CertificateValidationCheck (S4830)
3.955   <1      SonarAnalyzer.Rules.CSharp.CommentedOutCode (S125)
3.931   <1      SonarAnalyzer.Rules.CSharp.CommentsShouldNotBeEmpty (S4663)
3.806   <1      SonarAnalyzer.Rules.CSharp.DeadStores (S1854)


Project 2. 
3888.302   97   SonarAnalyzer.CSharp, Version=10.1.0.0, Culture=neutral, PublicKeyToken=null
1562.977   39      SonarAnalyzer.Rules.CSharp.UnnecessaryUsings (S1128)
978.668   24      SonarAnalyzer.Rules.CSharp.InsecureEncryptionAlgorithm (S5547)
975.876   24      SonarAnalyzer.Rules.CSharp.VariableUnused (S1481)
162.236    4      SonarAnalyzer.Rules.CSharp.DisposableNotDisposed (S2930)
51.053    1      SonarAnalyzer.Rules.CSharp.ExecutingSqlQueries (S2077)
24.967   <1      SonarAnalyzer.Rules.CSharp.PasswordsShouldBeStoredCorrectly (S5344)

For now I’m looking into disabling the most expensive rules in the related projects, but I’d rather not do that if this can be solved with a performance fix.

Thanks Niels

Hi @nielsnocore

Welcome to the Sonar community!

What SonarQube version and edition are you using?

What other steps from the “Troubleshooting help” section of the performance guide did you follow?

What’s the hardware configuration you are currently using for your machines? Did you try and improve that?

Did you investigate the scenario of doing the build twice between the begin and end step (which might happen if you run dotnet build and dotnet test)?

Also, on Pull Request analysis, we do incremental analysis which should be faster.

Hi Andrei Epure

We are using SonarQube Cloud.

Hardware:
We are running our build in the Bitbucket pipeline with a 4x size (8 dedicated CPUs and 16 GB memory). The next step would be an 8x setup, but to prevent large costs we would prefer not to go to 8x.

What we did/considered:

  • Updated SonarScanner versions
  • Updated the .NET SDK/MSBuild
  • Got analyzer timing logs using -p:ReportAnalyzer=true in dotnet build (sketched right below)
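Roughly the command used for that (the binlog file name is just an example):

dotnet build -bl:analysis.binlog -p:ReportAnalyzer=true

The binlog can then be opened in the MSBuild Structured Log Viewer to read per-rule timings like the ones in the original post.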

To prevent a double analysis, we already run:

dotnet build
dotnet test /p:CollectCoverage=true --no-build .....
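
So the overall order of the steps is roughly as follows (project key, organization and token are placeholders):

dotnet sonarscanner begin /k:"<project-key>" /o:"<organization>" /d:sonar.token="***"
dotnet build
dotnet test /p:CollectCoverage=true --no-build ...
dotnet sonarscanner end /d:sonar.token="***"

This way there is only a single build between the begin and end steps.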

Using the logs we were able to find some long-running rules (as in the original post).
After the original post I was able to gather some more information.

I disabled the rules (S1481, S1128, S5547) in the 2 projects via their .editorconfig (out of a total of 50 projects in the solution), which brought the Pull Request builds down to 10 minutes (4 minutes for dotnet build).
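For reference, disabling those rules in the .editorconfig looks roughly like this (the [*.cs] section glob can be narrowed to the affected files if needed):

[*.cs]
# temporarily silence the most expensive rules in these two projects
dotnet_diagnostic.S1481.severity = none
dotnet_diagnostic.S1128.severity = none
dotnet_diagnostic.S5547.severity = none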

In the binlog some other rules were still time-consuming. In those 2 projects we have a couple of large files (e.g. 22K LOC).
When we marked these files with // <auto-generated /> and enabled the rules again, the build was still down to 10 minutes (4 minutes for dotnet build).
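Marking a file means putting the standard auto-generated header at the very top, roughly like this (shown here on one of the seed classes):

// <auto-generated />
// Files whose header contains this comment are treated as generated code,
// so the Roslyn-based Sonar rules skip them.
public static class CollectiveAgreementSeed
{
    // ... ~22K lines of seed data ...
}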

These files have existed for a long time, without any build issues before.
So I suspect something changed recently that cannot handle these large files.

OK, so do I understand correctly that by excluding the large files, you can keep S1481, S1128, S5547 enabled?

Yes, that is true.
I have removed all the disabled rules from the .editorconfig again, and the dotnet build still runs in 4 to 5 minutes.

The large files are data seed files and only look like this (one static array with data):

public static class CollectiveAgreementSeed
{
    /// <summary>
    /// The collective agreements.
    /// </summary>
    public static readonly CollectiveAgreement[] CollectiveAgreements =
    {
        new CollectiveAgreement()
        {
            Id = new Guid("19d065c2-897a-886c-92d1-8cc2b680bca6"),
            TaxCode = 4,
            Description = "CLA1",
            StartDate = null,
            EndDate = new DateTime(2016, 12, 31, 0, 0, 0, DateTimeKind.Utc),
            IsFlexAgreement = false,
        },
        new CollectiveAgreement()
        {
            Id = new Guid("65304a5f-7d2f-5460-9f48-c36e7495a544"),
            TaxCode = 5,
            Description = "CL2",
            StartDate = null,
            EndDate = null,
            IsFlexAgreement = false,
        },
        new CollectiveAgreement() ....
        // etc.
        // etc., until row 22K
    };
}

Thanks. We know that Roslyn analyzers have these problems on big files. Our recommendation is to exclude them from the analysis by marking them as auto-generated.

One possible UX improvement would be to automatically detect such files and do the exclusion ourselves, to avoid users needing to investigate such problems. I will raise this with our Product Manager to take into account for the future.

This is the more relevant part. Did the SQ Cloud analysis pick up these files before? Did you see them in SonarQube Cloud, and did you see issues on them?

When did the regression happen?

What was the update you did?

Our recommendation is to exclude them from the analysis by marking them as auto-generated.

That is what we did for now, and that is working.

Did you see them in SonarQube Cloud, see issues on them?

We did exclude them by adding them to sonar.exclusions, but this did not exclude them from being analysed (as your performance article mentions). We did not see them in SQ Cloud.
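For reference, that exclusion was passed to the scanner roughly like this in the begin step (the path pattern is just an example; it can also be set in the SonarQube Cloud UI):

dotnet sonarscanner begin ... /d:sonar.exclusions="**/Seeds/*.cs"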

What was the update you did?

We updated to the then (10/10/2024) latest versions of:

  • dotnet/sdk:8.0
  • openjdk-17-jre
  • dotnet-coverage
  • dotnet-sonarscanner

Hi @nielsnocore - sorry for coming back a bit late to ask more questions.

And what did you have before on the build machines? What was the previous configuration in which the SonarQube Cloud analysis didn’t manifest this perf problem?

Thanks!

@Andrei_Epure

The configuration before (when we did not have the issues) was:

  • SonarScanner for MSBuild 6.2 (now 9.0)
  • csharp version 9.32.0.97167 (now 10.2.105762)
  • securitycsharpfrontend version 10.7.0-M1.32475 (now 10.9.0.33961)
  • .NET SDK 8.0.6 (or 8.0.7) (now 8.0.10)

If you need some more info, let me know.

Hi @nielsnocore

Was the big class (that you now excluded) in your codebase before you did the update of the Scanner for .NET?

And if so, was it analyzed before (with Scanner for .NET 6.2 and csharp version 9.32.0.97167)?

Hi,

It was in the codebase for quite some time before the update.

Before, it was not excluded from analysis, only from code duplication detection, using the sonar.cpd.exclusions setting.
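That duplication exclusion is set the same way, roughly (the file pattern is just an example):

dotnet sonarscanner begin ... /d:sonar.cpd.exclusions="**/CollectiveAgreementSeed.cs"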

Niels

Thank you Niels for all the details and the code snippet. Based on the information provided, I could easily create a reproducer and confirm the performance problem.

I’ve created an internal ticket NET-910 but unfortunately I cannot provide an ETA at this moment.
