What is dotnet-symbolic-execution.exe?

Hello, I'm wondering what this process is: dotnet-symbolic-execution.exe. For my project (which is quite big) it consumes 16 GB of RAM in a single pipeline.

We are using Azure DevOps.
SonarScanner version: 4.21.0
SonarQube version: 8.7.1

I'm using only C# scanning (the quality profiles (QPs) for the other languages have zero rules). The C# QP has only 118 rules (bugs and vulnerabilities).

Hello

This executable is run for rule S3949. If you disable that rule, the memory issue will go away.

We removed rule S3949 from Sonar way earlier this year because of its performance problems. We plan to rewrite it in a more efficient manner.

Also, please note that SQ 8.7.1 is not a supported version. Please upgrade to SQ 8.9 LTS (long-term support) or keep your SQ up to date with the latest version. I also strongly recommend updating the SonarScanner for .NET to the latest version, to benefit from all the feature and performance improvements we make.

Thanks,

I’ll update my rule set.

Regarding the SonarScanner version: it is updated to the latest automatically by Azure DevOps.

Regarding the current SonarQube version: we are an enterprise and have a dedicated team that controls the server side, so I don't have control over this. From what I can see, they apply patches from time to time (though the process is slow).

By the way, are you aware of any C# rules that may significantly impact CPU utilization?

We have a build server with 24 cores (48 logical processors) and 96 GB of memory. When we run more than 7 builds in parallel, the CPU is constantly at 100% (while building the solution). Without Sonar it works pretty well, even on a server with far fewer resources.

We have a guide (the SonarSource guide for investigating the performance of .NET analysis) that can help you identify long-running rules.
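As a starting point, Roslyn itself can report per-analyzer timings via the ReportAnalyzer build property. A hedged sketch of a typical invocation (MySolution.sln is a placeholder; adjust paths and verbosity to your setup):

```sh
# Rebuild with Roslyn's per-analyzer timing enabled.
# Diagnostic verbosity makes the compiler emit a
# "Total analyzer execution time" section per project,
# broken down by analyzer and rule ID.
msbuild MySolution.sln /t:Rebuild /p:reportanalyzer=true /v:diagnostic > build.log
```

Searching build.log for "analyzer execution time" then shows which rules dominate the build.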

Please note that some of our advanced rules find tricky bugs and injection vulnerabilities, so even if they consume a lot of resources, they bring a lot of value. However, we also have some maintainability rules that can consume a lot of resources on particular codebases, and those you can disable. Please feel free to discuss your findings on this forum and ask for guidance on how to improve performance without losing much value (the most performance-efficient way is not to run any static analysis at all :slight_smile:, but then you wouldn't benefit from all the help it brings).

Roslyn (the compiler framework for .NET) runs code analyzers in parallel, so it can consume a lot of resources. Also, to demystify our analysis a bit: to find injection vulnerabilities, our taint analysis engine loads the control flow graph representation of the entire project into memory, so it can consume quite a lot of memory.

Thanks a lot for the Guide, I’ll take a look at it later this week.

Roslyn (the compiler framework for .NET) runs code analyzers in parallel, so it can consume a lot of resources. Also, to demystify our analysis a bit: to find injection vulnerabilities, our taint analysis engine loads the control flow graph representation of the entire project into memory, so it can consume quite a lot of memory.

Did you mean that SonarAnalyzer rules run in parallel? Or different analyzers (if the project has any)? Can I limit this concurrency?

Yes, SonarAnalyzer rules are independent of each other; they are invoked during compilation by Roslyn, and they run in parallel.
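To illustrate the mechanism, here is a minimal sketch of the standard Roslyn analyzer pattern (a hypothetical example analyzer, not SonarAnalyzer's actual code): an analyzer opts in to concurrent execution, and to skipping generated code, in its Initialize method.

```csharp
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.Diagnostics;

[DiagnosticAnalyzer(LanguageNames.CSharp)]
public sealed class ExampleAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new(
        "EX0001", "Example rule", "Example message", "Usage",
        DiagnosticSeverity.Warning, isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics =>
        ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        // Allow Roslyn to invoke this analyzer's callbacks concurrently.
        context.EnableConcurrentExecution();
        // Skip generated code entirely: no analysis, no diagnostics.
        context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisOptions.None);
        // Register a (no-op) callback for the syntax nodes of interest.
        context.RegisterSyntaxNodeAction(_ => { }, SyntaxKind.InvocationExpression);
    }
}
```

The per-analyzer concurrency is Roslyn's decision once EnableConcurrentExecution is called; it is not something a rule set can throttle from the outside.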

Please read the MSBuild documentation.
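For reference, the MSBuild switch that caps how many projects build (and therefore compile and analyze) at the same time is -maxCpuCount. A sketch, with MySolution.sln as a placeholder:

```sh
# Limit MSBuild to 4 parallel project builds.
msbuild MySolution.sln /m:4

# Equivalent with the .NET CLI, which forwards MSBuild switches:
dotnet build MySolution.sln -maxcpucount:4
```

Note that this limits project-level parallelism only; it does not throttle the concurrency of individual analyzers within a single compilation.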

Thanks for the assistance!

The detailed logs say that 60-70% of the time is consumed by the "SonarAnalyzer.Rules.SymbolicExecution.SymbolicExecutionRunner" rules (in each project). I've disabled them temporarily.
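For readers wanting to do the same at the file level rather than in the quality profile, a hedged .editorconfig sketch: S2259 and S3655 are examples of rules run by the symbolic-execution engine in SonarAnalyzer; verify the exact rule IDs against your analyzer version before relying on this.

```ini
[*.cs]
# Example symbolic-execution rules (check your version's rule list):
dotnet_diagnostic.S2259.severity = none  # Null pointers should not be dereferenced
dotnet_diagnostic.S3655.severity = none  # Empty nullable value should not be accessed
```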

One more question, if you don't mind: I have a lot of autogenerated files that I'd like to exclude from analysis. By "exclude", I mean I don't want to run the analyzers on them at all (to save time and resources). I know I can use "sonar.exclusions", but the question is how this setting affects the analyzers. Will they still run, with the results simply not submitted to the SonarQube server? Or do the analyzers take the filter into account and skip the files?

I looked at the SonarAnalyzer source code today and found that a lot of rules don't seem to use "GeneratedCodeRecognizer".

When I filter by means of dotnet_diagnostic, I get much better performance for one of the projects (one with many resx files). But in this case I have to list ALL the rules. Can I somehow "wildcard" them?

Unfortunately, using sonar.exclusions only filters out the final results; the analysis still happens at build time. We've considered filtering at build time too, but it would require quite a complex solution that would cost too much for the benefit it would bring, so we dropped the idea.

We exclude generated code by default (generated-code analysis can be enabled in the SQ UI, so you may want to check first that it isn't enabled there; see the docs: C# | SonarQube Docs).

If it doesn't work as expected for your project, we would really appreciate feedback and a sample generated file (shared via private message, if you prefer) so that we can examine why it isn't detected as generated.

Indeed, that's the class that detects whether a file is autogenerated, so that analysis can skip it.

Which rules have you found that don't use it? All rules register their callbacks through the helper methods in CSharpDiagnosticAnalyzerContextHelper.

Do you mean you filter out the paths for each rule in an .editorconfig file?

IMO the real solution to your problem is fixing the generated-code detection, if possible, so I would focus on that.

These are simple resx files with underlying *.Designer.cs files, and for some reason they are scanned (running locally, using SonarAnalyzer.CSharp 8.26.0.34506).

Yes, like this:
[*.Designer.cs]
dotnet_diagnostic.S100.severity = none
dotnet_diagnostic.S1006.severity = none
dotnet_diagnostic.S101.severity = none
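There is no wildcard syntax for dotnet_diagnostic keys, but two broader .editorconfig switches may achieve the same effect in one line each. Both are standard Roslyn options rather than SonarAnalyzer-specific ones, so behavior can vary by compiler version; this is a sketch to verify against your toolchain:

```ini
[*.Designer.cs]
# Tell Roslyn to treat these files as generated code; analyzers that
# configure GeneratedCodeAnalysisOptions.None will then skip them.
generated_code = true

# Alternatively, bulk-silence analyzer diagnostics for these files
# (applies only to rules that are not individually configured).
dotnet_analyzer_diagnostic.severity = none
```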

Maybe I didn't look carefully enough :)

I've sent you a private message; a sample of such a file would help us to reproduce the problem.