Help Figuring Out .NET Project Scan Taking Significantly Longer

Hi @trgabriel! Given that you are using SQ LTS (6.7.5), you don’t have the injection rules (they were added in SQ 7.2, in a separate plugin from the C# plugin), so the problem probably lies somewhere else.

Could you please provide the verbose logs for the build step and the scanner end step (docs here)?
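
(For reference, a minimal sketch of producing verbose output from the command line, assuming the Scanner for MSBuild; the project key is a placeholder, and the same properties should be usable from the corresponding pipeline tasks.)

SonarScanner.MSBuild.exe begin /k:"your-project-key" /d:sonar.verbose=true
msbuild /t:Rebuild /v:d
SonarScanner.MSBuild.exe end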

1 Like

Do you have somewhere I can post the logs?

Sent log files through direct message.

1 Like

Thanks, I looked at the logs and didn’t find anything suspicious. One suggestion is to set the build logging to error level only; you will probably gain some minutes by doing this. All Roslyn analysis messages are warnings (this is out of our control), hence the huge log files.

I would have to profile the analysis to see whether certain rules are taking up the time, but for that I would need the source code. Please try keeping the log level at error and see what the performance increase is.

1 Like

What setting needs to be changed to reduce the log level? Is this at the build level or at the SonarQube level?

I’m referring to the build logs. Roslyn analyzers run during the build. See the MSBuild command-line reference.
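
For example, something along these lines on the MSBuild invocation (the solution name is a placeholder; /clp, short for /consoleloggerparameters, controls the console logger):

msbuild MySolution.sln /clp:ErrorsOnly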

1 Like

I saw a decrease of roughly 30-60 seconds, but that is a small enough difference that it could just be an outlier. The dev team has also let me know that they would not want to keep this as a long-term solution to the problem.

Indeed, it’s not much gain.

One idea would be to disable the rules that are based on Symbolic Execution, which we know are more costly. These rules are S1944, S2259, S2583, S2589, S3655, S3900, S3966, and S4158 (see the sketch below).
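
(If you want to experiment outside of a quality profile change, a .ruleset along the lines of the sketch below switches those rules off at build time. Note that when the Scanner for MSBuild is active it generates its own ruleset from the quality profile, so deactivating the rules in the quality profile is the usual route; the rule set name here is only an example.)

<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Disable symbolic execution rules" ToolsVersion="14.0">
  <Rules AnalyzerId="SonarAnalyzer.CSharp" RuleNamespace="SonarAnalyzer.CSharp">
    <Rule Id="S1944" Action="None" />
    <Rule Id="S2259" Action="None" />
    <Rule Id="S2583" Action="None" />
    <Rule Id="S2589" Action="None" />
    <Rule Id="S3655" Action="None" />
    <Rule Id="S3900" Action="None" />
    <Rule Id="S3966" Action="None" />
    <Rule Id="S4158" Action="None" />
  </Rules>
</RuleSet>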

Another way to gain more insight into which rules are eating up the time is to use the -reportAnalyzer MSBuild option and then disable the greedy rules for your project:

msbuild /p:reportanalyzer=true /v:d

If you run the build with reportanalyzer, please share the results with us. :) In the medium term we want to gather more metrics on the performance of our rules; any help counts.

4 Likes

Disabling the rules described above resulted in a build time of 19:47, roughly a 2-minute reduction. I will need to talk to the team about whether disabling those rules is worth the two minutes. I am attempting to run with reportAnalyzer now and will send the results in a direct message if successful. I may need to push this back until Monday.

I tried adding both:
msbuildArgs: '-reportAnalyzer:True'
and
msbuildArgs: '-ReportAnalyzer:True'
to the VSBuild Azure DevOps task, and neither worked. Do you know what I am doing wrong?

You need to pass the arguments the same way you would on the command line, i.e.:

[image: screenshot of the task arguments]
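
(The screenshot is not reproduced here; based on the command line above, the task argument presumably looks something like the line below. This is an assumption, and the exact quoting depends on your pipeline definition.)

msbuildArgs: '/p:reportanalyzer=true /v:d'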

1 Like

So I believe that worked, but how do I read the results?

The results are part of the MSBuild output. Search that step's log output for the text "Total analyzer execution time:". There will be one set of results for each MSBuild project.
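
(If the logs are large, something like the following can pull the relevant blocks out; the log file name is just an example.)

Select-String -Path msbuild.log -Pattern "Total analyzer execution time" -Context 0,25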

1 Like

This seems to be the main slowdown.

Total analyzer execution time: 893.913 seconds.
NOTE: Elapsed time may be less than analyzer execution time because analyzers can run concurrently.
Time (s)    %     Analyzer
893.913     100   SonarAnalyzer.CSharp, Version=7.13.0.0, Culture=neutral, PublicKeyToken=c5b62af9de6d7244
   448.313   50   SonarAnalyzer.Rules.CSharp.TokenTypeAnalyzer
   445.228   49   SonarAnalyzer.Rules.CSharp.SymbolReferenceAnalyzer

What rules do these correspond to? Also, who should I send the rule performance metrics to? I will just cut out all of the blocks of text similar to the one I pasted above.

Could you send them to @Andrei_Epure as before please? Thanks.

Interesting. Those analyzers don’t correspond to “real” rules, unfortunately, so I don’t think they can be disabled directly. They’re used to generate the metadata required for syntax and reference highlighting when browsing the code on SonarQube/SonarCloud.

I’m surprised they take so long because they don’t do anything complicated. Could you give us a rough idea of the number of lines of code in your solution please?

It appears to be a test project. We don’t have an exact line count, but we believe it is around 5-10k lines of code. We made some changes to exclude it from the analysis, and this brought the scan down to 7-8 minutes. I think we can say that this is fixed, unless you want to dig deeper into the cause of the slowdown.
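
(For reference, one documented way to exclude a project from analysis when using the Scanner for MSBuild is the SonarQubeExclude MSBuild property in that project's .csproj; whether that is the exact mechanism used here isn't stated.)

<PropertyGroup>
  <SonarQubeExclude>true</SonarQubeExclude>
</PropertyGroup>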

1 Like

I am glad you found a solution to the problem. I will try to find out whether this behavior (TokenTypeAnalyzer taking a lot of time) is consistent across the open source projects we analyze on a daily basis.

1 Like

Hi @trgabriel,

FYI we’ve created ticket #2929 to track the performance issue, as we certainly wouldn’t expect the analyzers that generate the metrics to take nearly eight minutes each to run, especially against a relatively small project.

It’s interesting that the issue only appears for a single project. It may be that there is something about the content of that specific project that is causing a perf issue e.g. a few exceptionally large files (possibly generated), or perhaps hundreds of small files.

If you could share the source of just the test project with us privately, that would be ideal (although I realise that’s unlikely unless the application is open source). Otherwise, if you could give us a rough idea of the number of files in the project and the number of lines of code in the largest file, that would be very helpful.

Thanks,
Duncan

2 Likes

@duncanp Sorry for the late reply. I do not believe I am allowed to share the source code, but I can say that it was a test project that was not getting ignored by the scanner. That folder/project contains about 48 files. The three largest files are roughly 1,750, 950, and 940 lines; everything else is under 400 lines.

2 Likes

No problem. Thanks for the additional information @trgabriel.