How can we optimize C# analysis? It takes more than 1 hour

Version: 8.6 (Docker)

We use SonarQube in our GitLab CI pipeline.
We analyse C# code with the following script in the pipeline:

dotnet sonarscanner begin /k:${SONAR_PROJECT} \
  /d:sonar.host.url=${SONAR_HOST_URL} \
  /d:sonar.login=${SONAR_TOKEN} \
  /d:sonar.cs.opencover.reportsPaths=$test/TestResults/coverage.opencover.xml \
  /d:sonar.cs.xunit.reportsPaths=$test/TestResults/ \
  /d:sonar.gitlab.commit_sha=${CI_COMMIT_SHA} \
  /d:sonar.gitlab.project_id=${CI_PROJECT_ID} \
  /d:sonar.gitlab.ref_name=${CI_COMMIT_REF_NAME} \
  /d:sonar.scm.exclusions.disabled=false \
  /d:sonar.exclusions="Project/Assets/**, Project/Migrations/**"
dotnet build
dotnet test --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=opencover
dotnet sonarscanner end /d:sonar.login=${SONAR_TOKEN}
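For context, these commands run inside a GitLab CI job roughly like the following sketch (the job name, image tag, and the value of the $test variable are simplified placeholders, not our exact config):

```yaml
# Sketch of the analysis job; job name, image tag, and $test are placeholders.
sonarqube-analysis:
  image: mcr.microsoft.com/dotnet/sdk:5.0
  variables:
    test: "MyProject.Tests"   # hypothetical test project directory
  script:
    - dotnet sonarscanner begin /k:${SONAR_PROJECT} /d:sonar.host.url=${SONAR_HOST_URL} /d:sonar.login=${SONAR_TOKEN}
    - dotnet build
    - dotnet test --collect:"XPlat Code Coverage"
    - dotnet sonarscanner end /d:sonar.login=${SONAR_TOKEN}
```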

Are there any optimizations we can make to these commands?

Hi @MichelT

Let’s try to find the exact bottleneck in the analysis.

  • Please share the verbose output of the END command (run dotnet sonarscanner begin /k:"MyProject" /d:sonar.verbose=true as the begin step, then attach the output of the END step)
  • Please share the verbose output of an MSBuild run with analyzer timing enabled: msbuild /p:ReportAnalyzer=true /v:d > build.log
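Once you have build.log, the per-analyzer timings that /p:ReportAnalyzer=true adds will show whether the Sonar analyzers dominate the build time. A sketch of pulling them out (the sample log lines below are made up purely to illustrate the idea — check your real build.log for the exact layout):

```shell
# Made-up sample of ReportAnalyzer timing lines, written to a scratch file
# so nothing real is overwritten; the actual format in build.log may differ.
cat > reportanalyzer-sample.log <<'EOF'
   Time (s)    Analyzer
   42.100      SonarAnalyzer.CSharp, Version=8.6.0.0
    0.350      Microsoft.CodeAnalysis.CSharp.Analyzers
EOF

# List analyzers from slowest to fastest: keep only the numeric timing
# lines, then sort numerically in descending order.
grep -E '^[[:space:]]*[0-9]+\.[0-9]+' reportanalyzer-sample.log | sort -rn
```

If one analyzer accounts for most of the time, that tells us where to focus.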