I’ll tell you a little about my scenario: I have a pipeline responsible for obtaining code coverage for 12 *.csproj projects, but given the size of the report it generates, not everything gets published:
##[warning]Skipping attachment as it exceeded the maximum allowed size 75MB:
Is there a way to have 12 pipelines, each with its own csproj and coverage, so each file would be smaller and could be published? I tried, but my coverage gets overwritten every time; only the last analysis ends up in the global project.
Thank you so much
To be clear, is it analysis that’s skipping your coverage report, or something else in your pipeline?
Either way, it may be equally troublesome.
First, analysis is not additive. Each new analysis fully replaces the previous one. So you would not be able to analyze each csproj individually into one cumulative SonarQube project. For 12 different analyses, you would need 12 different SonarQube projects. (In commercial editions, you can stitch them back together with an Application.)
Beyond that, building is integral to analysis for C#, which means you’ll have to build each csproj individually. My (sketchy) understanding of C# makes me think that, in practical terms, you may need to wait through the full build of all 12 csprojs each time, and use analysis exclusions for the 11 that are unwanted that time.
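As a sketch of what one of those 12 analyses could look like (this uses the `SonarQubePrepare`/`SonarQubeAnalyze`/`SonarQubePublish` tasks from the SonarQube Azure DevOps extension; the service connection name, project key, and paths are placeholders I’ve made up, and the `sonar.inclusions` pattern would need adjusting to your actual layout):

```yaml
# One of the 12 analyses: the whole solution still builds, but the
# analysis is scoped to a single csproj under its own project key.
# Connection name, key, and paths are placeholders.
steps:
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'MySonarQubeConnection'
    scannerMode: 'MSBuild'
    projectKey: 'MySolution.Billing'     # unique key per analysis
    extraProperties: |
      # keep only this csproj's sources; the other 11 are excluded
      sonar.inclusions=src/Billing/**
- script: dotnet build MySolution.sln    # full solution build is still required
- task: SonarQubeAnalyze@5
- task: SonarQubePublish@5
  inputs:
    pollingTimeoutSec: '300'
```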
So… it may be easier to tackle the file-size side of this.
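On the file-size side: if you’re collecting coverage with the Coverlet XPlat collector (`--collect:"XPlat Code Coverage"`), a runsettings file can exclude assemblies from measurement, which shrinks the attachment. A sketch, with placeholder assembly names:

```xml
<!-- coverage.runsettings — pass with: dotnet test --settings coverage.runsettings
     Excluding assemblies you don't need measured reduces the report size,
     which can keep the attachment under the 75 MB limit.
     The assembly names below are placeholders. -->
<RunSettings>
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="XPlat code coverage">
        <Configuration>
          <!-- Coverlet filter syntax: [AssemblyName]Namespace.Type -->
          <Exclude>[*.Tests]*,[MyCompany.Generated]*</Exclude>
        </Configuration>
      </DataCollector>
    </DataCollectors>
  </DataCollectionRunSettings>
</RunSettings>
```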
First of all, thank you for responding.
Let me explain: we have a huge C# .NET 6 solution (.sln) with many projects (.csproj).
SonarQube only allows me one project per Git repository (Azure DevOps), so in my pipeline, in addition to compiling the solution and running some of the unit test projects, I end up trimming those test projects down to avoid the size warning… If I could split the analysis so that each Sonar run accumulates instead of keeping only the last one, that would be great.
I really tried everything, but in the end I could only run unit tests from some of the csprojs.
Ehm… no? If you’re importing projects through the wizard, then I can see why you have that impression. But a SonarQube project is just… a unique sonar.projectKey value. So on the CI side, you can set up analysis of subsets of your solution.
Does that help?
I’ve attached a screenshot to explain: it only allows you to import the repo once.
Yes, the wizard gives you a 1-to-1 mapping because that’s what most people want. But all you have to do is set up additional CI jobs, each with a unique sonar.projectKey value and the correct exclusions.
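For example, two jobs in the same pipeline could each feed a different SonarQube project (hypothetical keys and paths; each job repeats the same build/analyze/publish steps, elided here for brevity):

```yaml
# Each job gets its own sonar.projectKey and its own analysis scope.
jobs:
- job: Analyze_ProjectA
  steps:
  - task: SonarQubePrepare@5
    inputs:
      SonarQube: 'MySonarQubeConnection'
      scannerMode: 'MSBuild'
      projectKey: 'MySolution.ProjectA'   # unique per SonarQube project
      extraProperties: |
        sonar.inclusions=src/ProjectA/**
  # …build, test, SonarQubeAnalyze@5, SonarQubePublish@5…
- job: Analyze_ProjectB
  steps:
  - task: SonarQubePrepare@5
    inputs:
      SonarQube: 'MySonarQubeConnection'
      scannerMode: 'MSBuild'
      projectKey: 'MySolution.ProjectB'
      extraProperties: |
        sonar.inclusions=src/ProjectB/**
  # …same remaining steps…
```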
And… what edition are you using? If you don’t know, you can find it in your page footer. Because it strikes me that this is really “monorepo” functionality we’re talking about. If you’re on Community Edition or Enterprise Edition ($$), then you’ll be fine, but what I’ve described will mess up PR decoration if you’re on Developer Edition ($).
Developer Edition Version 10.2.1 (build 78527)
I’m going to try what you tell me.
thank you so much
Since you’re in Developer Edition, I need to tell you that what I’ve recommended will mess up your PR decoration on this solution.
When a PR crosses csproj boundaries, then (race condition) the last csproj/SonarQube project to complete analysis is what you’ll see in your DevOps platform’s PR decoration, replacing the ones that came before it.
Thanks for answering.
Look, for the moment I’ve just left a few test projects in code coverage and that’s it.
I thank you for your assistance.