SonarQube Enterprise Edition (10.4.0), on-premise.
We want to generate reports (auto-generated using a built-in report feature) showing issue counts, severity types, quality, etc. for projects/portfolios. Which report feature does SQ provide that we can use to generate these kinds of issue reports? Here is a sample of the report data we would like to generate from SQ.
Also, which report features come with the Enterprise Edition of SQ? Are there any additional report features that could be enabled at extra cost?
Thanks
Hi,
The docs list all the report types. Enterprise Edition ($$) includes all the reports we offer. There are commercial plugins out there (not from us) that offer reporting, but I'm not familiar enough with them to tell you whether they're different.
It's not clear to me what you're after here, but I suspect you want a file plopped somewhere after each analysis? For that, you'll need to do some automation. The reports are available from SonarQube at all times, but something needs to "push the button" to get them. That could be a human, or it could be a post-job script that calls the API behind the button.
And if none of the built-in reports suits your needs, you'll need to call the APIs directly anyway to pull the data you want.
The best way to master the API is to perform the desired action via the UI and eavesdrop to see which calls the UI made to accomplish the action.
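As an illustration of that approach, here is a minimal Python sketch that pulls open-issue counts by severity and type from the documented `api/issues/search` Web API endpoint, using its facets feature. The server URL, token, and project key are placeholders; adapt them to your instance.

```python
# Sketch only: pull open-issue counts by severity and type via the Web API.
# The server URL, token, and project key used by callers are placeholders.
import json
import urllib.parse
import urllib.request

def build_issues_url(base_url: str, project_key: str) -> str:
    """Build an api/issues/search URL that asks for severity/type facets."""
    params = urllib.parse.urlencode({
        "componentKeys": project_key,  # may be "components" on newer versions
        "resolved": "false",           # open issues only
        "facets": "severities,types",  # counts come back as facet values
        "ps": 1,                       # page size 1: we only need the facets
    })
    return f"{base_url}/api/issues/search?{params}"

def facet_counts(payload: dict) -> dict:
    """Flatten the 'facets' array of the response into {facet: {value: count}}."""
    return {
        facet["property"]: {v["val"]: v["count"] for v in facet["values"]}
        for facet in payload.get("facets", [])
    }

def fetch_counts(base_url: str, project_key: str, token: str) -> dict:
    """Call the server and return the flattened counts (requires network access)."""
    req = urllib.request.Request(build_issues_url(base_url, project_key))
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return facet_counts(json.load(resp))
```

A post-analysis job could call `fetch_counts(...)` and append the result to a CSV or dashboard. The Bearer-token header works with user tokens on recent versions; older versions expect HTTP basic auth with the token as the username.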
You may also find this guide helpful.
HTH,
Ann
Thank you, Ann, for your response.
So, I'm exploring whether the SQ platform offers built-in reporting capabilities that include matrices categorizing issues such as Bugs, Vulnerabilities, and Code Smells, along with their severities (High, Medium, and Low), as well as overall project/portfolio/application quality. We aim to establish a reporting frequency aligned with our sprint cycles and to subscribe to email notifications carrying these detailed reports. This approach would enable us to monitor issue numbers, matrices, and emerging trends effectively.
While the SQ documentation mentions Project and Application PDF reports, these reports primarily provide an overview or grading of the project or application status. However, our preference is for reports that offer granular insights, including issue numbers and associated matrices, empowering us to make informed decisions and drive continuous improvement efforts.
Thank you,
Hi,
When you say matrix, I picture a grid, and none of our reports have that.
But take a look at the "Regulatory" report (sample uploaded (371.5 KB)). It's pretty detailed, and may meet your needs.
HTH,
Ann
Hi,
I have looked at the regulatory report. Yes, that report seems very useful to us. But it is configured based on a permanent branch.
I wonder, does SQ provide a similar report that we can configure on a weekly or bi-weekly basis, with a subscription to email notifications?
Thanks,

Hi,
By "permanent" do you mean the "keep when inactive" branches?
Or do you mean main? Because the report is available for all "keep when inactive" branches.
It's a download-on-demand. You should be able to script a regular pull of it.
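A minimal sketch of such a scheduled pull, in Python. Note that the endpoint path below (`api/regulatory_reports/download`) is an assumption on my part: confirm the exact URL in your own instance by watching the browser's network tab when you click the download button, as suggested earlier in this thread.

```python
# Sketch only: date-stamped pull of the Regulatory Report for one branch.
# The endpoint path is an assumption -- verify it against your own instance.
import datetime
import urllib.parse
import urllib.request

def build_report_url(base_url: str, project_key: str, branch: str) -> str:
    """URL of the (assumed) regulatory-report download endpoint."""
    params = urllib.parse.urlencode({"project": project_key, "branch": branch})
    return f"{base_url}/api/regulatory_reports/download?{params}"

def report_filename(project_key: str, day: datetime.date) -> str:
    """Date-stamped file name so weekly pulls don't overwrite each other."""
    return f"{project_key}-regulatory-{day.isoformat()}.zip"

def pull_report(base_url: str, project_key: str, branch: str, token: str) -> str:
    """Download the report and return the saved file name (requires network)."""
    req = urllib.request.Request(build_report_url(base_url, project_key, branch))
    req.add_header("Authorization", f"Bearer {token}")
    name = report_filename(project_key, datetime.date.today())
    with urllib.request.urlopen(req) as resp, open(name, "wb") as out:
        out.write(resp.read())
    return name
```

Run it from cron or a CI scheduler (e.g. every Monday morning) to accumulate the weekly snapshots.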
Ann
Hi,
Indeed, I was referring to the "keep when inactive" branches.
The Regulatory report serves as a valuable tool for capturing a snapshot of the release branch during product release, enabling us to verify whether the product aligns with release requirements regarding issue severity, count, and other metrics.
Within our sprint cycles, typically spanning one to two weeks, we aim to monitor issue trends (bugs, vulnerabilities, etc.) on a weekly basis. This allows us to gauge whether new issues are being introduced.
For instance, consider a SonarQube (SQ) portfolio comprising 50 projects, each with its own Git repository: I'm interested in tracking issue numbers from one week to the next. Can the Regulatory report facilitate this, and does it offer date/time filtering capabilities?
(Sorry, it's a different topic.)
Iām curious about the compatibility of SQ scanning results (issue counts, severity types, etc.) with other similar tools in the market. This is crucial as our end users/clients request a comparison of our productās issues/bugs/vulnerabilities against the static code analysis tool they utilize in-house.
What would be the answer regarding compatibility?
Thank you,
With regards,
zh
Hi zh,
We don't offer that in-depth reporting for aggregations (either Portfolios or Applications).
It's always current-state, so you'd have to do your own math.
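That "own math" for week-over-week trends can be as simple as diffing two saved weekly snapshots of the counts. A small sketch, assuming each snapshot is a `{category: count}` mapping you have already pulled from the API and stored:

```python
# Sketch only: week-over-week delta between two saved issue-count snapshots.
def trend(last_week: dict, this_week: dict) -> dict:
    """Per-category delta; positive numbers mean new issues were introduced."""
    keys = set(last_week) | set(this_week)
    return {k: this_week.get(k, 0) - last_week.get(k, 0) for k in keys}
```

Storing one snapshot per week (e.g. one JSON file per pull) is enough to chart trends across a whole portfolio without any extra server-side features.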
Yes, this is another question. I'll give you a quick answer; if you want more depth, please do create a new thread.
Compatibility is different from comparison. We don't compare directly; we typically invite people to make their own comparisons. My understanding is that people are generally pleased with our results (although maybe it's just that those who aren't stomp off and don't tell us). Regarding compatibility, we import reports from multiple other tools, including SARIF reports, and we have our own generic import format.
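For the generic import route, an external tool's findings are converted to SonarQube's generic issue JSON and handed to the scanner via the `sonar.externalIssuesReportPaths` property. A minimal sketch of producing such a file; the field names follow the classic documented format (check the Generic Issue Import Format page for your version, as newer versions extend it), and the tool name, rule id, and paths are illustrative:

```python
# Sketch only: write findings in SonarQube's generic issue import format.
# Field names follow the classic documented format; newer versions extend it.
import json

def make_generic_issue(rule_id: str, message: str, path: str, line: int) -> dict:
    return {
        "engineId": "in-house-tool",   # illustrative tool name
        "ruleId": rule_id,
        "severity": "MAJOR",           # BLOCKER/CRITICAL/MAJOR/MINOR/INFO
        "type": "BUG",                 # BUG/VULNERABILITY/CODE_SMELL
        "primaryLocation": {
            "message": message,
            "filePath": path,          # relative to the project root
            "textRange": {"startLine": line},
        },
    }

def write_report(issues: list, out_path: str) -> None:
    """Write the report consumed via sonar.externalIssuesReportPaths."""
    with open(out_path, "w") as f:
        json.dump({"issues": issues}, f, indent=2)
```

Pointing the scanner at the written file then makes the third-party findings show up alongside SonarQube's own issues, which helps with side-by-side review.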
Ann
Hi Ann,
Thank you very much for clarifying these questions. It's helpful.
With regards,
zh

