Generating reports from SQ

SonarQube enterprise edition (10.4.0), On-premise.
We want to generate reports (auto-generated using a built-in report feature) that show issue counts, severity types, quality, etc. for projects/portfolios. Which report features does SQ provide that we can use to generate these kinds of issue reports? Here is a sample of the report data we would like to generate from SQ.
Also, what report features come with the Enterprise edition of SQ? Are there any additional report features that could be enabled at extra cost?
Thanks

Hi,

The docs list all the report types. Enterprise Edition ($$) includes all the reports we offer. There are commercial plugins out there (not from us) that offer reporting, but I’m not familiar enough with them to tell you whether they’re different.

It’s not clear to me what you’re after here, but I suspect you want a file plopped somewhere after each analysis? For that, you’ll need to do some automation. The reports are available from SonarQube at all times, but something needs to ‘push the button’ to get them. That could be a human, or it could be a post-job script that calls the API behind the button.

And if none of the built-in reports suits your needs, you’ll need to call the APIs directly anyway to pull the data you want.

The best way to master the API is to perform the desired action via the UI and eavesdrop to see which calls the UI made to accomplish the action.
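As an illustration of that pattern, here is a minimal sketch of pulling per-project issue counts from the `api/issues/search` endpoint using facets. The server URL, project key, and token are placeholders; Bearer-token auth is assumed to be enabled on your 10.x instance.

```python
# Sketch: query SonarQube's api/issues/search for issue counts bucketed
# by severity and type. Server URL, project key, and token below are
# placeholders -- substitute your own. Python 3 stdlib only.
import json
import urllib.parse
import urllib.request

def issues_search_url(base_url, project_key):
    """Build the search URL; facets return aggregate counts per bucket."""
    params = urllib.parse.urlencode({
        "componentKeys": project_key,
        "facets": "severities,types",
        "ps": 1,  # we only need the facet totals, not the issue list
    })
    return f"{base_url}/api/issues/search?{params}"

def facet_counts(payload, facet_name):
    """Extract {value: count} for one facet from a search response."""
    for facet in payload.get("facets", []):
        if facet["property"] == facet_name:
            return {v["val"]: v["count"] for v in facet["values"]}
    return {}

def fetch_counts(base_url, project_key, token):
    req = urllib.request.Request(issues_search_url(base_url, project_key))
    # A user token passed as a Bearer value (supported in 10.x).
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return facet_counts(payload, "severities"), facet_counts(payload, "types")
```

A post-analysis job could call `fetch_counts(...)` and write the results wherever your reporting pipeline expects them.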

You may also find this guide helpful.

 
HTH,
Ann

Thank you, Ann, for your response.
So, I’m exploring whether the SQ platform offers built-in reporting capabilities that include matrices categorizing issues such as Bugs, Vulnerabilities, and Code Smells, along with their severities (High, Medium, and Low), as well as overall project/portfolio/application quality. We aim to establish a reporting frequency aligned with our sprint cycles and subscribe to email notifications carrying these detailed reports. This would let us monitor issue numbers and matrices, and track emerging trends effectively.

While the SQ documentation mentions Project and Application PDF reports, these reports primarily provide an overview or grading of the project or application status. However, our preference is for reports that offer granular insights, including issue numbers and associated matrices, empowering us to make informed decisions and drive continuous improvement efforts.

Thank you,

Hi,

When you say matrix, I picture a grid, and none of our reports have that.

But take a look at the “Regulatory” Report (sample uploaded (371.5 KB)). It’s pretty detailed, and may meet your needs.

 
HTH,
Ann

Hi,

I have looked at the Regulatory report. Yes, it seems very useful to us, but it is configured based on a permanent branch.

I wonder, does SQ provide a similar report that we could configure on a weekly or bi-weekly basis and subscribe to via email notifications?

Thanks,


Hi,

By “permanent” do you mean the “keep when inactive” branches?
Or do you mean main? Because all “keep when inactive” branches are available:

It’s a download-on-demand. You should be able to script a regular pull of it.
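A regular pull could be sketched roughly like this. Note the endpoint path below is an assumption, not confirmed; watch what the UI’s download button actually requests (as suggested earlier in this thread) to get the exact call for your instance. Server URL, project key, branch, and token are placeholders.

```python
# Sketch: a scheduled pull of the Regulatory Report. The endpoint path
# is an ASSUMPTION -- confirm it by eavesdropping on the UI's download
# button. Placeholders throughout; Python 3 stdlib only.
import datetime
import urllib.parse
import urllib.request

def report_url(base_url, project_key, branch):
    params = urllib.parse.urlencode({"project": project_key, "branch": branch})
    # Assumed endpoint; verify against your own instance.
    return f"{base_url}/api/regulatory_reports/download?{params}"

def report_filename(project_key, when):
    """Timestamped name so weekly pulls don't overwrite each other."""
    return f"regulatory-{project_key}-{when:%Y-%m-%d}.zip"

def pull_report(base_url, project_key, branch, token, out_dir="."):
    req = urllib.request.Request(report_url(base_url, project_key, branch))
    req.add_header("Authorization", f"Bearer {token}")
    name = report_filename(project_key, datetime.date.today())
    with urllib.request.urlopen(req) as resp, \
         open(f"{out_dir}/{name}", "wb") as f:
        f.write(resp.read())
    return name

# Schedule weekly from cron, e.g.:
#   0 6 * * MON /usr/bin/python3 pull_report.py
```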

 
Ann

Hi,

Indeed, I was referring to the ‘keep when inactive’ branches.

The Regulatory report serves as a valuable tool for capturing a snapshot of the release branch during product release, enabling us to verify whether the product aligns with release requirements regarding issue severity, count, and other metrics.

Within our sprint cycles, typically spanning one to two weeks, we aim to monitor issue trends such as bugs, vulnerabilities, etc., on a weekly basis. This allows us to gauge whether new issues are introduced or not.

For instance, considering a SonarQube (SQ) portfolio comprising 50 projects, each with its own Git repository, I’m interested in tracking issue numbers from one week to the next. Can the Regulatory report facilitate this, and does it offer date/time filtering capabilities?

(Sorry, it’s a different topic)

I’m curious about the compatibility of SQ scanning results (issue counts, severity types, etc.) with other similar tools in the market. This is crucial as our end users/clients request a comparison of our product’s issues/bugs/vulnerabilities against the static code analysis tool they utilize in-house.

What would the answer be regarding compatibility?

Thank you,

With regards,

zh

Hi zh,

We don’t offer that kind of in-depth reporting for aggregations (either Portfolios or Applications).

It’s always current-state, so you’d have to do your own math :smiley:

Yes, this is another question. I’ll give you a quick answer. If you want more depth, please do create a new thread. :slightly_smiling_face:

Compatibility is different from comparison. We don’t compare directly. We typically invite people to make their own comparisons. My understanding is that people are generally pleased with our results (altho maybe it’s just that those who aren’t stomp off and don’t tell us. :laughing:) Regarding compatibility, we import reports from multiple other tools, including SARIF reports, and our own generic import format.
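To make the generic import route concrete: an external tool’s findings can be fed to the scanner via a JSON file referenced by `sonar.externalIssuesReportPaths`. The field names below follow the older “issues”-array shape as I recall it from the docs; check the generic issue import documentation for your SonarQube version, since newer releases favor a rules-based layout.

```python
# Sketch: emit a file in (roughly) SonarQube's generic issue import
# format. Field names are from memory of the older docs -- verify
# against the generic issue import docs for your version.
import json

def generic_issue(rule_id, message, file_path, line,
                  severity="MAJOR", issue_type="BUG"):
    return {
        "engineId": "other-analyzer",   # placeholder external tool name
        "ruleId": rule_id,
        "severity": severity,           # BLOCKER/CRITICAL/MAJOR/MINOR/INFO
        "type": issue_type,             # BUG/VULNERABILITY/CODE_SMELL
        "primaryLocation": {
            "message": message,
            "filePath": file_path,
            "textRange": {"startLine": line},
        },
    }

def write_report(issues, path):
    """Write the report file that sonar.externalIssuesReportPaths points at."""
    with open(path, "w") as f:
        json.dump({"issues": issues}, f, indent=2)
```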

 
Ann

Hi Ann,

Thank you very much for clarifying the questions. It’s helpful.

With regards,

zh
