We are currently introducing SonarQube Enterprise (v9.2.3) across all our projects.
My managers would like to see something like a yearly report with statistics on how the tool has actually helped us over time. Especially interesting are things like: how many vulnerabilities and security hotspots have been identified, what criticality they had, how many of these were set to won’t fix or false positive, how many got fixed, etc. Ideally the same for other bugs as well.
This should ideally be available across all projects in the defined timeframe, as a management summary. Maybe also an overview per project and/or portfolio, to give deeper insight into the different project teams’ training needs.
An extra bonus would be information about how long it took on average for a project to fix things, per criticality.
Management would like to use this kind of statistical data to evaluate the yearly license renewal. Or better said: it should help me prove that the tool actually helps us, and how much, when requesting money for the license renewal.
Is there a feature like this available or planned? Unfortunately, I couldn’t find anything similar so far.
Thanks in advance
Welcome to the SonarSource community. I hope you’ll enjoy it.
Although almost everything is possible with SonarQube (with a bit of scripting and use of the Web APIs), I have quite a few reservations about the approach.
There seems to be too much emphasis on reporting and, by contrast, not enough on fostering good use of the product to write clean code.
Such an approach is unlikely to prompt the right behaviour from the devs. If they are measured on how much they fix, well:
- They will go for the issues that are easiest to fix, because that will make their reporting look good, rather than the most important issues (and severity is not the first/only criterion for determining the importance of a fix; see the Clean As You Code approach that we promote with our product)
- They will try to hide problems rather than solve them (False Positive and Won’t Fix are not the only options available to devs; there are tricks that are harder to track and that can cause problems to fall under the radar)
- They will be tempted to commit without much attention, “artificially” generating issues that they can then fix in a second commit, for the sake of increasing the stats on the number of issues fixed…
The sooner an issue is fixed, the cheaper and safer it is to fix (because it is fixed while the dev is working on that particular piece of code, with all the context in their head). Yet devs measured this way may ignore issues that SonarLint warns them about, because fixing an issue raised by SonarLint is like fixing an issue before it formally exists (the fix comes before the commit in the SCM), so it never shows up in the stats. That behaviour runs against the shift-left principle that we strongly believe in.
The above are only a few examples, but in my experience there are tons of undesirable behaviours prompted by this kind of approach (to circumvent the reporting).
So, first a strong advice:
- Go back to your management with this feedback:
– A bit of reporting is good; too much reporting will shift the devs’ attention to the wrong focus (showing good stats instead of writing clean code)
– Devs should feel that SonarQube is a friendly tool there to help them, not something potentially used by management to scrutinize their work or (wrongly) measure their performance (such measurement is, in my experience, always biased and incorrect). If they feel scrutinized, adoption will be (s)low and your code quality will not improve as fast as it could
– Writing clean code is a no-brainer. You have to be convinced it’s a must; it does not have to be backed by a made-up RoI. Otherwise your code, which is a great company asset, will inevitably turn into a liability some day (verbatim from the words of our CEO).
OK, now that you have SonarSource’s vision on this, let’s answer your technical question:
- The features we plan are aligned with our vision, so there is nothing, and there will be nothing in the foreseeable future, that could turn the focus of devs away from writing clean code (in a nutshell, there will be no extensive out-of-the-box reporting beyond what we have right now, i.e. the Portfolios and Projects PDF reports).
- However, with the APIs there is a lot that you can extract from the product to help you achieve some of the reports you mentioned. The key APIs that you want to look at are:
– api/issues/search and api/hotspots/search to extract the issues and security hotspots of projects
– api/measures/component to extract measures/metrics about projects
– api/measures/search_history to extract the history of metrics/measures of a project
You can have a look at the details of these APIs. The API documentation is accessible by clicking on the ? icon at the top right of your SonarQube instance.
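To give an idea of the kind of reporting these APIs enable, here is a minimal Python sketch that aggregates issues by (severity, resolution) and computes the mean days-to-fix per severity. It assumes issues were fetched from api/issues/search (whose JSON includes severity, resolution, creationDate and closeDate fields); the fetch_issues helper, server URL and token handling are illustrative, not a supported recipe, and real use needs error handling and attention to the 10,000-result paging limit:

```python
import base64
import json
import urllib.request
from collections import Counter, defaultdict
from datetime import datetime

def fetch_issues(base_url, token, project_key):
    """Illustrative helper: page through api/issues/search for one project.
    Not called in this sketch; real use needs error handling and auth setup."""
    issues, page = [], 1
    auth = base64.b64encode(f"{token}:".encode()).decode()
    while True:
        url = (f"{base_url}/api/issues/search"
               f"?componentKeys={project_key}&ps=500&p={page}")
        req = urllib.request.Request(url, headers={"Authorization": f"Basic {auth}"})
        data = json.load(urllib.request.urlopen(req))
        issues.extend(data["issues"])
        if page * 500 >= data["paging"]["total"]:
            return issues
        page += 1

def summarize_issues(issues):
    """Count issues per (severity, resolution) and compute the mean number of
    days between creation and close for FIXED issues, per severity."""
    counts = Counter()
    fix_days = defaultdict(list)
    fmt = "%Y-%m-%dT%H:%M:%S%z"  # SonarQube date style, e.g. 2021-11-05T14:30:00+0100
    for issue in issues:
        sev = issue.get("severity", "UNKNOWN")
        res = issue.get("resolution", "OPEN")  # open issues carry no resolution field
        counts[(sev, res)] += 1
        if res == "FIXED" and "creationDate" in issue and "closeDate" in issue:
            opened = datetime.strptime(issue["creationDate"], fmt)
            closed = datetime.strptime(issue["closeDate"], fmt)
            fix_days[sev].append((closed - opened).total_seconds() / 86400)
    mean_days = {sev: sum(d) / len(d) for d_sev, (sev, d) in enumerate(fix_days.items())}
    return counts, mean_days

# Tiny made-up sample in the shape api/issues/search returns:
sample = [
    {"severity": "CRITICAL", "resolution": "FIXED",
     "creationDate": "2021-01-01T00:00:00+0000",
     "closeDate": "2021-01-05T00:00:00+0000"},
    {"severity": "CRITICAL", "resolution": "FALSE-POSITIVE"},
    {"severity": "MAJOR"},
]
counts, mean_days = summarize_issues(sample)
print(counts)     # per (severity, resolution) totals
print(mean_days)  # mean days-to-fix per severity, here {'CRITICAL': 4.0}
```

Running summarize_issues over a year of issues per project, then summing across projects, would produce exactly the won’t-fix / false-positive / fixed counts and average time-to-fix figures asked about above.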
That’s a great take on this, and I wish every CEO thought that way, but in reality this is unfortunately really rare. A company like SonarSource can also support this thinking much more easily than most corporations, where the primary objective is shareholder returns. As such, looking at the RoI is important.
From a traditional company perspective, SonarQube works a bit like an insurance policy. You invest into a license and into development time to lower your risk and the security exposure of your code down the road. Unfortunately, unlike an insurance policy, SonarQube won’t pay out anything if you miss a vulnerability and your data is leaked anyway. And it’s very hard to argue “well, it probably would have leaked sooner if we hadn’t invested in this”.
I’m in a similar situation as Ralf, and as useless as this kind of data is from a developer’s perspective, it is very useful to make a business case for the SonarQube license. The more SonarQube can help us easily gather this data, the more likely we are to keep the license alive and get funding for clean development, rather than quick code delivery.
Thank you @OlivierK for the detailed information.
Chris summarized the situation pretty well I think.
I wanted to add as well:
I totally agree: writing clean code in general is a must, not an option. Our management and developers will agree with this, too.
Maybe I also didn’t outline the intention well enough, so let me clarify:
The goal is not really to measure the developers themselves. It’s rather to demonstrate what SonarQube can do for us. Example: if one outcome is that there are almost no relevant findings in general or in specific projects, there are three options: either SonarQube isn’t set up correctly, or it’s not doing a good job, or our developers are just too good ;-). That would then have to be analyzed. On the contrary, if we are getting a high number of issues, there are again three cases, with the first two being the same and the third being that our developers seem to need more training in general.
So the only “bad” (or, better said, good?) outcome for specific dev teams might be being involved in additional trainings, but this is actually just a side benefit if the data shows it.
Most managers I have worked with want to see what they get for their investment, including the software licenses they bought. It’s not a question of whether we need such a tool or not; the necessity is clear. But the question is: is it the right tool for us? And do we use it correctly? The more a tool can prove its own value in a management overview, the easier it is to answer these questions in favor of the tool.
So, even though I understand your vision on this, I would think that you also have an interest in proving the value of your product. The APIs may help; I will look into them later. But maybe you should also consider at least providing some kind of information like this in a general report as well. I’m pretty sure that would help more people than just me, without aiming to force devs to change their behaviour.
Hello @rkg and @cba,
I am glad that you agree with our vision, and I am not surprised by your feedback that, especially in large companies, there is a need for some sort of “ROI” for the management.
I just had to make my (our company’s) point before discussing the options.
As I wrote above, I doubt that we will have anything related to ROI or “extended” reporting out of the box in a reasonable timeframe:
- We have a lot of exciting stuff that we want to add to the product in the future (which clearly needs even more developer bandwidth than we have today), and the above is clearly not a top priority
- When speaking of “extended” reporting, typically every company has different requirements/opinions on what should be reported, so unless we made that reporting highly configurable (which would make the feature more complex), we would not be able to deliver something that fits the needs of the majority.
But you can do anything you want with the APIs (see the 3 key APIs I listed in my previous post). You may also look at community tools that provide some out-of-the-box capabilities that may correspond to your needs, for instance the bitegarden plugins or sonar-tools.
Note: My suggestion of the above tools is… just a suggestion. It’s not an endorsement or any guarantee of suitability, quality, or stability. They are absolutely not affiliated with SonarSource. There may be other tools that are more suitable for you; I’ll let you google that.