We are doing our first tests with SonarQube v2025.1 (standard experience).
I ran a new scan for a project and afterwards downloaded the regulatory report for the newly updated branch. For some reason, the CSV files that should contain the open findings on new code and on overall code are incomplete or completely empty, although there are issues in the respective branch.
In the attached screenshots you can see that there are lots of issues in the branch, but only one is listed in the CSV. For other projects the CSV was even completely empty.
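To cross-check the numbers, one way is to query the SonarQube Web API's documented api/issues/search endpoint and tally unresolved issues by type, then compare that against the rows in open_findings_on_overall_code.csv. This is only a minimal sketch; the server URL, project key, branch, and token are placeholders, not values from this thread.

```python
# Sketch: count unresolved issues per type via the SonarQube Web API
# (api/issues/search), to compare against the regulatory report CSVs.
# Server URL, project key, branch, and token are placeholders.
import json
import urllib.parse
import urllib.request


def build_search_url(base_url, project_key, branch, page=1):
    """Build an api/issues/search URL for unresolved issues on one branch."""
    params = urllib.parse.urlencode({
        "componentKeys": project_key,
        "branch": branch,
        "resolved": "false",
        "ps": 500,  # maximum page size
        "p": page,
    })
    return f"{base_url}/api/issues/search?{params}"


def count_by_type(issues):
    """Tally a list of issue dicts by their 'type' field
    (BUG, VULNERABILITY, CODE_SMELL)."""
    counts = {}
    for issue in issues:
        counts[issue["type"]] = counts.get(issue["type"], 0) + 1
    return counts


def fetch_issue_counts(base_url, project_key, branch, token):
    """Fetch the first page of unresolved issues and count them by type."""
    url = build_search_url(base_url, project_key, branch)
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return count_by_type(data["issues"])
```

If the CODE_SMELL count from the API roughly matches the number of rows missing from the CSV, that would confirm the filtering described later in this thread.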
Thanks for the fast answer. Yes, this could be the cause; at least it would match the numbers.
But I think this does not make much sense. In fact, it makes the report useless for us.
Every auditor wants to see all open issues, not an already filtered, incomplete list. This only causes questions and confusion.
We have a lot of Code Smell rules in our Quality Profile with severity Critical or even Blocker, and I see no reason not to list the related issues there.
More reasons against removing Code Smells:
The CSV file is called "open_findings_on_overall_code.csv", which gives no hint that it is not a complete list of all issues.
The quality_profile.csv file contains all rules, including the ones related to code smells.
The report.pdf shows the complete number of open issues. How should anyone understand why some of them are missing from the list of issues?
Why are code smells not included in the issue lists?
Thanks for the feedback. I'll pass it along to the right PM (and make sure the documentation accurately reflects how it works today).
For what it's worth, the justification when the feature was developed was:
Currently, only operational risks are logged in the list of findings: bugs, vulnerabilities, and hotspots. Code smells are excluded from the list to avoid generating confusion about the risks introduced by the release.
We are currently reviewing our feedback on Regulatory Reports to bring some improvements in the next quarter. I will add this to our insights to review further and see whether we need to adjust the report or the communication about what it reports on.
I would love to hear more from you on this topic, or any additional feedback you might have (whether by text post or by scheduling a chat). Would you be interested? If yes, I am sharing a calendar booking link here.
Good to hear that you are interested in our feedback to improve the regulatory reports.
I will think about it, prepare something and give you some feedback at the end of the week.
I did some tests and now I am ready to share some feedback regarding the content of the zip file. The most important thing for us would be to include all issues in the CSV files that contain the issue lists. I have also added some more things that I'd like to see improved or changed.
open_findings_on_new_code.csv and open_findings_on_overall_code.csv
The names are currently misleading: the files should really contain all issues. Why exclude some?
The PDF also shows the complete number in the Project Ratings section, which is confusing.
We have a lot of Code Smell rules in our Quality Profile with severity Critical or even Blocker, and I see no reason not to list the related issues there.
Every auditor would also expect to see a full list of issues, not one with a filter already applied.
The same applies to the resolved findings lists.
analysis_parameters.txt
sonar.sources seems to contain a list of all source files, but only the very first ones are shown; the rest appears to be cut off. I don't see much additional value in this information.
quality_gate.csv
The CSV could additionally contain the name of the Quality Gate that was used.
quality_profiles.csv
In addition to the per-rule information, it could also contain the names of the Quality Profiles that were used.
The Impact columns probably do not come from "Classic Mode" and are a bit confusing, as we cannot see them in the GUI.
report.pdf
In the Files section, the name of the PDF report itself is wrong, or the file regulatory_report_summary.pdf is missing from the zip file.
In the Failed Conditions section, the text size is normal if no condition failed, but very small when a failed condition is printed.
The report itself currently does not mention that the included issue lists are incomplete; in any case, in our opinion the lists should be adapted to contain all issues.
Please let me know if something is unclear or if additional feedback is needed.