We are working on a repo that contains a lot of legacy code. Many source files are too costly to cover with unit tests, yet when people make small changes to those files, e.g. adding or removing parameters from a function, they fail the coverage criteria.
So our questions are:
Is it possible to have a rule such that if a source file has no unit test coverage in the base version, it is not gated on for new commits? Then any change made to it would pass the coverage check.
Or could we configure a rule so the coverage check passes as long as the overall coverage rate doesn’t decrease? In that case, changes to uncovered code shouldn’t fail the coverage check, because that code wasn’t covered in the previous version either.
SonarQube just reads the coverage report it is passed and then applies additional exclusions if they are configured. It’s not possible to configure something that checks whether a file already has coverage to determine whether it’s included.
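For reference, those exclusions are declared statically, for example in `sonar-project.properties` (the paths here are purely illustrative):

```properties
# Files matching these patterns are ignored for coverage only.
# Patterns are fixed at analysis time; SonarQube does not derive
# them from what the previous coverage report contained.
sonar.coverage.exclusions=src/legacy/**/*.c, src/vendor/**
```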
I guess you could configure some kind of dynamic exclusion (e.g., based on whether a file has existing coverage) by writing a custom script that runs before the SonarQube analysis. This script could inspect the repo and the coverage report and update the sonar.coverage.exclusions property accordingly (a rough sketch follows below).
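Something along these lines could work. This is a minimal sketch, assuming a Cobertura-style `coverage.xml` baseline report and a `sonar-project.properties` file in the repo root; both file names and the report format are assumptions, so adapt them to your setup:

```python
#!/usr/bin/env python3
"""Build sonar.coverage.exclusions from a baseline coverage report.

Sketch only: assumes a Cobertura-style coverage.xml produced against
the base version, and a sonar-project.properties in the repo root.
"""
import xml.etree.ElementTree as ET
from pathlib import Path

BASELINE_REPORT = Path("coverage.xml")            # assumed baseline report
PROPERTIES_FILE = Path("sonar-project.properties")

def uncovered_files(report: Path) -> list[str]:
    """Return source files whose baseline line coverage is zero."""
    root = ET.parse(report).getroot()
    files = set()
    # In Cobertura reports, each <class> element carries a filename
    # and a line-rate attribute (fraction of lines covered).
    for cls in root.iter("class"):
        if float(cls.get("line-rate", "0")) == 0.0:
            files.add(cls.get("filename"))
    return sorted(files)

def main() -> None:
    exclusions = ",".join(uncovered_files(BASELINE_REPORT))
    # Drop any previously written value, then append the fresh one.
    lines = [
        line for line in PROPERTIES_FILE.read_text().splitlines()
        if not line.startswith("sonar.coverage.exclusions=")
    ]
    lines.append(f"sonar.coverage.exclusions={exclusions}")
    PROPERTIES_FILE.write_text("\n".join(lines) + "\n")

if __name__ == "__main__":
    main()
```

You would run this as a CI step before invoking the scanner, so the property is in place when the analysis starts.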
That said, it seems like a lot of fuss. I’m not entirely sold that it’s a good idea to just ignore coverage on existing files because they’re old. What a good time to finally write tests! (I know…)
I’d like this, and we get a lot of requests for a kind of “ratcheting” Quality Gate for coverage (only allow coverage to get better, not worse). I’ll flag this for a PM to look at and log accordingly.
Thanks for your insight! I can totally see your point here, and as Colin explains, it could be tackled with a fussy configuration, which may not be ideal.
I have recorded your problem. Note that it is not something planned on the short-term roadmap, but we will monitor it.