An approach to measuring how frequently developers add automated tests to their codebases

I want visibility into how frequently or regularly developers in an organisation write automated tests (unit tests) in the codebases they contribute to. The goal is to measure the outcome of a quality-culture transformation being driven in the organisation, which asks developers to take more accountability for unit testing.

We are using SonarCloud, but we are not blocking PRs when code coverage fails the quality gate, and we do not foresee that changing any time soon.

Questions

  1. Is it possible to get this kind of information from SonarQube or SonarCloud?
  2. If not, has anyone (reading this post) tried doing something like this in a different way, for example by reading data from Git / GitHub? (One such approach is sketched just after this list.)
  3. From your perspective, do you see value in gathering insights like this? Or does it become pointless once we enable mandatory quality gates on code coverage?

Hey there.

I can offer the Sonar perspective:

SonarCloud might be able to help you track the number of unit tests over time if you feed it test execution data. To be completely transparent with you – this is a feature that we do not like and regularly try to deprecate/remove (it's a purely quantitative metric that only indirectly relates to Clean Code).

It would let you see a chart of the unit-test count over time – you can tell me whether or not that meets your needs!

Can I ask what’s holding you back?

Managers who do not believe in unit tests being part of the same PR as the feature code changes! :grin:

Hm… yikes!

Is it some version of this idea… or something else that motivates them?