Generic Test: Where can I find message and stacktrace in the SQ UI?

Hello there,

I am using the Generic Test Execution Report Format and have created an XML file according to the description: Generic Test Data | SonarQube Docs

The description includes these XML tags:

  • message (mandatory): short message describing the cause
  • stacktrace (optional): long message containing details about failure|error|skipped status
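
For reference, here is a minimal sketch of such a report (the file path and test case names are invented for illustration); the report is typically passed to the scanner via the sonar.testExecutionReportPaths analysis parameter, and the message attribute plus the element body (the stacktrace) are the two fields in question:

<testExecutions version="1">
  <file path="tests/example_test.cpp">
    <testCase name="test_ok" duration="5"/>
    <testCase name="test_failing" duration="100">
      <!-- message = short description of the cause; the element body holds the stacktrace -->
      <failure message="assertion failed: expected 1, got 2">full stack trace here</failure>
    </testCase>
    <testCase name="test_skipped" duration="0">
      <skipped message="not run on this platform"/>
    </testCase>
  </file>
</testExecutions>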

Where can I see the values of the XML report in the SQ UI?

Regards,

Hi,

Ehm… I’m not sure you can anymore. We stripped some of that execution data out several years ago. If it’s still available anywhere, I’d start in the Measures page, with the Coverage → Tests → Errors listings.

 
HTH,
Ann

Hi Ann,

I’d start in the Measures page, with the Coverage → Tests → Errors listings.

I did a test; the values are not visible in the UI.

Looking into the code, you are still reading the value.

But it seems it is not used?

The API doing this is marked deprecated:

import org.sonar.scanner.deprecated.test.DefaultTestCase;
import org.sonar.scanner.deprecated.test.DefaultTestCase.Status;
import org.sonar.scanner.deprecated.test.DefaultTestPlan;
import org.sonar.scanner.deprecated.test.TestPlanBuilder;

When do you plan to update the Generic Test sensor and the API?

Regards,

Hi,

I’m not aware of any plans to update the Generic Test format or the API.

 
Ann

Hello Ann,

Why has this feature been deprecated?

My team and I are working with Azure DevOps CI/CD pipelines for a C++ project, and it would have been really helpful to see unit test results in SonarQube.
We do have error and success metrics, but for more details (which tests failed, the reasons, etc.) we are forced to read the pipeline logs, as we have a lot of small repositories and it is difficult to have them all locally.

Best regards,

Hello Ann,

We are also wondering why we are not able to see the details either, as it was possible in version 6.7 (at least in that one).
We were expecting to find details like those shown in the screenshot in the documentation, Seeing Tests - SonarQube-6.7 (end of page):

But instead, we can see something less detailed like this:


Useful for metrics, but less so for unit test reports.

Best regards,

Hi @DavidAuger,

Welcome to the community!

SonarQube 6.7 was a long time ago. A lot has changed since then.

I’ll be honest & say that as an organization we’ve never really believed in those test metrics. Why? Because we feel that if you’ve got failing tests, the pipeline should stop then & there. You shouldn’t just count the failures & move on. That’s why we’ve slowly but surely been moving away from these metrics.

 
HTH,
Ann

Hi Ann,

If you don’t want to fix it, you should at least update the documentation.

Regards,

Hi @guwirth,

What do you think should be fixed / updated in the docs?

 
Ann

Hi Ann,

What do you think should be fixed / updated in the docs?

These two XML tags are no longer supported:

  • message (mandatory): short message describing the cause
  • stacktrace (optional): long message containing details about failure|error|skipped status

You should give at least a hint that they are not visible in the UI.

Regards,

Hi,

Well, they’re still supported (analysis doesn’t error out), but fair point. I’ll flag this for the docs team.

 
Thx,
Ann

Hi Ann,

Sorry for the delay (was off some days).

Why? Because we feel that if you’ve got failing tests, the pipeline should stop then & there. You shouldn’t just count the failures & move on. That’s why we’ve slowly but surely been moving away from these metrics.

We could debate about this for a while, but I will try not to.
I just want to say that this statement may not be that clear-cut for everyone (and maybe not for us here).
I worked on a project back then where clients asked for feature evolutions knowing perfectly well that they would break the algorithm for other cases (leaving time to properly fix/change those cases with their users). Keeping those errors visible helped us all solve the case through debate and design.

And if I may be a bit more incisive following your statement: why keep error metrics at all? If we expect to see only “success” in SQ because the pipeline stops beforehand, there is no need for these metrics :slight_smile:
(I mean the “number” metrics: counting “errors”, “skipped”, percentage of success, etc.)

they’re still supported (analysis doesn’t error out), but fair point. I’ll flag this for the docs team.

Yes, please.
I understand that they are still supported, but there is no mention that they are not used or not visible.
And with no indication, we would naturally expect them to show up. Adding to that, they were shown before, which makes it a level more confusing.

Best regards,
David


Hi,

Thanks for sharing this. It’s the first good argument I remember seeing for allowing a build with failed tests to move forward.

 
:smiley:
Ann

I’ll be honest & say that as an organization we’ve never really believed in those test metrics. Why? Because we feel that if you’ve got failing tests, the pipeline should stop then & there. You shouldn’t just count the failures & move on. That’s why we’ve slowly but surely been moving away from these metrics.

In our case, sometimes the tests only fail in the CI and are very hard to reproduce locally. We want to store the test results in order to troubleshoot them.
