SonarCloud/ADO Disconnect, unreviewed hotspots not failing QG, no reporting of hotspots in ADO

I’ve been analysing some issues we’ve had with quality gates and hotspots in Azure DevOps, and I’ve carefully compared what I see in SonarCloud with what I see in ADO. I’ve collected this into screenshots, which I’ll attach here with commentary.

There’s clearly something going wrong here. In summary:

  1. SonarCloud shows a project with a failed Quality Gate. The main thing that bothers us is the unreviewed hotspot; the QG is set to fail unless 100% of hotspots have been reviewed.
  2. I can see that the pull request triggered the pipeline, which ran SonarCloud. But this run did not find the issues it reported previously: the pull request had no problem apart from coverage, which we already know is going to fail (I have a new QG which doesn’t fail on coverage).

Sorry about how the attachments come through. It was originally a PowerPoint presentation which I exported to a PDF, but I couldn’t attach a PDF here, so I had to convert it to a PNG, which came through a bit untidy.

I’m going to guess that there is something wrong with the way we have set up the ADO pipeline. Please let me know what other screenshots I might need to add.

Hi, I could only see the beginning of your image, but I’ll try to summarize your problem to see if I understand correctly:

  1. You have a main branch with an unreviewed hotspot and a failing Quality Gate
  2. You created a PR and the check shows no security hotspot

If that is indeed what you are experiencing, it is probably normal. SonarCloud concentrates on analyzing “new code” to show you whether you are introducing new issues in your code.
In the context of the PR, the code containing the security hotspot was probably untouched, so SonarCloud did not consider it. It would arguably be bad practice for a PR to fix such an issue while working on a totally separate part of the code.
The hotspot is still in the main branch and will still be shown as a problem on that branch’s quality gate.

If your review determines it is a real security issue, I would suggest a dedicated PR to solve the problem.

Please let me know if that makes sense or if I missed something.
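If it helps, you don’t have to rely on the UI to see what is outstanding on the branch: the Web API exposes hotspots via `api/hotspots/search`. Here is a rough sketch, assuming that endpoint’s documented response shape; the project key and token are placeholders you would replace with your own:

```python
import json
import urllib.request


def unreviewed_hotspots(project_key: str, token: str):
    """Fetch hotspots still awaiting review for a project.

    Assumes SonarCloud's api/hotspots/search endpoint with
    status=TO_REVIEW; treat the exact parameters and response
    fields as a sketch to verify against the Web API docs.
    """
    url = (
        "https://sonarcloud.io/api/hotspots/search"
        f"?projectKey={project_key}&status=TO_REVIEW&ps=500"
    )
    req = urllib.request.Request(url)
    # Token auth; SonarCloud also accepts the token as a
    # basic-auth username if Bearer is not supported.
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return summarize(json.load(resp))


def summarize(payload: dict):
    """Reduce a hotspots/search payload to (file, message) pairs."""
    return [
        (h.get("component", ""), h.get("message", ""))
        for h in payload.get("hotspots", [])
    ]
```

Anything that comes back from that call is what the branch quality gate is complaining about, independently of any PR analysis.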


So it won’t fail a quality gate if old problems are still present? It only fails based on new stuff?
I.e. if you have ignored these in the past and done nothing about them, as far as SonarCloud is concerned, that’s just fine?

I think what’s happened here is that when SonarCloud was introduced, it scanned the code and found problems, but no one knew what they were looking for, so they carried on working. I’ve come in somewhat later and have been analysing the data that’s currently in SonarCloud to find the actual current status of the projects. And it’s pretty ugly.

Imagine the expectation was that, when SonarCloud finds a problem in code, it would always show informationally in the PR check until it’s fixed…

So, moving forward, we have to get our devs to go over every problem in SonarCloud’s ‘backlog’ and work through them, e.g. code smells, hotspots, vulnerabilities, etc.? And would that involve clicking through each project in the web UI?

And then any new issues would show in the PR, where they’re shoved in the devs’ faces so they actually notice them in the course of their development work?
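(If it does come to working through the backlog, I’d probably script it against the Web API rather than clicking through each project. Something like this, assuming `api/issues/search` with `componentKeys` and `resolved` parameters behaves as documented; the project key and token are placeholders:)

```python
import json
import urllib.request
from collections import Counter


def fetch_open_issues(project_key: str, token: str) -> dict:
    """Call api/issues/search for unresolved issues in one project.

    Parameter names are taken from the public Web API docs, but
    double-check them; this is a sketch, not a tested integration.
    """
    url = (
        "https://sonarcloud.io/api/issues/search"
        f"?componentKeys={project_key}&resolved=false&ps=500"
    )
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def backlog_by_type(payload: dict) -> Counter:
    """Tally unresolved issues by type (CODE_SMELL, BUG, VULNERABILITY)."""
    return Counter(i.get("type", "UNKNOWN") for i in payload.get("issues", []))
```

Running that per project key would at least give us a backlog count to prioritise, instead of eyeballing each project in the UI.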