We’re trying to develop a SonarQube Web API integration with some internal tools, and would like to extract quality gate results for a specific Sonar analysis given the Git commit SHA that was previously successfully analysed (project key and branch name are known, but the Git SHA is the real key). Searching by the SHA directly is not required, as long as I can iterate over a couple of project analyses until I find a match.
Given that SonarQube can extract Git information (like blame), I was expecting that an analysis could report the Git SHA of the analysed code, but I couldn’t find such info, not even in internal APIs. During my Web API investigations I found the Compute Engine task scannerContext field, which contains a lot of text info into which I could perhaps inject the Git SHA for later extraction, but that is a very hacky way to solve things.
So, is this information exposed somehow by SonarQube? If not, is there a way I can store it along with the analysis and retrieve it directly from the web API (that is, without storing the SHA-analysisId mapping somewhere else)?
SonarQube version 7.2.1 (build 14109)
Sonar Maven plugin 126.96.36.1993 (this solution might be applied to non-JVM projects in the future)
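For reference, iterating over the analysis history is possible with the api/project_analyses/search endpoint. Below is a minimal sketch; the server URL and token are placeholders, and the matching logic is left as a predicate precisely because the response does not carry the SHA:

```python
import base64
import json
import urllib.parse
import urllib.request

BASE_URL = "https://sonarqube.example.com"  # assumption: your server URL
TOKEN = "squ_example_token"                 # assumption: a SonarQube user token

def fetch_page(project_key, page):
    # api/project_analyses/search returns analyses newest-first, paged.
    params = urllib.parse.urlencode({"project": project_key, "ps": 100, "p": page})
    req = urllib.request.Request(f"{BASE_URL}/api/project_analyses/search?{params}")
    # A token is passed as the basic-auth username with an empty password.
    cred = base64.b64encode(f"{TOKEN}:".encode()).decode()
    req.add_header("Authorization", f"Basic {cred}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def find_analysis(pages, predicate):
    """Walk pages of analyses (newest first) until the predicate matches one."""
    for page in pages:
        for analysis in page.get("analyses", []):
            if predicate(analysis):
                return analysis
    return None
```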
So, we tried to push this ahead. I added a non-existent sonar.git.commit property to the Sonar Maven plugin execution, and it got saved into the scannerContext (again, I’d like something less hackish). Now I can find the specific task that generated the analysis I want, including a helpful-looking analysisId.
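If this scannerContext hack stays, pulling the injected property back out is just string parsing of the blob returned by api/ce/task?id=…&additionalFields=scannerContext. A defensive sketch (the exact line format of the blob is an assumption, so check a real response):

```python
def extract_property(scanner_context, key):
    """Pull a 'key=value' line out of the scannerContext text blob.

    The scannerContext returned by api/ce/task?additionalFields=scannerContext
    is one big string with roughly one property per line; indentation and
    bullet characters may vary, so strip defensively before matching.
    """
    prefix = key + "="
    for line in scanner_context.splitlines():
        line = line.strip().lstrip("- ").strip()
        if line.startswith(prefix):
            return line[len(prefix):]
    return None
```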
Problem is that I don’t find a way to correlate this task with the analysis data afterwards. For example, if I call
/api/measures/search_history?component=my_project&metrics=quality_gate_details I only get a date, and it seems to be the date when the client executed the Sonar plugin (highly susceptible to drifting client machine clocks). Since it matches none of the exact dates I can extract from the task (submitted, started, or executed), it is very weak as a key, and looking for the “closest date” will sooner or later return wrong data, due to concurrent builds and maybe other scenarios.
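To make the shape of that response concrete, here is a small parser that flattens it into (date, value) pairs; field names follow the documented api/measures/search_history response, and the date really is the only correlation key available, which is the whole problem:

```python
def quality_gate_history(search_history_response):
    """Flatten api/measures/search_history output into (date, value) pairs.

    The response nests values under measures[].history[]; each history
    entry carries only a date and a value, no task or analysis id.
    """
    for measure in search_history_response.get("measures", []):
        if measure.get("metric") == "quality_gate_details":
            return [(h["date"], h.get("value")) for h in measure.get("history", [])]
    return []
```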
So, any suggestion on how to pinpoint the correct analysis data from its corresponding CE task?
I’m not sure I understand the full ins and outs of what you’re trying to achieve (especially why you would like to walk back in time on Quality Gate results), however I feel like this may be helpful: https://docs.sonarqube.org/display/SONAR/Webhooks (notably the section: How can I provide additional parameters to webhooks?)
It lets you pass additional properties and have them conveyed all the way down to webhooks, which are the go-to solution for notifying external components of Quality Gate breakage.
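For illustration, the payload a webhook receiver sees includes the quality gate status and any sonar.analysis.* properties passed at scan time (e.g. -Dsonar.analysis.sha=$GIT_SHA). A sketch of extracting them, with field names taken from the documented payload shape (sonar.analysis.sha is our own property name, not a built-in):

```python
import json

def summarize_webhook(payload_json):
    """Extract the few fields we care about from a SonarQube webhook payload.

    project, qualityGate, and properties follow the documented webhook
    payload; sonar.analysis.sha is whatever we chose to pass at scan time.
    """
    payload = json.loads(payload_json)
    return {
        "project": payload["project"]["key"],
        "quality_gate": payload["qualityGate"]["status"],
        "sha": payload.get("properties", {}).get("sonar.analysis.sha"),
    }
```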
Thanks for your reply. My reason to (potentially) walk back on the quality gate history is that we’re aggregating several details about a specific deployed version, and that includes Sonar data. Of course the deployed version can be older than the latest code analysed by Sonar, hence the reason to walk back and find the correct analysis data and quality gate status.
I did not know about the properties that can be passed to the webhook, but so far I had ignored that API, mostly because our service is not constantly running. It is stateless, passive, and executed on demand; it won’t be running to receive webhook calls, because then it would need storage and would have to be a constantly running service (and if it went down and lost calls, it would be unable to find data for the analyses that happened during the outage). I see this as unnecessary complexity, since all the data is inside Sonar, and I just need a few data pieces to complete the puzzle.
sonar.analysis.* properties are indeed treated differently. Can’t they be retrieved later from some other API to make the connection I need (like api/measures/search_history), or maybe by expanding one of the existing responses?
Without digging too much into the “why” of what you’re doing, you could use a Custom Measure that you update using the POST api/custom_measures/update Web API during a build (it would be important to do so before the analysis kicks off). Custom Measures are not my favorite, but they could work if you need to store this metadata. That way you can tie a Git SHA (which I assume is available to extract in your build process) to the analysis.
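As a sketch of that approach (parameter names are taken from the 7.x Web API documentation and should be verified against your instance’s /web_api listing; the measure id would come from api/custom_measures/search):

```python
import base64
import urllib.parse
import urllib.request

BASE_URL = "https://sonarqube.example.com"  # assumption: your server URL
TOKEN = "squ_example_token"                 # assumption: a SonarQube user token

def build_update_body(measure_id, git_sha):
    # api/custom_measures/update takes the measure id and the new value.
    return urllib.parse.urlencode({"id": measure_id, "value": git_sha}).encode()

def update_custom_measure(measure_id, git_sha):
    # Run this in the build, before the Sonar analysis kicks off, so the
    # analysis snapshots the value.
    req = urllib.request.Request(
        f"{BASE_URL}/api/custom_measures/update",
        data=build_update_body(measure_id, git_sha),  # data= makes it a POST
    )
    cred = base64.b64encode(f"{TOKEN}:".encode()).decode()
    req.add_header("Authorization", f"Basic {cred}")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```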
You could also store the Git SHA as the version of your analysis, but that would probably make defining leak periods not very fun.
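If the SHA were passed as sonar.projectVersion at scan time, it would surface as an event on the analysis in api/project_analyses/search, which makes the lookup simple. A sketch, assuming a VERSION event is recorded whenever the version string changes (which it would, since every commit has a distinct SHA):

```python
def analysis_for_version(analyses, version):
    """Find the analysis whose VERSION event matches the given string.

    analyses is the 'analyses' array from api/project_analyses/search;
    each entry carries an 'events' list with category/name pairs.
    """
    for analysis in analyses:
        for event in analysis.get("events", []):
            if event.get("category") == "VERSION" and event.get("name") == version:
                return analysis
    return None
```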
Using the version might work for my project, since we don’t set version numbers at build time, always leaving a “snapshot” version string there. However, some of our projects use real versions in Sonar, so it would break for them.
If I understand correctly, a custom measure is a global value, but if it is updated before each analysis, then each analysis will have a snapshot of the value at analysis time. I just think it would be difficult in our environment because of concurrent builds of many different projects and branches, which happen a lot… at most we would need to create metrics like myProject_myBranch_gitSha, which would over time grow into thousands of unused properties for long-gone branches. It is an ugly solution, yet viable.
Thanks Colin, it’s good to know all the available paths. Any other way that I should be aware of?
A custom measure exists at the project level, although which custom measures are available to a project is defined at the global level. I’m actually not sure how custom measures work with branches; I doubt the concept exists.
Honestly, I like the webhook idea. Setting up an API on something like NodeJS that accepts webhooks and then does something like storing the contents in a database is not… too complicated, and you might find it useful for multiple use cases.
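A minimal sketch of the storage side, using stdlib sqlite3 and the documented webhook payload fields (the HTTP listener in front of it, Node or otherwise, is omitted; sonar.analysis.sha is whatever property was passed at scan time):

```python
import json
import sqlite3

def init_db(conn):
    conn.execute(
        """CREATE TABLE IF NOT EXISTS analyses (
               project TEXT, sha TEXT, quality_gate TEXT, analysed_at TEXT)"""
    )

def store_webhook(conn, payload_json):
    """Persist the few fields we need from one webhook delivery."""
    p = json.loads(payload_json)
    conn.execute(
        "INSERT INTO analyses VALUES (?, ?, ?, ?)",
        (
            p["project"]["key"],
            p.get("properties", {}).get("sonar.analysis.sha"),
            p["qualityGate"]["status"],
            p.get("analysedAt"),
        ),
    )
```

Looking up the quality gate for a deployed SHA then becomes a single SELECT against this table.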
Indeed, it is not complicated; I just wanted to avoid creating a service just for that (for now), since it is only a matter of the desired information being presented in a “linkable” manner, because Sonar already has it all. In a way, such a DB would be a small subset of the Sonar DB, only because I can’t do the linking after the webhook has fired and the API does not expose more data.
Thanks Colin, I suppose webhooks will be the way.