Is it possible to bootstrap profiles and quality gates?

I am building a SonarQube docker image based on SonarQube version 7.1. I want to bake as much as possible into the image. Presently what I want to bake into the image is:

  • Plugins
  • AD connection over LDAP to retrieve groups and users.
  • Quality Profiles
  • Quality Gates

So far I have been able to:

  • Bootstrap the plugins by creating my own Dockerfile based on sonarqube:7.1 and downloading the plugins to /opt/sonarqube/extensions/plugins on docker build.
  • Set up the AD connection by modifying the sonar.properties file and copying it into /opt/sonarqube/conf on docker build (see the Dockerfile sketch below).
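
For reference, a minimal sketch of such a Dockerfile (the plugin URL below is a placeholder for whichever plugin JARs you need, not an actual recommendation):

    FROM sonarqube:7.1

    # Bake plugins into the image at build time
    # (the URL below is a placeholder, not a real plugin).
    ADD https://example.com/plugins/sonar-some-plugin-1.0.jar /opt/sonarqube/extensions/plugins/

    # Bake in the AD/LDAP connection by shipping a pre-edited sonar.properties.
    COPY sonar.properties /opt/sonarqube/conf/sonar.properties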

But I don’t know where to start with quality profiles and gates. Is it even possible?

Hey Timothy,

Well, Quality Profiles and Quality Gates, just like any other SonarQube data (aside from the sonar.properties config), are stored in the database. Which raises the question: are you also baking a pre-populated database of some kind into your Docker image? Otherwise, my understanding is that any time you deploy the image, your instance starts from scratch in terms of analysis data. That sounds surprising to me; it would be interesting to understand your broader context and objective here.

Ultimately, also note that any SonarQube data can be manipulated via the Web API (the documentation is embedded in your server), so you could use that to programmatically create Quality Profiles/Gates if needed.
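
For example, a minimal sketch (in Python) of restoring a Quality Profile from an XML backup through the Web API; the host, token, and file name here are assumptions for illustration:

    # Sketch: restore a quality profile from an XML backup via the Web API.
    # SONAR_URL, TOKEN and the file name are assumptions; adjust to your setup.
    import requests

    SONAR_URL = "http://localhost:9000"
    TOKEN = "my-admin-token"  # a token is passed as the basic-auth username

    # The XML can be produced beforehand with api/qualityprofiles/backup
    # and kept in version control.
    with open("qp-team-a.xml", "rb") as backup:
        resp = requests.post(
            f"{SONAR_URL}/api/qualityprofiles/restore",
            files={"backup": backup},
            auth=(TOKEN, ""),
        )
    resp.raise_for_status()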

Hi @NicoB,

I don’t really need the analysis data to be persisted per se, just the QPs and QGs. So no, I do not want a pre-populated DB. Basically, I just want to be able to spin up new containers with defined plugins, users, groups, QPs and QGs.

The actual data from the analysis is a bit dynamic, as the analyses are executed from a Jenkins docker image and pipeline definitions.

    stage('Begin Static Analysis') {
        steps {
            script {
                // TODO: change to use pre-installed MSBuild and use PowerShell instead.
                bat "\"${tool 'SonarScanner.MSBuild.exe'}\" begin /d:sonar.login=${SONAR_AUTH_TOKEN} /k:NEON /n:Website /v:Build#${BUILD_NUMBER}-SHA#${PULL_REQUEST_FROM_HASH} /d:sonar.branch=${PULL_REQUEST_TO_BRANCH} /d:sonar.host.url=${SONAR_URL} /d:sonar.cs.nunit.reportsPaths=${WORKSPACE}\\NUnitResult.xml /d:sonar.cs.opencover.reportsPaths=${WORKSPACE}\\OpenCoverResult.xml"
            }
        }
    }

So the overall use case looks like this.

Team A has a set of Jenkins pipelines that execute static analysis using QP-A and QG-A. These are checked into Git (Bitbucket) and executed every time Team A has a change.

Team B also has its own set of pipelines, QP-B and QG-B.

New Team C needs a different QP and QG. I update the image, build it and re-deploy.

So basically it is about controlling, as code, which Quality Gates/Profiles apply per language.

But this does not look possible. I guess I could manage this through the API.

Timothy,

What’s the disadvantage of hosting a persistent SonarQube server where these multiple quality profiles and quality gates exist based on the needs of the various teams (one of SonarQube’s most basic features)?

I also imagine the availability of specific tokens would be extremely difficult to achieve with this kind of setup.

Colin


@Colin,

I am not quite sure I understand what you mean.

Basically, I am a DevOps guy. My focus is on Continuous Integration/Continuous Delivery (CI/CD) and “everything” as code, including the systems needed to support CI/CD.

I do not like, nor do I want, systems where I cannot reproduce the configuration needed to support their purpose in an automated manner.

Handling the QGs and QPs through the UI means that, if I have to reproduce the system, I have to document their manual configuration and then redo it after spinning the new system up.

That is why I want quality profiles and gates baked into the image.

I would prefer stuff like this to be part of the sonar.properties file, as I see it as configuration. I would keep the database containing only data, not configuration.

So, in my opinion, pretty much everything in the “Configuration” menu should be part of the sonar.properties file, along with QPs and QGs.

But, as SonarQube stores these in the database, there is no reasonable way for me to do this.

So it is a moot point I guess.

Timothy,

All of what you mention is possible (storing your configuration as JSON representations of your quality profiles and quality gates and then using Web APIs and insert-your-favorite-scripting-language-here to deploy the configuration). In fact, that’s how I deployed changes to SonarQube servers long ago.
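
As a rough illustration of that approach, here is a sketch that recreates a quality gate from a JSON file kept in version control. The JSON layout is an invented convention (not a SonarQube export format); the endpoints used are api/qualitygates/create and api/qualitygates/create_condition:

    # Sketch: recreate a quality gate from a JSON description of our own design,
    # e.g. {"name": "QG-A",
    #       "conditions": [{"metric": "coverage", "op": "LT", "error": "80"}]}
    import json
    import requests

    SONAR_URL = "http://localhost:9000"   # assumed host
    AUTH = ("my-admin-token", "")         # token as basic-auth username

    with open("qg-team-a.json") as f:
        gate = json.load(f)

    # Create the gate, then attach each condition to it.
    created = requests.post(f"{SONAR_URL}/api/qualitygates/create",
                            params={"name": gate["name"]}, auth=AUTH)
    created.raise_for_status()
    gate_id = created.json()["id"]

    for cond in gate["conditions"]:
        requests.post(f"{SONAR_URL}/api/qualitygates/create_condition",
                      params={"gateId": gate_id, "metric": cond["metric"],
                              "op": cond["op"], "error": cond["error"]},
                      auth=AUTH).raise_for_status()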

It was great when I was spinning up a new instance to test upgrades (or just messing around in a new staging environment) and wanted an instance that looked exactly like the other ones (without the analysis data or users of course) when a database clone would have taken too much time.

However, I’m a little confused by the purpose of spinning up a new SonarQube server, absent analysis history, whenever the desired configuration changes, rather than maintaining one persistent instance with multiple quality profiles and quality gates that multiple teams can take advantage of at once.

That’s what I was getting at in my last response.

Cheers,

Colin


Well, the standard approach that we’ve seen adopted is for users/customers to take backups of the database, and restore their install should anything go wrong with the server (a rare thing, though).

To complement @Colin’s point, though, it’s important to understand that SonarQube is not a volatile, fire-and-forget tool. It’s a server application that you would run constantly, and which would track the quality of your projects over time (which can literally span years). Just like a Git server, for example, which you wouldn’t redeploy from scratch on a case-by-case basis. No, you would have a constantly operating server that fits within your CI infrastructure.

And I’m not just saying this in terms of operational practices; the entire SonarQube feature set is structured around tracking the codebase over time, with consistent results along the way (ensuring clean code since the last release, decorating PRs based on configuration as well as past triage, tracking issues across code refactorings along with any dev comments, etc.).


It is not really about spinning up new instances, I guess. It is about recovery scenarios. The Sonar analysis is part of a tollgate pipeline our developers’ changes have to pass through to deliver code to our integration branch. There are lots of different repositories with different profiles and quality gates. If I lose the server and database, I do not want to spend a day or more recovering by adding them through the UI. The teams would be blocked.

But thanks for your answer. I think the Web API is the way to go.

It certainly is a way to ‘code’ the configuration, as you say, giving you the ability to programmatically re-instantiate a server with a pre-determined config. That being said, regarding the recovery scenario:

What about a (good ol’) regular database backup? In that case, all you have to do is restore the backup and relaunch SonarQube (to be accurate, you also need to keep the plugins handy, as well as the sonar.properties config file). The instance will boot just as it was at backup time; there is no need to recreate anything through the UI.


A little late, but I am also running into this challenge (how to pre-configure quality gates without relying on a database).

I might share a use case/context that may be helpful: more and more development teams are turning into ‘DevOps’ teams that have to manage their own CI/CD pipeline and infrastructure. And since the team is still responsible for delivering their custom code, the risk/reward decisions around certain things are not the same as in traditional setups. For example, is it really worth it to set up analysis history if we could just re-scan a codebase to learn its current state? The cost of maintaining analysis history means setting up your own database/backup/restore/snapshot/replica, which in a corporate environment means delays around those processes… or you could skip all of that and just re-run the pipeline to get the current state.

Other tooling, such as Jenkins, security scanning tools, unit tests, and integration tests, can all be ephemeral (no need for a database or state). SonarQube, however, introduces some challenges around whether you really need a stateful database, or whether you can get by with the embedded database and the associated pre-configuration (assuming you don’t need High Availability, which would require the database).

Returning to the original thread: the pre-configuration of quality gates is a bit problematic, I agree.

I would echo the request that initially started this thread. It would be very cool to be able to run SonarQube under build conditions. For me, spinning up a SonarQube service with quality gates and project properties stored in the GitHub repo of the project would be ideal. That way we don’t rely on a central server, and I can stop build/deployment pipelines based on quality.