Is it possible to bootstrap profiles and quality gates?


(Timothy Harris) #1

I am building a SonarQube Docker image based on SonarQube version 7.1. I want to bake as much as possible into the image. Presently, what I want to bake in is:

  • Plugins
  • AD connection over LDAP to retrieve groups and users.
  • Quality Profiles
  • Quality Gates

So far I have been able to:

  • Bootstrap the plugins by creating my own Dockerfile based on sonarqube:7.1 and downloading the plugins to /opt/sonarqube/extensions/plugins at docker build time.
  • Set up the AD connection by modifying the sonar.properties file and copying it into /opt/sonarqube/conf at docker build time (see the sketch below).
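
A minimal sketch of that Dockerfile (the plugin URL, version and file names here are only illustrative, not necessarily the exact ones I use):

FROM sonarqube:7.1

# Bake a plugin into the image (example plugin and version are placeholders).
ADD https://binaries.sonarsource.com/Distribution/sonar-ldap-plugin/sonar-ldap-plugin-2.2.0.608.jar /opt/sonarqube/extensions/plugins/

# Bake in the pre-configured sonar.properties (contains the AD/LDAP settings).
COPY sonar.properties /opt/sonarqube/conf/sonar.properties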

But I don’t know where to start with quality profiles and gates. Is it even possible?


(Nicolas Bontoux) #2

Hey Timothy,

Well, Quality Profiles and Quality Gates, just like any other SonarQube data (aside from the sonar.properties config), are stored in the database. Which raises the question: are you also baking a pre-populated database of some kind into your Docker image? Otherwise, my understanding is that any time you deploy the image, your instance starts from scratch in terms of analysis data. That sounds surprising to me; it would be interesting to understand your broader context and objective here.

Ultimately, also note that any SonarQube data can be manipulated via the Web API (documentation embedded in your server), so you could use that to programmatically create Quality Profiles/Gates if needed.
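
For example, a rough sketch in Python (assuming a profile backup previously exported via api/qualityprofiles/backup, a server on localhost:9000, and an admin token in SONAR_TOKEN; do double-check the endpoints against the Web API documentation embedded in your own server):

import os
import requests

SONAR_URL = "http://localhost:9000"
auth = (os.environ["SONAR_TOKEN"], "")  # token goes in the username, password stays empty

# Restore a Quality Profile from an XML backup file.
with open("my-profile-backup.xml", "rb") as backup:
    response = requests.post(
        SONAR_URL + "/api/qualityprofiles/restore",
        auth=auth,
        files={"backup": backup},
    )
response.raise_for_status()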


(Timothy Harris) #3

Hi @NicoB,

I don’t really need the analysis data to be persisted per se, just the QPs and QGs. So no, I do not want a pre-populated DB. Basically I just want to be able to spin up new containers with defined plugins, users, groups, QPs and QGs.

The actual data from the analyses is a bit dynamic, as they are executed from a Jenkins Docker image and pipeline definitions:

stage('Begin Static Analysis') {
    steps {
        script {
            // TODO: change to use pre-installed MSBuild and use PowerShell instead.
            bat "\"${tool 'SonarScanner.MSBuild.exe'}\" begin /d:sonar.login=${SONAR_AUTH_TOKEN} /k:NEON /n:Website /v:Build#${BUILD_NUMBER}-SHA#${PULL_REQUEST_FROM_HASH} /d:sonar.branch=${PULL_REQUEST_TO_BRANCH} /d:sonar.host.url=${SONAR_URL} /d:sonar.cs.nunit.reportsPaths=${WORKSPACE}\\NUnitResult.xml /d:sonar.cs.opencover.reportsPaths=${WORKSPACE}\\OpenCoverResult.xml"
        }
    }
}

So the overall use case looks like this.

Team A has a set of Jenkins pipelines that execute static analysis and use QP-A and QG-A. These are checked in to Git (Bitbucket) and executed every time Team A has a change.

Team B also has its own set of pipelines, QP-B and QG-B.

New Team C needs a different QP and QG. I update the image, build it and re-deploy.

So basically it is about controlling which Quality Gates/Profiles apply per language, as code.

But this does not look possible. I guess I could manage this through the API.


(Colin Mueller) #4

Timothy,

What’s the disadvantage of hosting a persistent SonarQube server where these multiple quality profiles and quality gates exist based on the needs of the various teams (one of SonarQube’s most basic features)?

I also imagine the availability of specific tokens would be extremely difficult to achieve with this kind of setup.

Colin


(Timothy Harris) #5

@ColinHMueller,

I am not quite sure I understand what you mean.

Basically, I am a DevOps guy. My focus is on Continuous Integration/Continuous Delivery (CI/CD) and “everything” as code, including the systems needed to support CI/CD.

I do not like, nor do I want, systems where I cannot reproduce the configuration needed to support their purpose in an automated manner.

Handling the QGs and QPs through the UI means that, if I have to reproduce the system, I have to document their manual configuration and then redo it after spinning the new system up.

That is why I want quality profiles and gates baked into the image.

I would prefer having stuff like this in the sonar.properties file, as I see it as configuration; the database would then contain only data, not configuration.

So, in my opinion, pretty much everything in the “Configuration” menu should be part of the sonar.properties file, along with the QPs and QGs.

But, as SonarQube stores these in the database, there is no reasonable way for me to do this.

So it is a moot point I guess.


(Colin Mueller) #6

Timothy,

All of what you mention is possible (storing your configuration as JSON representations of your quality profiles and quality gates and then using Web APIs and insert-your-favorite-scripting-language-here to deploy the configuration). In fact, that’s how I deployed changes to SonarQube servers long ago.
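
As an illustration, here is the kind of sketch I mean, in Python (the JSON layout is something I invented for this example, and the parameters of api/qualitygates/create and api/qualitygates/create_condition vary across versions, so verify them against the /web_api documentation on your server):

import json
import os
import requests

SONAR_URL = "http://localhost:9000"
auth = (os.environ["SONAR_TOKEN"], "")

# Example file contents (invented layout):
# {"name": "QG-A", "conditions": [{"metric": "coverage", "op": "LT", "error": "80"}]}
with open("team-a-gate.json") as f:
    gate = json.load(f)

# Create the gate, then attach each condition to it.
created = requests.post(SONAR_URL + "/api/qualitygates/create",
                        auth=auth, data={"name": gate["name"]})
created.raise_for_status()
gate_id = created.json()["id"]

for cond in gate["conditions"]:
    requests.post(SONAR_URL + "/api/qualitygates/create_condition",
                  auth=auth,
                  data={"gateId": gate_id, **cond}).raise_for_status()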

It was great when I was spinning up a new instance to test upgrades (or just messing around in a new staging environment) and wanted an instance that looked exactly like the other ones (without the analysis data or users of course) when a database clone would have taken too much time.

However, I’m a little confused by the purpose of spinning up a new SonarQube server, absent analysis history, whenever the desired configuration changes, rather than maintaining one persistent instance with multiple quality profiles and quality gates that multiple teams can take advantage of at once.

That’s what I was getting at in my last response.

Cheers,

Colin


(Nicolas Bontoux) #7

Well, the standard approach that we’ve seen adopted is for users/customers to take backups of the database, and restore their installation should anything go wrong with the server (a rare thing, though).

To complement @ColinHMueller’s point, though, it’s important to understand that SonarQube is not a volatile, fire-and-forget tool. It’s a server application that you run constantly and that tracks the quality of your projects over time (which can literally span years). Just like a Git server, for example, which you wouldn’t redeploy from scratch on a case-by-case basis: you would have a constantly operating server that fits within your CI infrastructure.

And I’m not just saying this in terms of operations practices: the entire SonarQube feature set is structured around tracking the codebase over time, with consistent results along the way (ensuring clean code since the last release, decorating PRs based on configuration as well as past triage, tracking issues across code refactorings along with any dev comments, etc.).