We use SonarQube with Azure DevOps (hosted agents) for our build pipelines, and I would like to implement pipeline caching for the Sonar analyzers to cut about 7 minutes off my build time (time spent downloading the analyzers on every build).
I’m having difficulty figuring out how to download the file via the REST API (an authentication issue): I don’t want to use a token tied to an individual user, and I can’t tell how to piggyback on the managed service authentication.
What would work for me is if the installed_plugins JSON could be downloaded to a file in the .sonar/conf folder during the preparation step; then I could use a hash of that file as part of the cache key.
I can download the server version now, since that doesn’t require authentication, and it gets me partway there, but I’d really like to include all the analyzer versions in the cache key as well.
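One way to turn the analyzer list into a cache key, once the JSON is available, is to hash just the plugin keys and versions so that any analyzer update produces a different key. This is a minimal sketch; the `plugins_cache_key` helper and the sample payload are my own illustration, assuming the response is shaped like SonarQube’s `api/plugins/installed` output:

```python
import hashlib
import json

def plugins_cache_key(installed_plugins_json: str) -> str:
    """Derive a stable cache key from an installed-plugins JSON payload.

    Only each plugin's key and version matter for cache invalidation;
    sorting makes the key independent of response ordering.
    """
    data = json.loads(installed_plugins_json)
    parts = sorted(f"{p['key']}@{p['version']}" for p in data.get("plugins", []))
    return hashlib.sha256("|".join(parts).encode("utf-8")).hexdigest()

# Hypothetical payload shaped like a SonarQube installed-plugins response.
sample = (
    '{"plugins": ['
    '{"key": "csharp", "version": "8.51"},'
    '{"key": "javascript", "version": "9.13"}'
    ']}'
)
print(plugins_cache_key(sample))  # 64-char hex digest
```

Because the digest changes whenever any analyzer version changes, writing it to a file and folding that file into the pipeline cache key would force a cache miss exactly when the analyzers are updated.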
I’m not sure what you are trying to achieve here. The SonarQube scanners already rely (by default) on the
$HOME/.sonar folder to avoid downloading plugins over and over. So if you configure your pipeline to cache this
$HOME/.sonar folder, that will save the 7 minutes from your build time.
I’m trying to ensure that whenever the analyzers are updated, it causes a cache miss in the DevOps caching component.
This is a built-in feature of the scanners, so you do not have to worry about it or handle it on your own.
Yes, I do. It takes 6+ minutes just to download the analyzers, which is the whole reason I’m looking to set up caching.
I use hosted Azure Pipelines agents, which are containers. SonarQube’s cache doesn’t survive from one run to the next because the container is destroyed after each run. This is the whole reason Azure Pipelines has a caching component: pipeline caches live in persistent storage, and the caching component copies the cached files back into the new container.
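Putting the pieces above together, the persistent-cache setup might look something like this sketch using the Azure Pipelines `Cache@2` task. The `sonar-cache-key.txt` filename is my own choice, and this version only folds in the server version (which, as noted, needs no authentication); ideally the analyzer list would be appended to the same file:

```yaml
steps:
  # Write something version-dependent to a file; Cache@2 hashes file
  # segments of the key, so a changed file forces a cache miss.
  - script: curl -fsS "$(SONAR_HOST_URL)/api/server/version" > sonar-cache-key.txt
    displayName: Write SonarQube server version for cache key

  - task: Cache@2
    inputs:
      key: 'sonar | "$(Agent.OS)" | sonar-cache-key.txt'
      path: $(HOME)/.sonar
    displayName: Cache SonarQube analyzers
```

This restores $(HOME)/.sonar into the fresh container before the scanner runs, so the scanner’s own download cache survives across runs, and the key file invalidates it when the server (and, if included, the analyzers) change.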