Is it ok to cache the Sonar plugins in an Azure DevOps Pipeline?

Hi!

I was wondering if it’s ok to cache the Sonar plugins cache folder in an Azure DevOps pipeline.

variables:
  SONAR_PLUGINS: C:\Users\VssAdministrator\.sonar\cache

- task: Cache@2
  inputs:
    key: 'sonar | "$(Agent.OS)" | $(Build.Repository.Name)'
    path: $(SONAR_PLUGINS)
  displayName: cache sonar plugins

This saves you having to download the plugins every time, which takes between 30 and 40 seconds.
But I’m not sure if I’m always using the cached version of the plugins, or if the SonarCloud plugin updates its cache when it finds newer versions.

Any recommendations here?

Hey there.

(Adapting an internal answer from @Julien_HENRY, thanks Julien!)

The problem is that Azure Pipelines caches are immutable. Since analyzers can be updated on the server without any change to your project files, there is no easy way to make this feature useful.

With the way you're using the Cache task, this is what will happen:

First run of the pipeline:

  • cache restore finds nothing (cache miss)
  • sonar-scanner will download plugins: pluginA-1.0.jar, pluginB-1.0.jar and store them in C:\Users\VssAdministrator\.sonar\cache
  • cache will be created for the key with content pluginA-1.0.jar, pluginB-1.0.jar

Second run of the pipeline (no changes on SC side):

  • cache is restored with content pluginA-1.0.jar, pluginB-1.0.jar (cache hit)
  • sonar-scanner will skip download of plugins since they are already in the cache

Third run of the pipeline (pluginA has been updated to version 1.1 on SC):

  • cache is restored with content pluginA-1.0.jar, pluginB-1.0.jar (cache hit)
  • sonar-scanner will skip the download of pluginB-1.0.jar since it is already in the cache, but will download pluginA-1.1.jar

Fourth run of the pipeline (pluginB has been updated to version 1.1 on SC):

  • cache is restored with content pluginA-1.0.jar, pluginB-1.0.jar (cache hit)
  • sonar-scanner will download pluginA-1.1.jar and pluginB-1.1.jar

As you can see, after a while, the entire content of the cache will be outdated, and the scanner will download everything from scratch again.
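
If you want to see when this happens, the Cache@2 task has a cacheHitVar input that records whether the restore was an exact hit. A minimal sketch of your snippet with it (the variable name SONAR_CACHE_RESTORED is just illustrative):

- task: Cache@2
  inputs:
    key: 'sonar | "$(Agent.OS)" | $(Build.Repository.Name)'
    path: $(SONAR_PLUGINS)
    # set to 'true' on an exact key hit, 'false' otherwise
    cacheHitVar: SONAR_CACHE_RESTORED
  displayName: cache sonar plugins

# log the result so you can correlate cache hits with scanner download times
- powershell: Write-Host "Sonar plugin cache hit: $(SONAR_CACHE_RESTORED)"
  displayName: report sonar cache status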

I think there is a more complex way to make the cache work properly, but it would require some extra setup.
Something like (not tested):

- script: |
    wget -qO sonar-scanner-hash.txt https://mySonarQubeServer/batch/index
    wget -qO sonar-plugins.json https://mySonarQubeServer/api/plugins/installed
  displayName: fetch scanner and plugin versions
- task: Cache@2
  inputs:
    key: '"sonar-scanner" | sonar-scanner-hash.txt | sonar-plugins.json'
    path: '/home/vsts/.sonar/cache'

With this strategy, the cache key is dynamic and relies on what is on the server. Every time a plugin is updated on SQ/SC, the key changes, the old cache is ignored, and all plugins are downloaded from scratch. The good point, compared to other CIs, is that the cache will not grow indefinitely.
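
One extra note: if your SonarQube server does not allow anonymous access, those two endpoints need credentials. A sketch of the same downloads with a user token, assuming SONAR_TOKEN is defined as a secret pipeline variable (SonarQube accepts a token as the Basic-auth username with an empty password):

- script: |
    # -u "<token>:" sends the token as the Basic-auth username with an empty password
    curl -fsS -u "$(SONAR_TOKEN):" https://mySonarQubeServer/batch/index > sonar-scanner-hash.txt
    curl -fsS -u "$(SONAR_TOKEN):" https://mySonarQubeServer/api/plugins/installed > sonar-plugins.json
  displayName: fetch scanner and plugin versions (authenticated)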

Thanks for the quick reply, that sounds pretty good. In any case, I'm also reassured that SonarCloud will definitely download newer plugins again.
Feature request: if a future version of the Prepare or Analyze step downloaded these two files automatically, the manual download step could be dropped.

In case anyone stumbles across this post looking for a solution for a Windows pipeline, here is my customized pipeline:

variables:
  SONAR_PLUGINS: C:\Users\VssAdministrator\.sonar\cache

stages:
- stage: Build
  jobs:
  - job:
    steps:        
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          Invoke-WebRequest https://sonarcloud.io/batch/index -OutFile sonar-scanner-hash.txt
          Invoke-WebRequest https://sonarcloud.io/api/plugins/installed -OutFile sonar-plugins.json
      displayName: Get SonarCloud Plugins List

    - task: Cache@2
      inputs:
        key: '"sonar-scanner" | sonar-scanner-hash.txt | sonar-plugins.json'
        path: $(SONAR_PLUGINS)
      displayName: Cache SonarCloud Plugins
      
    - task: SonarCloudPrepare@1
      displayName: 'Prepare SonarCloud'
    ...
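
One caveat on the hard-coded path: C:\Users\VssAdministrator only exists on Microsoft-hosted Windows agents. If your agent runs under a different account, deriving the path from the agent's environment should be more portable; a sketch, following the $(UserProfile) pattern used in Microsoft's own caching examples:

variables:
  # USERPROFILE resolves to the build account's home directory on Windows agents,
  # so this works even when the agent does not run as VssAdministrator
  SONAR_PLUGINS: $(USERPROFILE)\.sonar\cache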
