Does Sonar scanner have a fixed timeout when downloading plugins?


Right now we’ve hit problems related to plugin downloads.

Even though we run an Enterprise instance, we still need some custom plugins.
We run one central SonarQube instance that is accessed from several network areas, which means proxies and web gateways (virus scan, …) are involved, so the network traffic is slowed down.

The Sonar analysis fails with a socket timeout because the scanner cannot download one plugin (of type ‘external’) within 5 minutes, even though that plugin isn’t needed for the analysis at all.

I’ve tried increasing http.socket.timeout, but that didn’t work either.

I searched and found a workaround using curl:
curl -vvv -u squ_xxxxxxxx: --proxy http://xxx:xxx -o xxx-plugin.jar https://xxx/api/plugins/download?plugin=xxx

The plugin is downloaded in 5 min 30 sec, and curl uses exactly the same network settings as the Sonar scanner.

It seems the Sonar scanner doesn’t respect an increased timeout value!?
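For reference, the same download endpoint can be scripted to pre-fetch plugins with a generous timeout before the scanner runs. This is only a sketch: the server URL, token, and plugin keys below are placeholders, not values from this thread.

```shell
#!/bin/sh
# Sketch: pre-fetch plugins through the same api/plugins/download
# endpoint used in the curl workaround above. SONAR_URL, SONAR_TOKEN
# and the plugin keys are placeholders for your own values.
SONAR_URL="${SONAR_URL:-https://sonarqube.example.com}"
SONAR_TOKEN="${SONAR_TOKEN:-squ_xxxxxxxx}"

# Compose the download URL for a given plugin key.
plugin_url() {
  echo "${SONAR_URL}/api/plugins/download?plugin=$1"
}

# Download each plugin with a 30-minute cap, well above the roughly
# 5-minute limit the scanner appears to enforce.
fetch_plugins() {
  for plugin in "$@"; do
    curl --fail -sS --max-time 1800 \
         -u "${SONAR_TOKEN}:" \
         -o "${plugin}-plugin.jar" \
         "$(plugin_url "${plugin}")"
  done
}
```

If a proxy sits in between, add the same `--proxy http://...` option as in the workaround above.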



Hi Gilbert,

Remind me: you’re on a current version, right?


Hi Ann,

SonarQube production 9.9 LTS, SonarQube test 10.1
Sonar CLI scanner (latest)

But we already had the same problems with older versions (SonarQube, Sonar CLI scanner).
It seems there is a fixed timeout of around 5 minutes.

I will try to dive into the sources, but there are a bunch of other obligations with higher priority right now.
Glad I’ve found a workaround via curl.



Hi Gilbert,

Thanks for the confirmation. I’ve flagged this for the team.



That’s correct. I created SONAR-19994 to fix that and allow a custom timeout to be defined for plugin downloads. Many thanks for the feedback :+1:


We run our workloads on ephemeral containers, so to reduce the number of downloads over the Internet we use a cache file (local to our network) to pre-download the plugins onto our containers before calling the Sonar runner. I recommend you do something similar, so that you are less likely to hit timeouts or similar issues when fetching binaries over the Internet. If the cache is not up to date, the runner will download the correct binaries as needed, and the next cache refresh should pick up the updated ones.


Hello Stephane,

Can you provide more details about how you cache the plugins?

Best regards

Hi @jan.schatz,

The actual implementation will really depend on your CI/CD infrastructure.

The gist of it is:

  • Have a pipeline that runs on a regular basis (once a week is fine) and can be re-run on demand.
  • Run the Sonar runner against some code in the languages you want to support, or against your main repository.
  • Create an archive from the cache directory.
  • Push the archive to a shared location.

To re-use the cache, before calling the Sonar runner:

  • Pull the most recent cache from the shared location.
  • Extract the cache into the home directory.
  • Run the Sonar runner.
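Under a few assumptions (a POSIX shell, `aws s3` standing in for the shared location, and the default `${HOME}/.sonar` cache directory), the steps above can be sketched as:

```shell
#!/bin/sh
# Sketch of the two stages described above. The shared location
# (an S3 URL here) is an assumption; substitute your own storage.
SHARED_ARCHIVE="s3://example-bucket/sonar_cache.tgz"

# Weekly pipeline: run the scanner once to populate the plugin cache,
# then archive and publish the cache directory.
refresh_cache() {
  sonar-scanner || true   # exit code ignored; we only want the cache
  tar -czf sonar_cache.tgz -C "${HOME}/.sonar" cache
  aws s3 cp sonar_cache.tgz "${SHARED_ARCHIVE}"
}

# Per-build step: pull and extract the latest cache, then run the
# scanner, which finds the plugins already in place.
reuse_cache() {
  aws s3 cp "${SHARED_ARCHIVE}" sonar_cache.tgz
  mkdir -p "${HOME}/.sonar"
  tar -xzf sonar_cache.tgz -C "${HOME}/.sonar"
  sonar-scanner
}
```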

On the pipelines that call Sonar, before we call Sonar we fetch the cache archive and extract it into the Sonar cache directory. When the Sonar runner starts, it detects the cached plugins and in most cases does not need to download them. By default the cache is under "${HOME}/.sonar/".

Since we run Sonar for most of our repositories, and we run pipelines through Jenkins under EKS in AWS, we ended up creating a custom Jenkins library so that our Sonar stages only need to declare installSonarPluginsCache() before calling our custom sh ''. The cache is generated from a custom sonar_cache pipeline that runs once a week and takes under 2 minutes to complete.

For compression we use zstandard, which ends up being noticeably smaller than a regular tgz. We use `tar --exclude='*.jar_unzip' --zstd -cvf sonar_cache.tar.zst -C "${HOME}/.sonar" cache` to create the file and `tar --zstd -C "${HOME}/.sonar" -xf "${cacheArchive}"` to decompress it.

You can see if the cache is working by looking at the output of the runner.

Without the cache, downloading from the Internet took 20s:

INFO: User cache: /home/ci/.sonar/cache
INFO: Load/download plugins
INFO: Load plugins index
INFO: Load plugins index (done) | time=660ms
INFO: Load/download plugins (done) | time=20093ms

With the cache:

INFO: User cache: /home/ci/.sonar/cache
INFO: Load/download plugins
INFO: Load plugins index
INFO: Load plugins index (done) | time=236ms
INFO: Load/download plugins (done) | time=512ms

In our case, pulling the cache and extracting it takes 5-6 s, so we save about 15 s per pipeline on average. The cache folder is 291 MB and the compressed archive is 251 MB; the plugins are already gzip-compressed, but zstandard is able to compress them a bit more.


Hello @sodul,

Thank you for the very detailed description.
When you run sonar-scanner on some random or real-life code, this also creates a new project entry on the server. Did you find a way to circumvent this?


You are correct; we actually run it with no project configured. The runner fetches all the plugins before checking whether the project exists:

INFO: ------------------------------------------------------------------------
INFO: ------------------------------------------------------------------------
INFO: Total time: 30.257s
INFO: Final Memory: 20M/53M
INFO: ------------------------------------------------------------------------
ERROR: Error during SonarScanner execution
ERROR: Project not found. Please check the 'sonar.projectKey' and 'sonar.organization' properties, the 'SONAR_TOKEN' environment variable, or contact the project administrator
ERROR: Re-run SonarScanner using the -X switch to enable full debug logging.

We pretty much run it with no project configured, ignore the non-zero exit code, and then proceed with archiving the freshly populated cache.
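A minimal sketch of that warm-up run, assuming the default cache location; the archive name is illustrative:

```shell
#!/bin/sh
# Populate the plugin cache with no project configured. The scanner
# downloads all plugins before failing on the missing project, so the
# non-zero exit code is discarded and the cache is archived anyway.
warm_cache() {
  sonar-scanner || true
  if [ -d "${HOME}/.sonar/cache" ]; then
    tar -czf sonar_cache.tgz -C "${HOME}/.sonar" cache
  fi
}
```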


So you came up with the same dirty workaround. :smile:

You can “fix” the exit code by grepping for the expected error message like this:

sonar-scanner -X | tee >(grep "You must define the following mandatory properties for 'Unknown': sonar.projectKey")

The tee makes it so we can still see the output of the command while piping it to grep.
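A variant of the same idea that only swallows the expected failure: capture the output and succeed only when the scanner passed or printed the expected missing-project message. The exact message text varies by scanner version, so treat the grep pattern as an assumption.

```shell
#!/bin/sh
# Run the scanner; tolerate only the expected "missing project" error.
run_scanner_for_cache() {
  out="$(sonar-scanner -X 2>&1)" && { printf '%s\n' "${out}"; return 0; }
  printf '%s\n' "${out}"
  # Succeed only if the failure is the expected missing-project message
  # (both error variants seen in this thread mention sonar.projectKey).
  printf '%s\n' "${out}" | grep -q "sonar.projectKey"
}
```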

@ganncamp, it would be really nice if sonar-scanner had a command line option that just made it stop after downloading everything.


As a followup, I learned yesterday that we’re planning to start work “soon” on limiting download to only those plugins that are needed for the current analysis. The work will be done first on the SonarCloud side and then, at some point, ported over to SonarQube.

I have no dates or versions for you, so don’t bother asking. :smile:



@ganncamp limiting to only the needed plugins would be good; on the other hand, this means that our current caching trick will break (we analyze nothing) and we will need to change how we generate the cache. Would there be a way to tell the Sonar runner to only download caches for a specific set of languages?

Something like --download-plugins=python,go,terraform,xml,nodejs. This would allow us to more easily cache the plugins, but also to have different sets of caches for our microrepos that use fewer plugins.

Hi @sodul,

No work has started yet, but the current plan includes a flag to disable the selectivity and download everything.


Hi again @sodul,

I’ve just talked to the guy who’s writing the spec. We’re not going to make special provision for this, but he believes that once the changes we’re contemplating are in place, this will be scriptable.



I want to thank you all for being with me trying to get things together. I really appreciate it. I pray for a long life for you all.