Fail to download scanner-enterprise-

Must-share information (formatted with Markdown):

  • which versions are you using: 10.2.1
  • how is SonarQube deployed: zip

We have many projects across different Azure DevOps organizations. In two of them we have massive issues running SonarQube in the pipelines, and we are out of ideas as to the cause. In the worst of the two, two out of three builds fail; the other only has issues from time to time. All other projects have no issues at all, and we cannot find any difference between the projects that could explain it. The pipelines run in Azure DevOps on paid MS-hosted agents. They sometimes succeed and sometimes fail; it is not consistent, and we see no pattern of specific agents working or not. It's just random.

The error is that SonarQube (no matter whether it runs inside a Docker container in the pipeline or via the Azure extension task) fails to download scanner-enterprise-:

ERROR: Error during SonarScanner execution
org.sonarsource.scanner.api.internal.ScannerException: Unable to execute SonarScanner analysis
	at org.sonarsource.scanner.api.internal.IsolatedLauncherFactory.lambda$createLauncher$0(
	at java.base/ Method)
	at org.sonarsource.scanner.api.internal.IsolatedLauncherFactory.createLauncher(
	at org.sonarsource.scanner.api.internal.IsolatedLauncherFactory.createLauncher(
	at org.sonarsource.scanner.api.EmbeddedScanner.doStart(
	at org.sonarsource.scanner.api.EmbeddedScanner.start(
	at org.sonarsource.scanner.cli.Main.execute(
	at org.sonarsource.scanner.cli.Main.main(
Caused by: java.lang.IllegalStateException: Fail to download scanner-enterprise- to /home/AzDevOps/.sonar/cache/_tmp/fileCache8001103573241592312.tmp
	at
	at org.sonarsource.scanner.api.internal.cache.FileCache.get(
	at org.sonarsource.scanner.api.internal.JarDownloader.lambda$getScannerEngineFiles$0(
	at java.base/$3$1.accept(
	at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(
	at java.base/
	at java.base/
	at java.base/$ReduceOp.evaluateSequential(
	at java.base/
	at java.base/
	at org.sonarsource.scanner.api.internal.JarDownloader.getScannerEngineFiles(
	at
	at org.sonarsource.scanner.api.internal.IsolatedLauncherFactory.lambda$createLauncher$0(
	... 7 more
Caused by: timeout
	at org.sonarsource.scanner.api.internal.shaded.okio.Okio$4.newTimeoutException(
	at org.sonarsource.scanner.api.internal.shaded.okio.AsyncTimeout.exit(
	at org.sonarsource.scanner.api.internal.shaded.okio.AsyncTimeout$
	at
	at org.sonarsource.scanner.api.internal.shaded.okhttp.internal.http1.Http1ExchangeCodec$
	at org.sonarsource.scanner.api.internal.shaded.okhttp.internal.http1.Http1ExchangeCodec$
	at org.sonarsource.scanner.api.internal.shaded.okhttp.internal.connection.Exchange$
	at org.sonarsource.scanner.api.internal.shaded.okio.RealBufferedSource$
	at java.base/
	at java.base/java.nio.file.Files.copy(
	at org.sonarsource.scanner.api.internal.ServerConnection.downloadFile(
	at org.sonarsource.scanner.api.internal.JarDownloader$
	at
	... 19 more
Caused by: Socket closed
	at java.base/
	at java.base/
	at java.base/
	at java.base/
	at java.base/
	at java.base/
	at java.base/$
	at org.sonarsource.scanner.api.internal.shaded.okio.Okio$
	at org.sonarsource.scanner.api.internal.shaded.okio.AsyncTimeout$
	... 29 more
ERROR:
The SonarScanner did not complete successfully
05:46:09.236  Post-processing failed. Exit code: 1
The process '/opt/hostedtoolcache/dotnet/dotnet' failed with exit code 1
##[section]Finishing: SonarQube Analyze

The error sounds like a network failure. However, the SonarQube server's access log shows status 200 for this request, and all other projects (also in Azure DevOps, many of them also on MS-hosted agents) have no issues downloading this file from the same SonarQube instance. Our IT department has no idea what the reason could be: they have no limitation in place that could trigger this, and they find nothing suspicious in their logs.

Are there any further ideas as to what the reason could be? Do others see the same issue with Azure DevOps? Is there a known workaround for such an issue?



Points for already having checked the access log. That's helpful!

Two things occur to me

  • is there something “helpful” in the middle (e.g. a proxy) that might be interfering here? Yes, I know the download works for other jobs, but… maybe?
  • Maybe this is a problem with the CI-side cache…? Can you clean out the cache and reset the permissions to make sure the analysis user has full rights?
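For the second point, the cleanup might look something like this. It's only a sketch: the `SONAR_USER_HOME` override and the default `~/.sonar/cache` location are assumptions based on the path shown in the error message, so adjust for your agent setup.

```shell
# Sketch: wipe the scanner's download cache and recreate it with full
# rights for the analysis user, so the next run re-fetches the JARs.
CACHE_DIR="${SONAR_USER_HOME:-$HOME/.sonar}/cache"
rm -rf "$CACHE_DIR"
mkdir -p "$CACHE_DIR"
chmod -R u+rwX "$CACHE_DIR"
```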

At a guess, the socket is timing out waiting for the write to be successful.


> is there something “helpful” in the middle (e.g. a proxy) that might be interfering here? Yes, I know the download works for other jobs, but… maybe?

I don't know what's on the Azure side; we don't configure anything specific there. In front of our SonarQube there is a firewall on our side, but there is nothing suspicious in its logs.

> Maybe this is a problem with the CI-side cache…? Can you clean out the cache and reset the permissions to make sure the analysis user has full rights?

Tried this, but several projects using the same analysis user work without issue.

> At a guess, the socket is timing out waiting for the write to be successful.

Correct. We have now broken this down further: we set up 4 new pipelines that each do only one job: download a file via wget, echoing all debug output. We let all 4 run in parallel every 15 minutes over the weekend. Two of them downloaded the JAR file from the SQ server; the other two downloaded a 1 GB reference file from another server.
The reference pipelines had a success rate of 100%. The SQ JAR download pipelines had a success rate of 95%, which is already much higher than the regular scans, but still a strange difference.
The failures ran into:
2024-01-22 02:20:47 (6.49 MB/s) - Read error at byte 3011507 (Connection timed out). Giving up.
wget did an automatic retry on failed downloads, which auto-fixed all of them. wget was using a 15-minute timeout, so even increasing the SonarScanner timeout would not help here. Unfortunately, the SonarScanner does not seem to have an option to configure automatic retries for failed downloads. Could you perhaps suggest this to the team as an improvement for unstable network connections?
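In the meantime, a retry can be bolted on outside the scanner. A minimal sketch, assuming the pipeline step can be wrapped in a shell script (`sonar-scanner` below is a stand-in for whatever command the step actually runs):

```shell
# Generic retry wrapper: run a command up to N times before giving up.
# Usage: run_with_retries <max_attempts> <command> [args...]
run_with_retries() {
  max_attempts=$1; shift
  attempt=1
  until "$@"; do
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "failed after $max_attempts attempts" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    echo "retrying (attempt $attempt of $max_attempts)..." >&2
    sleep 10
  done
}

# e.g.: run_with_retries 3 sonar-scanner
```

This retries the whole analysis, not just the JAR download, so it wastes some time on a failure, but it would paper over the transient timeouts until a built-in retry exists.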

Edit: then again, maybe the 15-minute wget timeout is the reason for the higher success rate in the test pipelines versus the real scan pipelines, as SQ uses a 60 s timeout. Still strange, since retries succeeded quickly.


Were all the failures at the same point, byte 3011507? Or approximately there?

And did the SonarQube access log show a 200 for all the requests?

Was the reference file behind the same firewall?