We’ve been seeing the same issue since yesterday on Bitbucket.
```
10:06:27.672 DEBUG: Found file: /opt/atlassian/pipelines/agent/build/modules/ps_facetedsearch/package.json
...
10:06:29.318 INFO: Analyzed 0 file(s) with current program
10:06:29.322 INFO: Creating TypeScript program
10:06:29.322 INFO: TypeScript configuration file /opt/atlassian/pipelines/agent/build/tests/UI/tsconfig.json
10:06:40.030 INFO: 0/49 files analyzed, current file: /opt/atlassian/pipelines/agent/build/themes/reformam/_dev/js/components/block-cart.js
10:06:50.252 INFO: 0/49 files analyzed, current file: /opt/atlassian/pipelines/agent/build/themes/reformam/_dev/js/components/block-cart.js
10:06:51.745 DEBUG: The worker thread exited with code 1
10:07:00.766 INFO: 0/49 files analyzed, current file: /opt/atlassian/pipelines/agent/build/themes/reformam/_dev/js/components/block-cart.js
10:07:10.766 INFO: 0/49 files analyzed, current file: /opt/atlassian/pipelines/agent/build/themes/reformam/_dev/js/components/block-cart.js
...
10:11:20.781 INFO: 0/49 files analyzed, current file: /opt/atlassian/pipelines/agent/build/themes/reformam/_dev/js/components/block-cart.js
10:11:29.329 ERROR: Failure during analysis
java.lang.IllegalStateException: The bridge server is unresponsive
    at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:403)
    at org.sonar.plugins.javascript.bridge.BridgeServerImpl.createProgram(BridgeServerImpl.java:490)
    at org.sonar.plugins.javascript.bridge.AnalysisWithProgram.analyzeFiles(AnalysisWithProgram.java:79)
    at org.sonar.plugins.javascript.bridge.JsTsSensor.analyzeFiles(JsTsSensor.java:132)
    at org.sonar.plugins.javascript.bridge.AbstractBridgeSensor.execute(AbstractBridgeSensor.java:79)
    at org.sonar.scanner.sensor.AbstractSensorWrapper.analyse(AbstractSensorWrapper.java:62)
    at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:75)
    at org.sonar.scanner.sensor.ModuleSensorsExecutor.lambda$execute$1(ModuleSensorsExecutor.java:48)
    at org.sonar.scanner.sensor.ModuleSensorsExecutor.withModuleStrategy(ModuleSensorsExecutor.java:66)
    at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:48)
    at org.sonar.scanner.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:64)
    at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
    at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
    at org.sonar.scanner.scan.ProjectScanContainer.scan(ProjectScanContainer.java:192)
    at org.sonar.scanner.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:188)
    at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:159)
    at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
    at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
    at org.sonar.scanner.bootstrap.ScannerContainer.doAfterStart(ScannerContainer.java:397)
    at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
    at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
    at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:125)
    at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
    at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
    at org.sonar.batch.bootstrapper.Batch.doExecute(Batch.java:57)
    at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:51)
    at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:46)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:568)
    at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
    at jdk.proxy1/jdk.proxy1.$Proxy0.execute(Unknown Source)
    at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:189)
    at org.sonarsource.scanner.api.EmbeddedScanner.execute(EmbeddedScanner.java:138)
    at org.sonarsource.scanner.cli.Main.execute(Main.java:126)
    at org.sonarsource.scanner.cli.Main.execute(Main.java:81)
    at org.sonarsource.scanner.cli.Main.main(Main.java:62)
Caused by: java.net.http.HttpTimeoutException: request timed out
    at java.net.http/jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:571)
    at java.net.http/jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:123)
    at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:398)
    ... 37 common frames omitted

10:11:29.330 DEBUG: The bridge server shut down
```
I'm on a Jenkins pipeline and I'm not using any Node.js. Do you mean I need to increase the memory of the instance where my Jenkins is running? I'm on a free-tier AWS account for my client, with 1 GB of RAM and 1 vCPU.
I'm back today, as I was not well. I will try to increase the Sonar memory; I'm still figuring out how, but the sketch below is what I plan to try.
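From what I can find, there are two memory knobs: `SONAR_SCANNER_OPTS` for the scanner's own JVM, and the `sonar.javascript.node.maxspace` analysis property for the Node.js bridge process. This is only a sketch, and the values are guesses for my small instance:

```shell
# Sketch only; the values are guesses, not recommendations.
# Heap for the scanner's own JVM:
export SONAR_SCANNER_OPTS="-Xmx512m"

# Memory (in MB) for the Node.js process that the JS/TS analyzer spawns
# (the "bridge server" from the stack trace above):
sonar-scanner -Dsonar.javascript.node.maxspace=2048
```

On 1 GB of RAM both processes compete for the same memory, though, so these values may simply not fit and a bigger instance may be unavoidable.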
Our team could not get pipelines passing reliably even after increasing the Docker memory, especially with pipelines running in parallel. Creating a custom Docker service that reserves memory for the sonarcloud-scan step was the only way we got consistently passing pipelines. With this solution we did not need to increase the default Docker service for our other steps; a sketch of the configuration follows.
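Roughly the shape of what worked for us; the service name, memory sizes, and pipe version below are examples to adapt, not exact values from our setup:

```yaml
# bitbucket-pipelines.yml (sketch; names and sizes are examples)
definitions:
  services:
    docker-sonar:          # custom Docker service used only by the scan step
      type: docker
      memory: 5120         # MB reserved for the scan container

pipelines:
  default:
    - step:
        name: SonarCloud scan
        size: 2x           # doubles this step's total memory allowance
        services:
          - docker-sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:2.0.0
```

Steps that don't list `docker-sonar` keep the default `docker` service, so only the scan step pays for the extra memory.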
Please see the pipeline log for the Sonar step. Could someone help me? It's a project in React + TypeScript.
The problem started happening after I changed to version 2.0 of sonarScan. If you need it, the pipeline debug log is here: pipelineLog-{2a966f82-0248-450e-80d3-3e92abdee32e}.txt (12.2 MB)
Hi, my team is also experiencing the same issue. Our project uses Vue / TypeScript.
Increasing the Docker memory from 3072 to 4096 gives me the same error as above, and the pipelines don't run.
I've also tried the proposed solutions, but they didn't help.
Is there an internal reference/bug that we can track to see when this issue is resolved?
Increasing (doubling) the allocated memory in Bitbucket for the step seems like a workaround, not a fix. Not to mention it burns through the Bitbucket bill…
We have identified the issue for Vue projects, and we will disable type-checking for them. The next release will solve this issue for Vue and is expected in production in ~10 days. If you are facing this issue on non-Vue projects, we have not yet identified any major issue, aside from an expected increase in memory usage from TypeScript itself.
I was facing the same issue when running this on Bitbucket Pipelines, but only for a JS/TS project; running it locally worked. My workaround to get it working on Bitbucket Pipelines was to stop using the pipe and use the sonar-scanner-cli image instead, roughly like the sketch below:
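The step name and values are placeholders; `SONAR_TOKEN` is assumed to be a secured repository variable, and the project key/organization are assumed to come from `sonar-project.properties`:

```yaml
# bitbucket-pipelines.yml (sketch): run the scanner from the CLI image
# instead of the sonarcloud-scan pipe.
pipelines:
  default:
    - step:
        name: SonarCloud scan (CLI image)
        image: sonarsource/sonar-scanner-cli:5
        script:
          - sonar-scanner -Dsonar.host.url=https://sonarcloud.io -Dsonar.token="${SONAR_TOKEN}"
```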
The `sonarsource/sonar-scanner-cli:5` image doesn't solve it for me.
Still getting the same `java.lang.IllegalStateException: The bridge server is unresponsive` error.
@Colin @victor.diez We are also suddenly facing the same issue; we run the Sonar scanner through a GitLab pipeline. Will increasing the memory as suggested above fix the issue? The sketch below is roughly where we would apply the settings.
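The job name and values here are guesses on our side, not a confirmed fix; the `entrypoint: [""]` override is the usual pattern for running this image in GitLab CI:

```yaml
# .gitlab-ci.yml (sketch; job name and values are examples)
sonarcloud-check:
  image:
    name: sonarsource/sonar-scanner-cli:5
    entrypoint: [""]
  variables:
    SONAR_SCANNER_OPTS: "-Xmx1024m"   # scanner JVM heap
  script:
    - sonar-scanner -Dsonar.javascript.node.maxspace=4096   # MB for the Node.js bridge
```

Note that the actual memory ceiling comes from the runner machine/executor, not from the YAML, so these knobs only help if the runner has headroom.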