ERROR Error during SonarScanner Engine execution

Hi there,

Hoping you can assist. I'm falling at the first hurdle of SonarCloud setup.

  • ALM used: Bitbucket Cloud
  • CI system used: Bitbucket Cloud
  • Scanner command used, when applicable: (private details masked)
  • Languages of the repository: JS and PHP
  • Error observed: debug errors on the merge in Bitbucket
  • Steps to reproduce: trying to push the YAML files for the initial setup, but we can't merge
  • Merge error: ERROR Error during SonarScanner Engine execution
06:50:00.977 INFO  Preprocessing files...
06:50:01.251 ERROR Error during SonarScanner Engine execution
java.lang.IllegalStateException: Failed to preprocess files
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.processModuleSources(ProjectFilePreprocessor.java:159)
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.processModule(ProjectFilePreprocessor.java:135)
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.processModulesRecursively(ProjectFilePreprocessor.java:123)
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.execute(ProjectFilePreprocessor.java:88)
  at org.sonar.scanner.bootstrap.ScannerContainer.doAfterStart(ScannerContainer.java:411)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:128)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.ScannerMain.runScannerEngine(ScannerMain.java:135)
  at org.sonar.scanner.bootstrap.ScannerMain.run(ScannerMain.java:52)
  at org.sonar.scanner.bootstrap.ScannerMain.main(ScannerMain.java:38)
Caused by: java.nio.file.AccessDeniedException: /opt/atlassian/pipelines/agent/build/storage/framework/testing/disks
  ....
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.processDirectory(ProjectFilePreprocessor.java:167)
  at org.sonar.scanner.scan.filesystem.ProjectFilePreprocessor.processModuleSources(ProjectFilePreprocessor.java:152)
  ... 12 common frames omitted

Let me know if I'm missing anything.

Thanks for any help.

Hey there.

Can you share your Bitbucket Pipelines YML file?

Hi Colin,

Here it is below. I've manually added the SonarCloud pipes here, but they're not in our actual file, since we can't merge without hitting the error.

image: misterio92/ci-php-node:6.0

pipelines:
  default:
    - step:
        name: Build And Test
        services:
          - mysql
        script:
          - composer install
          - npm install
          - npm run dev
          - export APP_ENV=testing DB_PASSWORD=$MYSQL_TEST_PASSWORD DB_USERNAME=root
          - cp .env.example .env
          - php artisan key:generate
          - npm test
          - pipe: sonarsource/sonarcloud-scan:3.0.0
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.6
          - vendor/bin/phpunit --log-junit ./test-reports/junit.xml
        caches:
          - composer-tmp #cache is defined below in the definitions section

  branches:
    master:
      - step:
          name: Run Test
          script:
            - composer install
            - npm install
            - npm run dev
            - export APP_ENV=testing DB_PASSWORD=$MYSQL_TEST_PASSWORD DB_USERNAME=root
            - cp .env.example .env
            - php artisan key:generate
            - npm test
            - vendor/bin/phpunit --log-junit ./test-reports/junit.xml
          caches:
            - composer-tmp #cache is defined below in the definitions section
      - step:
          name: Deploy To Production
          deployment: Production
          script:
            - apt-get update
            - apt-get install -y zip
            - apt-get update && apt-get install -y python3-pip
            - pip3 install awscli
            - apt-get install -y jq
            - composer install
            - npm install
            - cp .env.live .env
            - php artisan key:generate
            - php artisan storage:link
            - php artisan cache:clear
            - php artisan route:clear
            - php artisan config:clear
            - npm run prod
            - rm -R node_modules
            - zip -r gatheroo.zip .
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'upload'
                ZIP_FILE: 'gatheroo.zip'
                VERSION_LABEL: 'gatheroo-1.0.0'
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                DEPLOYMENT_GROUP: $PRODUCTION_DEPLOYMENT_GROUP
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'deploy'
                WAIT: 'true'
                VERSION_LABEL: 'gatheroo-1.0.0'
                IGNORE_APPLICATION_STOP_FAILURES: 'true'
                FILE_EXISTS_BEHAVIOR: 'OVERWRITE'
            - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
            - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
            - aws configure set default.region $AWS_DEFAULT_REGION
            - aws cloudfront create-invalidation --distribution-id $PRODUCTION_CLOUDFRONT_DISTRIBUTION_ID --paths "/*"
      - step:
          name: Update Permission
          script:
            - chmod -R 775 storage
    preflight:
      - step:
          name: Run Test
          script:
            - composer install
            - npm install
            - npm run dev
            - export APP_ENV=testing DB_PASSWORD=$MYSQL_TEST_PASSWORD DB_USERNAME=root
            - cp .env.example .env
            - php artisan key:generate
            - npm test
            - vendor/bin/phpunit --log-junit ./test-reports/junit.xml
          caches:
            - composer-tmp #cache is defined below in the definitions section
      - step:
          name: Deploy To Preflight
          deployment: Preflight
          script:
            - apt-get update
            - apt-get install -y zip
            - apt-get update && apt-get install -y python3-pip
            - pip3 install awscli
            - apt-get install -y jq
            - composer install
            - npm install
            - cp .env.preflight .env
            - php artisan key:generate
            - php artisan storage:link
            - php artisan cache:clear
            - php artisan route:clear
            - php artisan config:clear
            - npm run prod
            - rm -R node_modules
            - zip -r gatheroo.zip .
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'upload'
                ZIP_FILE: 'gatheroo.zip'
                VERSION_LABEL: 'gatheroo-preflight-1.0.0'
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                DEPLOYMENT_GROUP: $PREFLIGHT_DEPLOYMENT_GROUP
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'deploy'
                WAIT: 'true'
                VERSION_LABEL: 'gatheroo-preflight-1.0.0'
                IGNORE_APPLICATION_STOP_FAILURES: 'true'
                FILE_EXISTS_BEHAVIOR: 'OVERWRITE'
            - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
            - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
            - aws configure set default.region $AWS_DEFAULT_REGION
            - aws cloudfront create-invalidation --distribution-id $PREFLIGHT_CLOUDFRONT_DISTRIBUTION_ID --paths "/*"
      - step:
          name: Update Permission
          script:
            - chmod -R 777 .
            - chmod -R 775 .
    staging:
      - step:
          name: Run Test
          script:
            - composer install
            - npm install
            - export APP_ENV=testing DB_PASSWORD=$MYSQL_TEST_PASSWORD DB_USERNAME=root
            - cp .env.example .env
            - php artisan key:generate
            - npm test
            - vendor/bin/phpunit --log-junit ./test-reports/junit.xml
          caches:
            - composer-tmp #cache is defined below in the definitions section
      - step:
          name: Deploy To Staging
          deployment: Staging
          script:
            - apt-get update
            - apt-get install -y zip
            - apt-get update && apt-get install -y python3-pip
            - pip3 install awscli
            - apt-get install -y jq
            - composer install
            - npm install
            - cp .env.staging .env
            - php artisan key:generate
            - php artisan storage:link
            - php artisan cache:clear
            - php artisan route:clear
            - php artisan config:clear
            - npm run prod
            - rm -R node_modules
            - zip -r gatheroo.zip .
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'upload'
                ZIP_FILE: 'gatheroo.zip'
                VERSION_LABEL: 'gatheroo-staging-1.0.0'
            - pipe: atlassian/aws-code-deploy:0.2.5
              variables:
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                APPLICATION_NAME: $APPLICATION_NAME
                DEPLOYMENT_GROUP: $STAGING_DEPLOYMENT_GROUP
                S3_BUCKET: $S3_BUCKET
                COMMAND: 'deploy'
                WAIT: 'true'
                VERSION_LABEL: 'gatheroo-staging-1.0.0'
                IGNORE_APPLICATION_STOP_FAILURES: 'true'
                FILE_EXISTS_BEHAVIOR: 'OVERWRITE'
            - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
            - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
            - aws configure set default.region $AWS_DEFAULT_REGION
            - aws cloudfront create-invalidation --distribution-id $STAGING_CLOUDFRONT_DISTRIBUTION_ID --paths "/*"
      - step:
          name: Update Permission
          script:
            - chmod -R 775 storage
definitions:
  caches:
    composer-tmp: /tmp
  services:
    mysql:
      image: mariadb:10.2
      variables:
        MYSQL_ALLOW_EMPTY_PASSWORD: 1
        MYSQL_DATABASE: gatheroo_test
        MYSQL_ROOT_PASSWORD: $MYSQL_TEST_PASSWORD

clone:
  depth: full

Thanks. I’ve flagged this for some expert eyes. In the meantime, I’d be curious whether moving the pipes further up (before running composer install) temporarily fixes the issue. I understand that’s not a final solution if you want to get test/coverage data in there, but it would at least help isolate some variables.
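For illustration, here's a minimal sketch of what that reordering might look like in the default step from the file above. This is a diagnostic experiment, not a final config: running the pipes before composer/npm means the scanner runs before vendor/, node_modules/, and Laravel's runtime storage/ paths (like the one in the AccessDeniedException) exist in the build directory.

pipelines:
  default:
    - step:
        name: Build And Test
        services:
          - mysql
        script:
          # Pipes moved to the top, before any dependencies are installed
          - pipe: sonarsource/sonarcloud-scan:3.0.0
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.6
          - composer install
          - npm install
          - npm run dev
          # ...rest of the original script unchanged...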

Hi Colin,

We can now merge the YML file, but we’re getting the following error: “The bridge server is unresponsive”.

We had this issue with the self-hosted approach, and it’s the reason we switched to SonarCloud, which was supposed to be a pain-free option.

Here’s a copy of the logs if it helps.

java.lang.IllegalStateException: The bridge server is unresponsive
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:415)
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.analyzeJavaScript(BridgeServerImpl.java:369)
  at org.sonar.plugins.javascript.analysis.AbstractAnalysis.analyzeFile(AbstractAnalysis.java:109)
  at org.sonar.plugins.javascript.analysis.AnalysisWithProgram.analyzeProgram(AnalysisWithProgram.java:137)
  at org.sonar.plugins.javascript.analysis.AnalysisWithProgram.analyzeFiles(AnalysisWithProgram.java:90)
  at org.sonar.plugins.javascript.analysis.JsTsSensor.analyzeFiles(JsTsSensor.java:128)
  at org.sonar.plugins.javascript.analysis.AbstractBridgeSensor.execute(AbstractBridgeSensor.java:77)
  at org.sonar.scanner.sensor.AbstractSensorWrapper.analyse(AbstractSensorWrapper.java:63)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:75)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.lambda$execute$1(ModuleSensorsExecutor.java:48)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.withModuleStrategy(ModuleSensorsExecutor.java:66)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:48)
  at org.sonar.scanner.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:64)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.scan.ProjectScanContainer.scan(ProjectScanContainer.java:192)
  at org.sonar.scanner.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:188)
  at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:159)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.ScannerContainer.doAfterStart(ScannerContainer.java:416)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:128)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.ScannerMain.runScannerEngine(ScannerMain.java:135)
  at org.sonar.scanner.bootstrap.ScannerMain.run(ScannerMain.java:52)
  at org.sonar.scanner.bootstrap.ScannerMain.main(ScannerMain.java:38)
Caused by: java.io.IOException: HTTP/1.1 header parser received no bytes
  at java.net.http/jdk.internal.net.http.HttpClientImpl.send(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientFacade.send(Unknown Source)
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:406)
  ... 28 common frames omitted
Caused by: java.io.IOException: HTTP/1.1 header parser received no bytes
  at java.net.http/jdk.internal.net.http.common.Utils.wrapWithExtraDetail(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1Response$HeadersReader.onReadError(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver.checkForErrors(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver.flush(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$LockingRestartableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$CompleteRestartableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$SchedulableTask.run(Unknown Source)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
  at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.EOFException: EOF reached while reading
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver$Http1TubeSubscriber.onComplete(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$ReadSubscription.signalCompletion(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.read(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$SocketFlowTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$SchedulableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.signalReadable(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$ReadEvent.signalEvent(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$SocketFlowEvent.handle(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.handleEvent(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.lambda$run$3(Unknown Source)
  at java.base/java.util.ArrayList.forEach(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.run(Unknown Source)
06:06:01.067 INFO  Hit the cache for 0 out of 1
06:06:01.068 INFO  Miss the cache for 1 out of 1: ANALYSIS_MODE_INELIGIBLE [1/1]
06:06:01.805 ERROR Error during SonarScanner Engine execution
java.lang.IllegalStateException: Analysis of JS/TS files failed
  at org.sonar.plugins.javascript.analysis.AbstractBridgeSensor.execute(AbstractBridgeSensor.java:102)
  at org.sonar.scanner.sensor.AbstractSensorWrapper.analyse(AbstractSensorWrapper.java:63)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:75)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.lambda$execute$1(ModuleSensorsExecutor.java:48)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.withModuleStrategy(ModuleSensorsExecutor.java:66)
  at org.sonar.scanner.sensor.ModuleSensorsExecutor.execute(ModuleSensorsExecutor.java:48)
  at org.sonar.scanner.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:64)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.scan.ProjectScanContainer.scan(ProjectScanContainer.java:192)
  at org.sonar.scanner.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:188)
  at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:159)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.ScannerContainer.doAfterStart(ScannerContainer.java:416)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.GlobalContainer.doAfterStart(GlobalContainer.java:128)
  at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:123)
  at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:109)
  at org.sonar.scanner.bootstrap.ScannerMain.runScannerEngine(ScannerMain.java:135)
  at org.sonar.scanner.bootstrap.ScannerMain.run(ScannerMain.java:52)
  at org.sonar.scanner.bootstrap.ScannerMain.main(ScannerMain.java:38)
Caused by: java.lang.IllegalStateException: The bridge server is unresponsive
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:415)
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.analyzeJavaScript(BridgeServerImpl.java:369)
  at org.sonar.plugins.javascript.analysis.AbstractAnalysis.analyzeFile(AbstractAnalysis.java:109)
  at org.sonar.plugins.javascript.analysis.AnalysisWithProgram.analyzeProgram(AnalysisWithProgram.java:137)
  at org.sonar.plugins.javascript.analysis.AnalysisWithProgram.analyzeFiles(AnalysisWithProgram.java:90)
  at org.sonar.plugins.javascript.analysis.JsTsSensor.analyzeFiles(JsTsSensor.java:128)
  at org.sonar.plugins.javascript.analysis.AbstractBridgeSensor.execute(AbstractBridgeSensor.java:77)
  ... 22 common frames omitted
Caused by: java.io.IOException: HTTP/1.1 header parser received no bytes
  at java.net.http/jdk.internal.net.http.HttpClientImpl.send(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientFacade.send(Unknown Source)
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:406)
  ... 28 common frames omitted
Caused by: java.io.IOException: HTTP/1.1 header parser received no bytes
  at java.net.http/jdk.internal.net.http.common.Utils.wrapWithExtraDetail(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1Response$HeadersReader.onReadError(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver.checkForErrors(Unknown Source)
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver.flush(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$LockingRestartableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$CompleteRestartableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$SchedulableTask.run(Unknown Source)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
  at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.EOFException: EOF reached while reading
  at java.net.http/jdk.internal.net.http.Http1AsyncReceiver$Http1TubeSubscriber.onComplete(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$ReadSubscription.signalCompletion(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.read(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$SocketFlowTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler$SchedulableTask.run(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(Unknown Source)
  at java.net.http/jdk.internal.net.http.common.SequentialScheduler.runOrSchedule(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$InternalReadSubscription.signalReadable(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$InternalReadPublisher$ReadEvent.signalEvent(Unknown Source)
  at java.net.http/jdk.internal.net.http.SocketTube$SocketFlowEvent.handle(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.handleEvent(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.lambda$run$3(Unknown Source)
  at java.base/java.util.ArrayList.forEach(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientImpl$SelectorManager.run(Unknown Source)

SonarQube Cloud uses the same analysis engine as self-hosted SonarQube, so I wouldn’t expect migrating between them to solve this issue.

Typically, the latest issue you’ve hit is solved by bumping up the memory available to the pipeline (see here and here).
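As a sketch of what that memory bump might look like in your file: Bitbucket Pipelines lets you double a step's memory with size: 2x, and the sonarcloud-scan pipe accepts an EXTRA_ARGS variable for passing analysis properties. The sonar.javascript.node.maxspace value below is an illustrative assumption; it sets the heap (in MB) for the Node.js process behind the JS/TS bridge that is timing out here, and you may need to tune it to your runner.

pipelines:
  default:
    - step:
        name: Build And Test
        size: 2x  # doubles the step's memory allocation
        script:
          # ...build and test commands as before...
          - pipe: sonarsource/sonarcloud-scan:3.0.0
            variables:
              # Assumed values for illustration; adjust to your pipeline
              EXTRA_ARGS: '-Dsonar.javascript.node.maxspace=4096'
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.6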