We are experiencing a timeout issue in one of our projects with the SonarQube Cloud Analyze step. No changes have been made recently to our pipeline or to any code surrounding it. The analysis appears to hang at the JavaScript/TypeScript analysis step.
Relevant lines in azure-pipelines.yml:

- task: SonarCloudAnalyze@3
  inputs:
    jdkversion: 'JAVA_HOME_17_X64'
Relevant logs (note the timestamps near the bottom: the job sits idle after finding the tsconfig files until the build hits its 60-minute timeout and is canceled):
2026-03-02T14:24:26.9561704Z INFO: Sensor JavaScript/TypeScript analysis [javascript]
2026-03-02T14:24:27.5523057Z INFO: Detected os: Windows Server 2025 arch: amd64 alpine: false. Platform: WIN_X64
2026-03-02T14:24:27.5529483Z INFO: Deploy location C:\Users\VssAdministrator\.sonar\js\node-runtime, tagetRuntime: C:\Users\VssAdministrator\.sonar\js\node-runtime\node.exe, version: C:\Users\VssAdministrator\.sonar\js\node-runtime\version.txt
2026-03-02T14:24:30.2442478Z INFO: Using embedded Node.js runtime.
2026-03-02T14:24:30.2444187Z INFO: Using Node.js executable: 'C:\Users\VssAdministrator\.sonar\js\node-runtime\node.exe'.
2026-03-02T14:24:32.8493916Z INFO: Memory configuration: OS (8186 MB), Node.js (2240 MB).
2026-03-02T14:24:33.0227464Z INFO: WebSocket client connected on /ws
2026-03-02T14:24:33.0230003Z INFO: Plugin version: [12.0.0.38664]
2026-03-02T14:24:36.1146875Z INFO: Some of the project files were automatically excluded because they looked like generated code. Enable debug logging to see which files were excluded. You can disable bundle detection by setting sonar.javascript.detectBundles=false
2026-03-02T14:24:36.4280765Z INFO: Found 2 tsconfig.json file(s): [D:/a/1/s/{{ProjectName}}/ClientApp/tsconfig.json, D:/a/1/s/{{ProjectName}}/bin/Release/net8.0/ClientApp/tsconfig.json]
2026-03-02T15:15:30.8098802Z ##[error]The Operation will be canceled. The next steps may not contain expected logs.
2026-03-02T15:15:31.1151471Z ##[error]The operation was canceled.
2026-03-02T15:15:31.1158528Z ##[section]Finishing: SonarCloudAnalyze
I suspect that some of the code is being analyzed twice: the logs show the module being indexed twice, and files under our /bin/ folder are picked up (for example, the second tsconfig.json above, under bin/Release) even though we have an exclusion for **/bin/**.
2026-03-02T14:24:14.1846790Z INFO: Indexing files of module '{{ProjectName}}'
2026-03-02T14:24:14.1853449Z INFO: Base dir: D:\a\1\s\MarketingSchools\mySchool
2026-03-02T14:24:14.4318895Z INFO: Source paths: appsettings.json, appsettings.PROD.json, appsettings.QA.json,...
2026-03-02T14:24:14.4324640Z INFO: Included sources: {{ProjectName}}/**
2026-03-02T14:24:14.4327397Z INFO: Excluded sources: **/bin/**, **/obj/**, **/wwwroot/**, **/node_modules/**, **/bin/Release/net8.0/ClientApp/src/app/**, **/angular/cache/19.2.0/babel-webpack/**, **/build-wrapper-dump.json
2026-03-02T14:24:14.4328308Z INFO: Excluded sources for coverage: *
2026-03-02T14:24:14.4401828Z WARN: Specifying module-relative paths at project level in the property 'sonar.coverage.exclusions' is deprecated. To continue matching files like '{{ProjectName}}/appsettings.json', update this property so that patterns refer to project-relative paths.
2026-03-02T14:24:14.6872952Z INFO: Indexing files of module '{{ProjectName}}'
2026-03-02T14:24:14.6873880Z INFO: Base dir: D:\a\1\s
2026-03-02T14:24:14.6875040Z INFO: Source paths: {{AnotherProjectName}}/azure-pipelines.yml, MarketingSchools/azure...
2026-03-02T14:24:14.6876225Z INFO: Test paths: {{ProjectName}}/bin/Release/net8.0/ClientApp/src/app/...
2026-03-02T14:24:14.6886866Z INFO: Included sources: {{ProjectName}}/**
2026-03-02T14:24:14.6887642Z INFO: Excluded sources: **/bin/**, **/obj/**, **/wwwroot/**, **/node_modules/**, **/bin/Release/net8.0/ClientApp/src/app/**, **/angular/cache/19.2.0/babel-webpack/**, **/build-wrapper-dump.json
2026-03-02T14:24:14.6888345Z INFO: Excluded sources for coverage: *
2026-03-02T14:24:14.6904095Z INFO: 1272 files indexed (done) | time=693ms
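One workaround we are considering, since the analyzer finds two tsconfig.json files and the one under bin/Release is a build artifact, is pointing the TypeScript analyzer at only the source tsconfig. This is just a sketch: sonar.typescript.tsconfigPaths is a documented SonarJS property, but the task inputs shown here are abbreviated and the path is our guess at the right value.

```yaml
# Sketch only -- assumes the SonarCloudPrepare@3 task from the SonarCloud
# Azure DevOps extension; other required inputs (organization, projectKey,
# etc.) are omitted here for brevity.
- task: SonarCloudPrepare@3
  inputs:
    extraProperties: |
      # Analyze only against the source tsconfig, not the copy in bin/Release
      sonar.typescript.tsconfigPaths={{ProjectName}}/ClientApp/tsconfig.json
```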
Any suggestions on how to fix this timeout? It currently affects only one of the several projects we analyze with SonarQube Cloud, and nothing has changed on our end that we can tie to it. We started experiencing the issue earlier this week.
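For reference, one diagnostic step we could try is re-running the analysis with debug logging enabled, so the scanner reports which files were auto-excluded as generated code (as the INFO line about bundle detection suggests). A sketch, assuming the same Prepare task; sonar.verbose and sonar.javascript.detectBundles are documented scanner properties, the rest is abbreviated:

```yaml
# Sketch: enable debug output from the analyzers via the Prepare step.
- task: SonarCloudPrepare@3
  inputs:
    extraProperties: |
      sonar.verbose=true
      # Optionally rule out bundle detection as a factor (per the log hint):
      sonar.javascript.detectBundles=false
```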