Scanner command used, where applicable (only the params that don’t contain our internal IDs are shown):
SONAR_SCANNER_OPTS=-Xms2G -Xmx16G
-X for debug output
We run sonar-scanner in Docker.
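A minimal sketch of what such a containerized scan might look like, assuming the official sonarsource/sonar-scanner-cli image and a mount of the current repo (the thread does not show the actual image or mounts, so treat these as placeholders):

```shell
# SONAR_SCANNER_OPTS sets the scanner's own JVM heap, matching the
# options quoted above.
export SONAR_SCANNER_OPTS='-Xms2G -Xmx16G'

# Hypothetical wrapper; image name and mount path are assumptions.
scan() {
  docker run --rm \
    -e SONAR_SCANNER_OPTS \
    -v "$(pwd):/usr/src" \
    sonarsource/sonar-scanner-cli -X   # -X = debug output
}
# 'scan' is defined but not invoked here; calling it would run the
# analysis against the mounted repository.
```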
Languages of the repository: JavaScript, TypeScript
Error observed:
Exit code 143 (128 + SIGTERM) from the Docker container.
Project is ~1M SLOC.
Last logs in the sonar-scanner container:
11:06:08.262 DEBUG Cache entry created for key 'js:filemetadata:11.0.0.33655:<redacted>'
11:06:08.263 DEBUG Accepted file: <redacted>
After that, the scanner stays alive for around 2 hours and then dies with exit code 143.
The issue only happens on the main branch; PR analyses work fine for whatever reason.
I’ve tried bumping -Xmx to no avail. I really have no idea how to proceed, because the debug output gave me nothing.
They are empty because we always pass them, even outside PR analysis (the values are only populated for PRs). We also do not set a branch in any way, shape, or form beyond that. However, things worked fine in the past. Could this be related?
We are seeing the same issue with one of our JS projects, and we don’t set these sonar.pullrequest.* parameters.
From my side, I can only add that we saw the analysis fail on a project with 15k LOC on July 9th, while we had a successful analysis before, on July 4th (with no changes in the code base or on the CI).
Could you provide your debug (-Dsonar.verbose=true on the analysis command line) analysis logs, redacted as necessary?
The analysis / scanner log is what’s output from the analysis command. Hopefully, the log you provide - redacted as necessary - will include that command as well.
I have created a redacted and shortened version of the -X/verbose output that also includes the last log lines Roman posted.
I also enabled NODE debug logs using DEBUG=* when calling the sonar-scanner. That reveals some more logs of what happens after the main sonar-scanner thread gets stuck.
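The debug switches mentioned in this thread can be combined in one invocation. A sketch only; it assumes sonar-scanner is on the PATH and is run from the project root:

```shell
# Hypothetical wrapper: full debug on both sides of the analysis.
# -X and -Dsonar.verbose=true are the scanner's debug switches;
# DEBUG='*' enables the Node 'debug' package logging in the
# analyzer's Node.js process.
run_scan() {
  DEBUG='*' sonar-scanner -X -Dsonar.verbose=true "$@"
}
# Not invoked here; extra properties could be appended, e.g.
# run_scan -Dsonar.projectKey=my-project
type run_scan
```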
This seems to be broken since SonarJS release 11.0.0.33655, and we had to temporarily disable the Sonar analysis to unblock our CI jobs, so I hope this gets resolved soon.
I have been testing more things and wanted to give an update: we only see the analysis failing if we include test files in it. And the last ‘Accepted file’ before it gets stuck was always a *.test.js file.
Sadly, no change. I also tried with 16 GB. I do see the new lines:
15:01:08.147 INFO Configured Node.js --max-old-space-size=8192.
15:01:08.147 INFO Using embedded Node.js runtime.
...
15:01:08.845 INFO Memory configuration: OS (36864 MB), Node.js (8240 MB).
Would it be possible to access the full debug logs? Or is it a private repo and you prefer not to share all of them? In that case, would it be possible to share them privately?
Can you please try setting the Sonar property sonar.javascript.node.maxspace=8192?
Victor, we’ve had this param set in our sonar-project.properties file for ~3 years. Sorry for not indicating this earlier.
Here’s redacted sonar-project.properties:
# must be unique in a given SonarQube instance
sonar.projectKey=<company>
sonar.sources=.
sonar.tests=.
sonar.exclusions=<redacted>
sonar.inclusions=<redacted>
sonar.test.inclusions=<redacted>
sonar.test.exclusions=<redacted>
sonar.scm.exclusions.disabled=true
sonar.javascript.exclusions=''
sonar.javascript.lcov.reportPaths=backend/coverage/report/lcov.info,packages/**/coverage/report/lcov.info
sonar.javascript.node.maxspace=8192
Would it be possible to access the full debug logs? Or is it a private repo and you prefer not to share all of them? In that case, would it be possible to share them privately?
Not a problem for me at all to do this privately. Going through stuff for public upload would be tough.
Unfortunately, removing this line from the properties did not help. We’re still hitting the same issue, and the last logs are quite similar to the previous failures. I’ll provide the latest logs in a PM.
Thanks @victor.diez for coming back to our issue with this.
I re-checked, and we still run into the same issue: ‘Accepted file: …’ and then it gets stuck.
But I think I might have found the cause in our project.
We have a test_helper directory that is symlinked into other test directories. We usually symlink the whole directory, which is then ignored by sonar-scanner (it does not show up in the Code section).
But in one test directory we had symlinked only a few individual test_helper/* files.
When I symlinked the whole test_helper directory instead, the analysis went through again.
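The two symlink layouts described above can be reconstructed in a few lines; all directory and file names here are made up for illustration:

```shell
rm -rf demo
mkdir -p demo/test_helper demo/suite_a demo/suite_b
touch demo/test_helper/setup.test.js

# Layout that hung: individual helper files symlinked into a test directory.
ln -s ../test_helper/setup.test.js demo/suite_a/setup.test.js

# Layout that worked: the whole helper directory symlinked once.
ln -s ../test_helper demo/suite_b/test_helper

ls -l demo/suite_a demo/suite_b
```

In the first layout the scanner sees the helper through the link as a regular test file; in the second, the linked directory as a whole is skipped, which is the behavior reported above.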
I will try to reproduce this issue with an example project.
@roman-belkov does your project use symlinked files too?
Sadly, I have been unable to create a smaller example project that reproduces the issue.
For now we are unblocked: the full project scans successfully again since we changed the individually symlinked files into a single symlinked directory that Sonar ignores.
If I come across a sample project with the same issue, I will report back.
I have been able to pin this issue down to a single JavaScript rule and could isolate the lines of code that trigger it.
The issue only appears when javascript:S2699 (“Tests should include assertions”) is enabled.
I have run my example and the full project against the “Sonar way” Quality Profile with javascript:S2699 deactivated. With that change, the analysis goes through without issue (@roman-belkov, are you able to reproduce that?).
The code pattern that triggers this behavior seems to be related to process.env.VARIABLE usage in test files where supertest is used.
Analysing just this file causes the analysis to get stuck:
"use strict";

// Without these imports javascript:S2699 seems to be ignored,
// even though this example never uses them.
const superagentDefaults = require("superagent-defaults");
const supertest = require("supertest");

// This line doesn't have to be in this particular file; it causes the issue either way.
process.env.VARIABLE = "some-token";

describe("fail", () => {
  it("should cause issues", () => {
    const test_input = process.env.OTHER_VARIABLE.substring(0, 6); // any process.env var works here
  });
});