I'm using PhpStorm with the SonarLint plugin. It works with PHP files, but when I try to analyze any TypeScript file in my Angular project (a large one), I always get this error:
<--- Last few GCs --->
[91167:0x130008000] 35136 ms: Mark-sweep 4046.4 (4129.0) -> 4041.8 (4142.5) MB, 627.9 / 0.0 ms (average mu = 0.744, current mu = 0.158) allocation failure; scavenge might not succeed
[91167:0x130008000] 36317 ms: Mark-sweep 4051.7 (4143.6) -> 4048.4 (4147.8) MB, 1125.6 / 0.0 ms (average mu = 0.500, current mu = 0.047) allocation failure; scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
1: 0x100c88794 node::Abort() [/Users/user.name/.nvm/versions/node/v18.13.0/bin/node]
...
...
Operating system:
MacOS Ventura 13.2.1
SonarLint plugin version:
8.1.0.65508
Programming language you’re coding in:
Typescript (Angular project)
Is connected mode used:
Connected to SonarCloud or SonarQube (and which version):
The same error occurs both with no connection and when connected to a SonarQube server, Enterprise Edition Version 9.8 (build 63668)
The log shows Node.js running out of memory at around 4 GB (the default heap limit for your Node.js version); apparently this project needs more. This is independent of the JVM memory limits and settings.
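As a quick check (assuming Node.js is on the PATH), you can print the effective V8 heap limit with Node's built-in v8 module; on stock 64-bit Node 18 it is roughly the ~4 GB seen in the crash log:

```shell
# Print the current Node.js heap limit in MB (built-in v8 module, no dependencies).
node -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
```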
Node.js limits its heap memory by default. Assuming your machine has enough memory, you need to tell the scanner to allocate more in this case.
The scanner provides a parameter, sonar.javascript.node.maxspace, to allow more memory use:
You can use the sonar.javascript.node.maxspace property to allow the analysis to use more memory. Set this property to 4096 or 8192 for big projects, either in the sonar-project.properties file or on the scanner command line (with -Dsonar.javascript.node.maxspace=4096).
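For example, a minimal sketch of both ways of setting the property (the 8192 MB value is illustrative; pick a value that fits the machine's RAM):

```shell
# Option 1: add the property to sonar-project.properties in the project root:
#   sonar.javascript.node.maxspace=8192
#
# Option 2: pass it directly on the scanner command line (value in MB):
sonar-scanner -Dsonar.javascript.node.maxspace=8192
```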
It's a very large multi-repo Angular project. The main repo has about 4000 .ts files (130 MB). We also use ESLint, and it needs the same Node.js memory allocation to work properly.
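For comparison, the usual way to give ESLint's Node.js process a larger heap is the NODE_OPTIONS environment variable (the 8192 MB value and the src glob are illustrative, assuming an existing ESLint setup):

```shell
# Raise the V8 heap limit to 8 GB for this ESLint run only (--max-old-space-size is in MB).
NODE_OPTIONS="--max-old-space-size=8192" npx eslint "src/**/*.ts"
```

Setting it per invocation (rather than exporting it globally) keeps the larger limit scoped to the one process that needs it.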