Heap size limit

Hello.

The sonarsource/sonarcloud-github-action@v3
is failing
with the following heap size limit error:

14:21:51.534 ERROR The analysis will stop due to the Node.js process running out of memory (heap size limit 6192 MB)
14:21:51.535 ERROR You can see how Node.js heap usage evolves during analysis with "sonar.javascript.node.debugMemory=true"
14:21:51.535 ERROR Try setting "sonar.javascript.node.maxspace" to a higher value to increase Node.js heap size limit
14:21:51.535 ERROR If the problem persists, please report the issue at https://community.sonarsource.com
14:21:51.554 INFO Hit the cache for 0 out of 81
14:21:51.553 ERROR Failure during analysis
java.lang.IllegalStateException: The bridge server is unresponsive

I tried setting sonar.javascript.node.maxspace=7168, but it's still failing.

The runner is ubuntu-latest, and we use Node.js 20.

Any ideas?

It probably depends on the runner you use:

The Linux runners for public repos get 16 GB of RAM, but for private repos it's only 7 GB. So if your Node.js analysis needs more than 5-6 GB to work, you will be in trouble.

GitHub Actions provides no memory profiling, but something you could try is running top in batch mode while the scanner is running. Write the output of top to a top.log file and archive that file as an artifact. This will give you a lot of insight into the processes actually running alongside the scanner, and into what is consuming RSS — though not into what is consuming cache and kernel memory (which also limit RSS).
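As a minimal sketch of that sampling idea (assuming a Linux runner with a procps-style top; the scanner invocation is a placeholder you would replace with your real step):

```shell
#!/bin/sh
# Sample process/memory usage every 5 seconds while the scanner runs.
# Each `top -b -n 1` snapshot is appended to top.log.
( while true; do top -b -n 1; sleep 5; done ) > top.log &
SAMPLER_PID=$!

# Placeholder for the real scanner invocation, e.g.:
# sonar-scanner -Dsonar.projectKey=...
sleep 1

# Stop the sampler once the scan is done.
kill "$SAMPLER_PID"
# top.log can then be archived with actions/upload-artifact.
```

You would run this as a single step (or split the sampler start/stop around the scan step) and upload top.log as a build artifact afterwards.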

If your repo only has Node.js code to be analyzed, I would look at limiting the Java heap to maximize the amount of RAM available to Node.js.

For our own calls to the scanner we do this (not the GH Action way, but it gives you an example of the settings you can use):

export SONAR_SCANNER_JAVA_OPTS='-XX:MinRAMPercentage=60 -XX:MaxRAMPercentage=60 -XX:+UseCompressedClassPointers -XshowSettings:vm'

This lets you tune the RAM used by the Java (not JavaScript) process. You already tried tuning sonar.javascript.node.maxspace. If you could share your GH workflow YAML, or at least the section where you pass that parameter, others could review whether the parameter is actually being picked up.
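For reference, I believe the action also forwards an `args` input to the scanner, so passing the property could look something like this (a sketch, not tested on my side):

```yaml
- name: SonarCloud Scan
  uses: sonarsource/sonarcloud-github-action@v3
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
  with:
    args: >
      -Dsonar.javascript.node.maxspace=7168
```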

If you are on private repos and 7 GB of RAM is not enough, you will have to either pay for larger runners (the max is 32 GB of RAM, IIRC) or set up your own infrastructure for your runners.
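Switching to a larger GitHub-hosted runner is just a `runs-on` change; the label below is only an example, since larger-runner labels depend on what your organization has configured:

```yaml
jobs:
  sonarcloud:
    # Example label for a larger GitHub-hosted runner; the actual
    # label depends on your organization's runner configuration.
    runs-on: ubuntu-latest-8-cores
```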

Hello Stephane,

Thanks for looking into this.
I tried with your suggested settings:

  - name: SonarCloud Scan
    uses: sonarsource/sonarcloud-github-action@v3
    env:
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
      SONAR_EXCLUSIONS: "**/*.spec.js,azure-functions/node_modules/,src/tenants/**"
      SONAR_SCANNER_JAVA_OPTS: '-XX:MinRAMPercentage=60 -XX:MaxRAMPercentage=60 -XX:+UseCompressedClassPointers -XshowSettings:vm'


but got the same result:


Here's the relevant part of the GH log:

19:44:23.910 INFO Scanner configuration file: /opt/sonar-scanner/conf/sonar-scanner.properties
19:44:23.914 INFO Project root configuration file: /github/workspace/sonar-project.properties
19:44:23.931 INFO SonarScanner CLI 6.1.0.4477
19:44:23.932 INFO Java 17.0.11 Eclipse Adoptium (64-bit)
19:44:23.933 INFO Linux 6.5.0-1025-azure amd64
19:44:23.956 INFO User cache: /opt/sonar-scanner/.sonar/cache
19:44:24.841 INFO JRE provisioning: os[linux], arch[x86_64]
19:44:29.184 INFO Communicating with SonarCloud
19:44:29.216 ERROR [stderr] VM settings:
19:44:29.218 ERROR [stderr] Max. Heap Size (Estimated): 4.65G
19:44:29.218 ERROR [stderr] Using VM: OpenJDK 64-Bit Server VM
19:44:29.219 ERROR [stderr]
19:44:29.487 INFO Starting SonarScanner Engine...
19:44:29.488 INFO Java 17.0.11 Eclipse Adoptium (64-bit)
19:44:30.665 INFO Load global settings
19:44:31.679 INFO Load global settings (done) | time=1015ms

.....

19:53:44.685 INFO 90/170 files analyzed, current file: src/tenants/RegisterTenantResolver.js
19:53:50.639 ERROR Failure during analysis
java.lang.IllegalStateException: The bridge server is unresponsive

......

Caused by: java.net.http.HttpTimeoutException: request timed out
  at java.net.http/jdk.internal.net.http.HttpClientImpl.send(Unknown Source)
  at java.net.http/jdk.internal.net.http.HttpClientFacade.send(Unknown Source)
  at org.sonar.plugins.javascript.bridge.BridgeServerImpl.request(BridgeServerImpl.java:396)
  ... 28 common frames omitted

19:53:50.706 INFO  Hit the cache for 0 out of 90
19:53:52.494 INFO  Miss the cache for 90 out of 90: ANALYSIS_MODE_INELIGIBLE [90/90]
Error: The operation was canceled.

I'm starting to think I need a runner with more memory, but that's the only project that has this issue.

-David

Using my example as-is gave the Java process 60% of the memory, which is not what you need. However, it gave us this interesting insight from the logs:

Max. Heap Size (Estimated): 4.65G

That means the runner has about 7 GB of RAM max. In your prior run it looks like Node.js was able to get about 6 GB of RAM out of the worker node, with about 1 GB used by the Sonar scanner. So it is pretty clear that your repo needs more than 6 GB for the Node.js analysis.

In your case you would probably want something like this:

SONAR_SCANNER_JAVA_OPTS: '-XX:MaxRAM=0.5G -XX:+UseCompressedClassPointers -XshowSettings:vm -Dsonar.javascript.node.maxspace=6400'

Note that I'm neither a Java nor a JavaScript expert. I believe you can do -XX:MaxRAM=0.5G, but the flag might only accept integers, in which case you would have to use -XX:MaxRAM=512M.

But if the problem is that the Node.js process needs more memory than the runner physically has, then you will have to update your CI with something that can provide that memory.

Personally, I avoid GitHub Actions as much as possible for private repositories: it is very expensive at scale and has proven rather unstable (see their status history for the past year). We run our internal CI pipelines in Jenkins on AWS EKS, which fully autoscales up and down based on load with the help of Karpenter. To be honest, it is not a trivial thing to set up.

Others might be able to help you get the Sonar analysis to run within a 7 GB worker. One potential option is to not run the Sonar scanner yourself and let SonarCloud do the analysis for you. You won't get code coverage reports, but you should get static analysis.

Maybe a SonarCloud representative can give you other tips on how to get your Node.js project analyzed.


Hello @David-M-Baxter,

Sorry for the late response.

Can you share some details about your project? Is it a monorepo? Do you have any tsconfig.json files? How many?

Cheers,
Victor