When we analyse our Node app in a workflow, the process reports an out-of-memory error and hangs. But the workflow keeps running, eating up billing time!
Right before the memory error, the same message repeats for a long time:
10:31:48.219 INFO 4/5 source files have been analyzed
10:32:00.338 INFO 4/5 source files have been analyzed
[etc, etc…]
Why do we get an out-of-memory error, and why doesn't the workflow exit when this happens?
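For context, this is roughly what our job looks like. As a stopgap we are considering the fragment below (a sketch, not our confirmed fix): `timeout-minutes` to cap billed runner time if the scanner hangs again, and `SONAR_SCANNER_OPTS` to give the scanner JVM more heap. The action version, heap size, and step names here are assumptions, not taken from our actual workflow:

```yaml
jobs:
  sonar:
    runs-on: ubuntu-latest
    timeout-minutes: 30          # assumption: cap the job so a hung scanner can't run for hours
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0         # full history so SCM blame works
      - name: SonarCloud scan
        uses: SonarSource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
          SONAR_SCANNER_OPTS: -Xmx2048m   # assumption: 2 GB heap; tune to the project
```

Would raising the heap like this be the right direction, or is something else making the analysis blow up?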
In the logs below, the error occurs at 2025-01-30T09:07:28.3308665Z.
I cancelled the workflow manually at 2025-01-30T09:48:46.5385415Z.
2025-01-30T09:07:28.3308665Z 09:07:28.328 ERROR [stderr] Exception in thread "Report about progress of SCM blame" java.lang.OutOfMemoryError: Java heap space
2025-01-30T09:07:28.3318469Z 09:07:28.329 ERROR [stderr] at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:424)
2025-01-30T09:07:28.3320040Z 09:07:28.329 ERROR [stderr] at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:386)
2025-01-30T09:07:28.3326659Z 09:07:28.329 ERROR [stderr] at ch.qos.logback.classic.Logger.info(Logger.java:584)
2025-01-30T09:07:28.3328189Z 09:07:28.332 WARN Missing blame information for the following files:
2025-01-30T09:07:28.3329584Z 09:07:28.329 ERROR [stderr] at org.sonar.scanner.util.ProgressReport.log(ProgressReport.java:71)
2025-01-30T09:07:28.3330425Z 09:07:28.332 WARN * package-lock.json
2025-01-30T09:07:28.3331027Z 09:07:28.332 WARN This may lead to missing/broken features in SonarCloud
2025-01-30T09:07:28.3348303Z 09:07:28.329 ERROR [stderr] at org.sonar.scanner.util.ProgressReport.run(ProgressReport.java:33)
2025-01-30T09:07:28.3349540Z 09:07:28.329 ERROR [stderr] at java.base/java.lang.Thread.run(Unknown Source)
2025-01-30T09:08:20.6103716Z 09:08:20.610 ERROR [stderr] Exception in thread "pool-4-thread-1" java.lang.OutOfMemoryError: Java heap space
2025-01-30T09:08:20.9756755Z 09:08:20.975 ERROR [stderr] Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
2025-01-30T09:48:46.5385415Z ##[error]The operation was canceled.
2025-01-30T09:48:46.5455509Z Post job cleanup.
2025-01-30T09:48:46.5498383Z Post job cleanup.
2025-01-30T09:48:46.6238034Z [command]/usr/bin/git version
2025-01-30T09:48:46.6277659Z git version 2.48.1
2025-01-30T09:48:46.6324773Z Temporarily overriding HOME='/home/runner/work/_temp/ba6365d6-97e2-4102-8c7f-fe31f426a8c5' before making global git config changes
2025-01-30T09:48:46.6326581Z Adding repository directory to the temporary git global config as a safe directory
2025-01-30T09:48:46.6329881Z [command]/usr/bin/git config --global --add safe.directory /home/runner/work/app/app
2025-01-30T09:48:46.6364284Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2025-01-30T09:48:46.6396204Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2025-01-30T09:48:46.6655771Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader
2025-01-30T09:48:46.6679263Z http.https://github.com/.extraheader
2025-01-30T09:48:46.6693088Z [command]/usr/bin/git config --local --unset-all http.https://github.com/.extraheader
2025-01-30T09:48:46.6725585Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :"
2025-01-30T09:48:46.7080605Z Cleaning up orphan processes
2025-01-30T09:48:46.7261749Z Terminate orphan process: pid (1757) (run-sonar-scanner-cli.sh)
2025-01-30T09:48:46.7282202Z Terminate orphan process: pid (1758) (java)
2025-01-30T09:48:46.7313862Z Terminate orphan process: pid (1816) (java)