Sudden StackOverflowError in Go scanner

We use SonarQube Cloud with GitHub Actions to analyze our code base. Recently, the analysis of our Go code has suddenly started failing with a cyclical StackOverflowError:

20:05:31.164 INFO  Scanner configuration file: /opt/sonar-scanner/conf/sonar-scanner.properties
20:05:31.169 INFO  Project root configuration file: /usr/src/sonar-project.properties
20:05:31.182 INFO  SonarScanner CLI 7.0.2.4839
20:05:31.184 INFO  Java 17.0.14 Amazon.com Inc. (64-bit)
20:05:31.184 INFO  Linux 5.15.167.4-microsoft-standard-WSL2 amd64
20:05:31.194 DEBUG Scanner max available memory: 3 GB
20:05:31.223 DEBUG uname -m returned 'x86_64'
20:05:31.226 DEBUG Using JVM default truststore: /usr/lib/jvm/java-17-amazon-corretto.x86_64/lib/security/cacerts
20:05:31.227 DEBUG Create: /opt/sonar-scanner/.sonar/cache
...
20:06:07.712 DEBUG Sensors : JaCoCo XML Report Importer -> Java Config Sensor -> Code Quality and Security for Go -> Go Cover sensor for Go coverage -> IaC Docker Sensor -> EnterpriseTextAndSecretsSensor
20:06:07.712 INFO  Sensor JaCoCo XML Report Importer [jacoco]
20:06:07.713 INFO  'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
20:06:07.714 INFO  No report imported, no coverage information will be imported by JaCoCo XML Report Importer
20:06:07.714 INFO  Sensor JaCoCo XML Report Importer [jacoco] (done) | time=2ms
20:06:07.714 INFO  Sensor Java Config Sensor [iac]
20:06:07.733 INFO  0 source files to be analyzed
20:06:07.737 INFO  0/0 source files have been analyzed
20:06:07.737 INFO  Sensor Java Config Sensor [iac] (done) | time=23ms
20:06:07.737 INFO  Sensor Code Quality and Security for Go [goenterprise]
20:06:07.740 INFO  1137 source files to be analyzed
..
20:06:10.167 DEBUG 'xxx.go' generated metadata with charset 'UTF-8'
20:06:10.189 ERROR [stderr] Exception in thread "main" java.lang.StackOverflowError
20:06:10.190 ERROR [stderr]     at org.sonar.go.utils.SymbolHelper.resolveValue(SymbolHelper.java:98)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.Y(QueryUsageCheck.java:57)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.X(QueryUsageCheck.java:76)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.W(QueryUsageCheck.java:63)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.Y(QueryUsageCheck.java:58)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.X(QueryUsageCheck.java:76)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.W(QueryUsageCheck.java:63)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.Y(QueryUsageCheck.java:58)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.X(QueryUsageCheck.java:76)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.W(QueryUsageCheck.java:63)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.Y(QueryUsageCheck.java:58)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.X(QueryUsageCheck.java:76)
20:06:10.190 ERROR [stderr]     at com.sonar.go.A.X.W(QueryUsageCheck.java:63)

I can also reproduce this locally by running the scanner via Docker. The stack trace shows a clear infinite recursion loop, so it isn't a matter of memory or stack size.
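
For reference, this is roughly the command I use for the local reproduction; the token is redacted and the repository is mounted at /usr/src, which matches the project root the scanner reports in the log above:

    docker run --rm \
      -e SONAR_HOST_URL="https://sonarcloud.io" \
      -e SONAR_TOKEN="<redacted>" \
      -v "$PWD:/usr/src" \
      sonarsource/sonar-scanner-cli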

This effectively blocks our use of Sonar for this project. We've tried a few different versions of the action and the failure persists, which suggests a code change on our side triggered it, but the output gives us nothing to go on as to how to resolve it. We could try to brute-force detect the change, but the code itself is fine and already running in production, so it isn't incorrect.
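
If we do end up brute-forcing it, I suppose we could drive the local Docker reproduction from git bisect, something along these lines (the commit range is illustrative):

    git bisect start <first-bad-commit> <last-known-good-commit>
    git bisect run sh -c '! docker run --rm -e SONAR_HOST_URL="https://sonarcloud.io" -e SONAR_TOKEN="<redacted>" -v "$PWD:/usr/src" sonarsource/sonar-scanner-cli 2>&1 | grep -q StackOverflowError'

git bisect run treats exit code 0 as a good commit, so the negated grep marks a commit bad only when the StackOverflowError shows up.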

Thanks,

David.

Hey @davidwbt

I've pinged our team to come and investigate. In the meantime, you might try excluding the file if it's blocking your pipelines.
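
For example, something like this in your sonar-project.properties (the path is only a placeholder; point it at whichever file the scanner logs right before the crash):

    # placeholder path, replace with the file logged just before the StackOverflowError
    sonar.exclusions=**/xxx.go

That keeps the rest of the analysis running while the analyzer fix is pending.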

Hello @davidwbt, thanks for your report!

This is indeed a problem on the analyzer side. We also noticed it in our own testing, but unfortunately only after the latest version had already been released. I'm happy to share that the fix is already implemented and should appear on SonarQube Cloud this week. Until then, as Colin suggested, you can exclude the problematic file from the analysis.

Best,

Peter

So when will this fix be released? Our pipelines have had coverage turned off for a week already. :confused:

Hello @davidwbt, @quynhduongphl.

The fix has been released and is now deployed on SonarQube Cloud.

Apologies for the delay, this took longer than it should have.
I will discuss internally to see where we got slowed down in the process, so we don’t run into the same problems again.

Please let me know whether your pipelines are running successfully again.

Best,
Jonas

Best practice company right there :smiley: