SonarCloud all of a sudden scanning every file

We have been using SonarCloud for around 10 months now with no issues. We use the SonarCloud GitHub Action in our CI/CD. On each commit pushed to an open PR, we run this step:

      - name: SonarCloud Scan
        if: ${{ github.actor != 'dependabot[bot]' }}
        uses: sonarsource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

I compared the logs for the PRs I am referring to against Actions from just last week to make sure there wasn't a version change. Both runs (the successful Action from last week and the Action from today) use the same version: SonarScanner 4.8.0.2856.

However, the difference is that all of a sudden, after 10 months, it is trying to scan every single file, and after some time it simply runs out of memory. I'll post logs (only a few lines, so I can redact file names I don't want public, but you can trust from the file counter that it is scanning every file):

INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=2ms
INFO: Sensor JavaScript/TypeScript analysis [javascript]
INFO: 279 source files to be analyzed
INFO: 2/279 files analyzed, current file: /github/workspace/lambdas/redacted
INFO: 3/279 files analyzed, current file: /github/workspace/lambdas/redacted
INFO: 4/279 files analyzed, current file: /github/workspace/lambdas/redacted

...

INFO: 102/279 files analyzed, current file: /github/workspace/lambdas/redacted
INFO: 102/279 files analyzed, current file: /github/workspace/lambdas/redacted
ERROR: 
ERROR: <--- Last few GCs --->
ERROR: 
ERROR: [63:0x7fc7972252d0]   878745 ms: Mark-sweep (reduce) 2043.9 (2083.5) -> 2043.5 (2084.0) MB, 214.4 / 0.0 ms  (+ 1213.2 ms in 198 steps since start of marking, biggest step 24.7 ms, walltime since start of marking 1776 ms) (average mu = 0.312, current mu = [63:redacted]   881153 ms: Mark-sweep (reduce) 2044.6 (2084.0) -> 2043.4 (2084.3) MB, 2404.9 / 0.0 ms  (average mu = 0.158, current mu = 0.001) allocation failure; scavenge might not succeed
ERROR: 
ERROR: 
ERROR: <--- JS stacktrace --->
ERROR: 
ERROR: FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory

The 102/279 is what I am referring to when I say it is trying to scan every file. Here's a sample of a successful log where it doesn't scan every file like above:

Sensor JaCoCo XML Report Importer [jacoco]
INFO: 'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
INFO: No report imported, no coverage information will be imported by JaCoCo XML Report Importer
INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=3ms
INFO: Sensor JavaScript analysis [javascript]
INFO: Creating TypeScript program
INFO: TypeScript configuration file /github/workspace/.scannerwork/.sonartmp/redacted.tmp
INFO: 273 source files to be analyzed
INFO: Creating TypeScript program (done) | time=6881ms
INFO: Starting analysis with current program
INFO: 5/273 files analyzed, current file: /github/workspace/lambdas/redacted
INFO: Analyzed 273 file(s) with current program
INFO: 273/273 source files have been analyzed

I haven't changed anything since I created this step in our CI/CD 10 months ago. I'm just curious what could have happened and what I can do to resolve it. I'm not sure what the issue is, or even where to start looking for a fix. Googling didn't turn up anything from the last week or month, so I don't think it's something widespread. I can't imagine what could have changed, though, since we've been using this for so long with no issues.

I tried letting it just run; I thought maybe a cache got wiped or previous data was cleared somehow, and that running it once would make future Actions fast again. But when I tried that, the second Action took just as long and showed the same behavior.

Hi @jordmax12,

Thanks for your feedback. Indeed, the latest changes seem to have affected projects with several tsconfigs. Can you please confirm that's your case?

One workaround until we provide a fix is to set the sonar property sonar.typescript.tsconfigPath so it points to the base tsconfig in the project root. Let me know if that helps in your case.
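For example, if you configure the scan through a sonar-project.properties file at the repository root, it would be a single line like this (just a sketch; adjust the path if your base tsconfig lives elsewhere):

sonar.typescript.tsconfigPath=tsconfig.json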

Cheers,
Victor

Hi @jordmax12,

What kind of project are you working on? React? Angular? Are you using Nx?

Thanks,
Victor

Hi Victor,

Just a monorepo of AWS Lambda functions running on Node 16. We're not using any bundling or a TS config. Any recommendations?

Did you try the workaround I proposed?

Cheers

I haven't, because we do not have a tsconfig at all; we don't use TypeScript, only JavaScript.

Hello @jordmax12,

Can you please enable debug logs (the -X parameter) and share them? You can also send them to me via PM.

Thanks,
Victor

Hi Victor,

We are using the GitHub Action found here, so I am unable to pass that param.

I did find some documentation here that says to set RUNNER_DEBUG to 1 in the env, which I did, but the logs look the same.
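Concretely, that just meant adding the variable to the step's env block, roughly like this (a sketch of the change, everything else unchanged):

      - name: SonarCloud Scan
        if: ${{ github.actor != 'dependabot[bot]' }}
        uses: sonarsource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
          RUNNER_DEBUG: 1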

Any advice on how to proceed to get the debug working? More than happy to send you logs. Sorry for all this.

Hi @jordmax12,

You can see how to set the sonar properties on the main page of the GitHub Action:

jobs:
  sonarcloud:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v3
      with:
        # Disabling shallow clone is recommended for improving relevancy of reporting
        fetch-depth: 0
    - name: SonarCloud Scan
      uses: sonarsource/sonarcloud-github-action@master
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
      with:
        projectBaseDir: my-custom-directory
        args: >
          -Dsonar.organization=my-organization
          -Dsonar.projectKey=my-projectkey
          -Dsonar.python.coverage.reportPaths=coverage.xml
          -Dsonar.sources=lib/
          -Dsonar.test.exclusions=tests/**
          -Dsonar.tests=tests/
          -Dsonar.verbose=true

From that list of sonar properties, you will only need to use -Dsonar.verbose=true.
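Applied to your existing SonarCloud Scan step, that means adding only a with/args section (a sketch; keep your current env block as it is):

        with:
          args: >
            -Dsonar.verbose=true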

Please, keep me posted.

Victor

Hi @jordmax12,

As a workaround, can you please create a tsconfig.sonar.json in the root of your project with the following contents?

{
  "compilerOptions": {
    "allowJs": true,
    "noImplicitAny": true
  }
}

Then, set the property sonar.typescript.tsconfigPath=tsconfig.sonar.json in the analysis.
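If you pass properties through the GitHub Action, the same args mechanism shown earlier can carry this one too (a sketch; combine it with any other args you already pass):

        with:
          args: >
            -Dsonar.typescript.tsconfigPath=tsconfig.sonar.json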

Please let me know if that helps.

Cheers,
Victor

This seems to have worked! I also did what you said to run in debug, but I'm not sure that's necessary anymore, as I don't think it's having the same issue as before. Thank you!

Hi @jordmax12,

Happy to know the workaround works for you.

Indeed, no need for the debug logs. We already know about the issue with the new release. Hopefully it will be fixed next week.

Cheers,
Victor