- ALM used: GitHub
- CI system used: GitHub Actions
- Languages of the repository: TypeScript
- Error observed: Analysis fails due to LOC overage
We’re paying for 250k LOC. The last successful (automatic) scan of the repository registered 219k LOC. The next (automatic) scan after that failed because it registered 277k LOC. Reviewing the commits after the last successful scan didn’t reveal any large changes, and every place in the platform (the Code tab, etc.) seems to only show the results of the last successful scan, so we can’t figure out what caused a ~60k jump in LOC. We only have one branch added to SonarCloud. How can we figure out what the issue is?
Welcome to the community!
Normally I would tell you to look at your analysis logs to see whether anything changed there. They’re not exhaustive, but you can often get an idea. With automatic analysis, though, you don’t have access to those logs.
The things to check:
- your `.sonarcloud.properties` file: did the analysis scope change there (expanded `sonar.sources`, dropped exclusions, etc.)?
- your project settings in the UI: were exclusions dropped or edited? (Unfortunately, there’s no changelog, so you’ll have to rely a bit on memory.)
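For reference, a minimal `.sonarcloud.properties` sketch showing the kind of settings that control analysis scope (the paths here are hypothetical; adjust to your repository layout):

```properties
# Hypothetical example — widening sonar.sources or removing an
# exclusion pattern here would increase the LOC counted on the
# next automatic scan.
sonar.sources=src
sonar.exclusions=**/node_modules/**,**/dist/**
```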
We haven’t changed anything in the UI in months and months. We’re using the defaults and don’t have a .sonarcloud.properties file.
I don’t have ready access to those logs either, so I’m going to flag this for the folks that do.
One of my teams just had a sudden increase of 200k+ lines of code scanned. Upon investigation, it seems that the default “Sonar way” quality profile changed (not by us, but by SonarCloud), causing more files to be scanned than before. In our case, it caused the scanner to pick up a very large JSON file.
Does SonarCloud have any comment on this change (apparently applied on July 20, 2023)? I don’t recall seeing any notice about changes like this that could affect LOC.
In fact, another user recently reported new errors related to JS/TS analysis of JSON files, and that’s what your screenshot shows: a change in the TypeScript Quality Profile. I’ve redirected this away from the SonarCloud team and to the language team.
In the meantime, you can get past this by adding an exclusion for JSON files, either in your properties file or via the UI.
We excluded `**/*.json` just in case and are still getting 277,150 lines.
Can you show me where/how you excluded it?
In the Analysis Scope in the General Settings for the project (https://sonarcloud.io/project/settings?category=exclusions):
That excludes JSON files only from the unit test coverage metric calculation. Move that pattern to the Source File Exclusions:
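For anyone doing this in the properties file instead of the UI, the distinction looks like this (a sketch; the pattern assumes JSON files anywhere in the tree):

```properties
# Source File Exclusions — removes matching files from analysis
# entirely, so they no longer count toward LOC.
sonar.exclusions=**/*.json

# By contrast, this key only excludes files from coverage metric
# calculation; the files are still analyzed and still counted.
sonar.coverage.exclusions=**/*.json
```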
Made the change. Failed with 277150 lines.
If the change was saved in your configuration (a Save button appears when you start editing), then the best thing you can do is start whittling down the files in the analysis scope, adding directory by directory to the exclusion list until you hit pay dirt. Then you can examine the files in the directory whose exclusion finally allowed analysis to succeed.
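The whittling-down approach amounts to a manual bisection over exclusion patterns. A sketch of what one iteration might look like (directory names are hypothetical):

```properties
# Iteration 1: exclude a candidate directory and re-run analysis.
sonar.exclusions=**/generated/**

# Iteration 2: if LOC is still over the limit, add another directory
# and repeat, until the scan succeeds; the last directory added is
# where the unexpected lines live.
sonar.exclusions=**/generated/**,**/fixtures/**
```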
As a followup, I’m not entirely convinced that yours is a question of finding the right file to exclude. I’ve flagged this for the language experts.
Another user reported a similar issue related to JSON files being picked up during analysis by the JS/TS analyzer.
Does your `tsconfig.json` enable `resolveJsonModule`? Do you import JSON files from TypeScript source files?
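For context, `resolveJsonModule` is the TypeScript compiler option that lets source files import JSON directly, which is one way an analyzer following imports can end up traversing JSON files. A minimal fragment (the import path in the comment is hypothetical):

```jsonc
// tsconfig.json (fragment) — with resolveJsonModule enabled, a
// statement like `import data from "./big-dataset.json"` compiles,
// and tooling that walks the import graph may visit the JSON file.
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}
```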
I want to share some updates: we just released a new JS/TS analyzer version that will be deployed to SonarCloud in the coming days. This new version includes a fix to prevent JSON files from being picked up during analysis.
Please let me know if your problem persists despite the fix.