We’re paying for 250k LOC. The last successful (automatic) scan of the repository registered 219k LOC. The next (automatic) scan after that failed because it registered 277k LOC. Reviewing the commits after the last successful scan didn’t reveal any large changes. Everywhere in the platform seems to show only the results of the last successful scan (the Code tab, etc.), so we can’t figure out what caused a ~60k jump in LOC. We only have one branch added to SonarCloud. How can we figure out what the issue is?
Normally I would tell you to look at your analysis logs to see whether anything changed there. They’re not exhaustive, but you can often get an idea. But with automatic analysis, you don’t have access to those logs.
The things to check:
your .sonarcloud.properties file: did the analysis scope change there (expanded sonar.sources, dropped exclusions, etc.)? See the sketch after this list.
your project settings in the UI: were exclusions dropped or edited? (Unfortunately, there’s no changelog, so you’ll have to rely a bit on memory.)
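For reference, here’s a minimal sketch of what those scope settings can look like in a `.sonarcloud.properties` file (the paths and patterns below are hypothetical, just to illustrate what to look for):

```properties
# Hypothetical example: the paths below are placeholders for your layout.
# Widening sonar.sources, or removing exclusion lines like these,
# would enlarge the analysis scope and inflate the LOC count.
sonar.sources=src
sonar.exclusions=**/vendor/**,**/generated/**
```

If any lines like these were broadened or removed since the last successful scan, that’s your likely culprit.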
One of my teams just had a sudden increase of 200k+ lines of code being scanned. Upon investigation, it seems that the default “Sonar way” quality profile changed (not by us, but by SonarCloud), causing more files to be scanned than previously. In our case, it caused the scanner to pick up a JSON file that was quite large.
Does SonarCloud have any comment on this change (apparently applied on July 20, 2023)? I don’t recall seeing any notice about changes like this that could affect LoC.
In fact, another user recently reported new errors related to JS/TS analysis of JSON files. And that’s what your screenshot shows: a change in the TypeScript Quality Profile. I’ve redirected this from the SonarCloud team to the language team.
In the meantime, you can get past this by adding an exclusion for `**/*.json`, either in your properties file or via the UI.
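In a `.sonarcloud.properties` file, that exclusion would look something like this (if you already have a sonar.exclusions line, append the pattern to the existing comma-separated list instead):

```properties
# Keep JSON files out of the analysis until the analyzer fix is deployed.
sonar.exclusions=**/*.json
```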
If the change was saved in your configuration (a Save button appears when you start editing), then the best thing you can do is whittle down the files in the analysis scope, adding directories one by one to the exclusion list until you hit pay dirt. Then you can examine the files in the directory whose exclusion finally allowed analysis to succeed. A sketch of this process follows below.
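As a rough sketch of that process (the directory names here are hypothetical), each run edits the single sonar.exclusions line and re-triggers analysis:

```properties
# Run 1: exclude the first candidate directory.
sonar.exclusions=**/assets/**
# Run 2: still failing, so add the next directory.
sonar.exclusions=**/assets/**,**/docs/**
# Run 3: analysis succeeds, so the offending files live under lib/.
sonar.exclusions=**/assets/**,**/docs/**,**/lib/**
```

Note that these are successive states of the same line, one per analysis run, not three lines in one file.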
I want to share an update: we just released a new JS/TS analyzer version that will be deployed to SonarCloud in the coming days. This new version includes a fix to prevent JSON files from being picked up during analysis.
Please let me know if your problem persists despite the fix.