Reported LOC doubling, analysis refuses to complete

We use SonarQube Cloud with GitHub Actions.

We pay for a licence for 500k LOC.

We have four projects, and the total reported LOC for these projects is 322k.

As of this morning, SonarQube is refusing to complete the analysis of the main branch of one of our projects (the largest one).

The error message is:

`This analysis will make your organization to reach the maximum allowed lines limit (having 680954 lines)…`

We haven’t suddenly added 300k+ LOC to any of our projects, so I’m not sure where this limit is coming from. We also haven’t changed the configuration, to the best of my knowledge.

There’s not much in the way of out-of-the-box logging available in SonarQube Cloud, from what I can see.

I have diffed the scanner context for our last successful run and our first failing run. It looks like this:

```
diff -w first_failure.txt last_success.txt
2a3
>   - JaCoCo 1.3.0.1538 (jacoco)
46c47
<   - sonar.analysisUuid=3fe26206-c5bc-4964-ae6f-eb79b0bde324
---
>   - sonar.analysisUuid=e01878df-cada-4862-ace8-eda8b76c8957
58c59
<   - sonar.projectBaseDir=/tmp/clone17254936284793357448
---
>   - sonar.projectBaseDir=/tmp/clone16045292000730088399
64,65c65,66
<   - sonar.scanner.bootstrapStartTime=1765292318839
<   - sonar.scanner.engineJarPath=/opt/sonar-scanner/.sonar/cache/e28161b2802df2156618d64882119fb07bf4c23768b7cbdddb53e27a89f9a26f/sonarcloud-scanner-engine-12.11.0.2292-all.jar
---
>   - sonar.scanner.bootstrapStartTime=1764951552016
>   - sonar.scanner.engineJarPath=/opt/sonar-scanner/.sonar/cache/807fa690a83fbbf8003c1af254109d23e6135538ad6d84577efcd71a31073c98/sonarcloud-scanner-engine-12.10.0.1268-all.jar
80c81
<   - sonar.working.directory=/tmp/scanner/15395449486571522141/.scannerwork
---
>   - sonar.working.directory=/tmp/scanner/2943735112328820112/.scannerwork
```

The JaCoCo plugin is missing from the first failure, and it looks like there was a scanner engine upgrade (12.10.0 → 12.11.0) between the last success and the first failure.

Has there been a change in behaviour between 12.10 and 12.11 that we might be falling foul of?

Is there a way in SonarQube Cloud to log which lines it thinks it’s adding to the project, rather than just refusing?

Thanks,

Douglas

Hi Douglas,

Unfortunately, there’s no way to get LOC logging out of the analysis. What you can do is enable verbose analysis logging (`-Dsonar.verbose=true`) to get a list of the files being indexed by the analysis, and check that to see whether it’s suddenly including a library or two you didn’t expect.
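In a CI-based setup (like the GitHub Actions scanner mentioned at the top of the thread), that flag goes straight onto the scanner command line. A minimal sketch — only the verbose flag comes from the advice above; everything else about the invocation depends on your own setup:

```shell
# Sketch: add the verbose flag to the existing scanner invocation.
# All other analysis properties come from your sonar-project.properties / CI config.
sonar-scanner -Dsonar.verbose=true
```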

 
HTH,
Ann

@dougiewright I seem to be running into a similar issue with our .NET codebase and Azure DevOps Pipelines. Is your underlying codebase .NET, or something else? I’m trying to understand whether this is a .NET thing or something more fundamental. Thanks.

Python and JS here for the affected projects, no .NET

We are currently using Automatic Analysis within SonarQube Cloud; we’re not running the scan via our CI. SonarQube is tagging our PRs with its comments etc., but it’s running the analysis itself, not in a GHA runner.

I can’t figure out a way to set this parameter for this configuration.

Hi @dougiewright,

With automatic analysis, this gets a bit more complicated. The best approach at this point is a binary search with exclusions. That is, use the UI to set up exclusions for roughly half the project and see if the analysis completes. If it does, does it show what you expect to see? And so on.
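The exclusions themselves are ordinary glob patterns, and the same patterns work whether they are entered in the UI field or as a `sonar.exclusions` property. A sketch of one halving round, with entirely hypothetical directory names:

```
# Round 1 (hypothetical paths): exclude roughly half the source tree,
# then re-run and narrow down by halves from there.
sonar.exclusions=src/backend/**,src/tools/**
```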

 
Ann

That seems like a lot of work to fix something that wasn’t broken. The diff between the last passing commit and the first failing commit is on the order of a couple of hundred lines.

We haven’t changed configuration.

Is there anything in the newer build of sonar that could explain it thinking that 300k lines have appeared out of nowhere?

Hi all,

It turns out we’ve been slowly rolling out a new analyzer that scans JSON and YAML files for secrets. It’s quite possible it was turned on for your orgs recently. Would you mind posting your Org IDs so we can dig into the data on our side?

@groogiam you would see the footprint of this scanner in your analysis logs. (@dougiewright, you obviously don’t have access to your analysis logs.)

You can disable this in the project’s Administration → General Settings → Languages → JSON → “Activate JSON file analysis”, and the same for YAML.

Sorry for the confusion. It’s on my list to get a reliable way to look up recent changes to SonarQube Cloud. :frowning:

 
Ann

I turned off these settings and reran the build on my commit from Dec 4. This seems to have mostly fixed the line-count issue. There is still a discrepancy of ~1,300 lines of code between the two runs, despite the same number of files being analyzed in the old and new runs. That’s less than 1%, so we can live with that. Thank you for your help getting this figured out.

If I still want JSON file analysis, can I just turn the JSON scanning back on for the project and use `sonar.exclusions` to exclude certain directories whose JSON I don’t want counted?

Thanks.

Hi,

Yes, this exactly.

 
Ann


This seems to have sorted the problem - thanks!

Douglas
