Sonar analysis stops at the first file it cannot analyze

Dear community,

This bug report is related to one I reported previously (https://community.sonarsource.com/t/llvm-runs-out-of-memory-during-sonar-analysis-on-boost-include/56326), but it is not about the bug shown in that stack trace … rather, it is about only 1 bug/stack trace being reported per run.

It looks like Sonar stops the first time it runs out of memory while processing a file.
This is a problem for me, as the project I am trying to get Sonar to analyze is quite large.
Ideally, I would just let Sonar run overnight, and the next morning, if there were such errors, I could exclude all of the reported files from the analysis at once (and check them later).
Right now when I try to run Sonar … it takes around 2 hours to find the next file it cannot handle. And because it reports only 1 file at a time … this means a 2-hour iteration for each file that needs to be excluded.

Would it be possible to change the current behavior so that when there is an out-of-memory error, Sonar survives it and, after reporting the problem, continues with the next file?

Thank you,
Kristof

Hi @Kristof_Szabados,

You should be able to set up sonar.cfamily.cache to avoid reanalyzing files that did not change.

Now, this doesn’t solve the entire problem: you still cannot find all failures in one run, but that is rarely needed. Needing a whole night for the analysis to finish is unexpected, though. Do you have a huge project? Are you using multithreading (sonar.cfamily.threads) and caching?
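For reference, a minimal sonar-project.properties sketch with both caching and multithreading enabled (the cache path and thread count below are example values, adjust them to your setup):

```properties
# Enable the analysis cache so unchanged files are not reanalyzed
sonar.cfamily.cache.enabled=true
sonar.cfamily.cache.path=.sonar-cache

# Run the C/C++ analysis on several threads (example value)
sonar.cfamily.threads=4
```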

Thanks,

Hi @Abbas ,

Technically, I have already gone through all of the files.
It just took much longer than I had expected.

I believe it would be a better way of working if Sonar could just skip over a file it fails to analyze and continue with the next one … right now the tool does not feel very robust.

Thank you,

Hello @Kristof_Szabados,

Failing and stopping the analysis on the first error is intentional. Let me explain:

When multiple files fail, in most cases the failures are related. Failure of the analyzer is rare, and multiple unrelated failures are extremely rare. Suppose a file is failing because there is a C++ feature in a header file that we don’t support. This means that the analysis of every file that includes that header will fail. So if your project contains 100 such files, we will generate 100 reproducers, which is overwhelming, noisy, and useless. The first failure is the one we care about and the only one we want to look at.

If the analysis is still taking longer than you expect with caching and multithreading enabled, feel free to create a post and we will be happy to assist.

Thanks,


Dear @Abbas ,

I understand your point, as generating many reproducers might take a long time and use up a lot of disk space.

Please allow me to illustrate my point of view.
I’m setting up Sonar on an existing project where such failures are expected (after all, that is why I’m setting up Sonar: to find and fix such issues).
If this project contained 100 such files and each Sonar run found/reported only 1 (and I have to run them nightly, as they take several hours) … it would take me about 100 days to find and exclude all such files.

If, on the other hand, the analysis could be asked to run longer but find all such files at once,
I could let it run over a weekend and exclude all the reported files on the following Monday (with maybe one more run needed to check that everything is now fine).
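As an illustration, once all the failing files were known, they could be excluded in a single step via sonar.exclusions in sonar-project.properties (the file paths below are purely hypothetical):

```properties
# Hypothetical example: exclude every file Sonar reported as failing, in one go
sonar.exclusions=src/legacy/foo.cpp,src/legacy/bar.cpp,third_party/**
```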

The difference is quite large: in the example you mentioned, 100 days vs. 1 weekend + 1 workday.
(Not to mention the real-life problem of reporting to management at every daily standup, for more than 3 months, that we still cannot start using Sonar seriously because there was one more file it did not like for some reason.)

What do you think, would it be possible to fix this in Sonar?
Maybe via an extra configuration option, something like: “find and report all failing files without generating reproducers”?
(Please note that going deep and analyzing why each file fails is a separate topic on its own. For example, that could be done by several team members in parallel, each checking a different file … but the initial discovery is best done by a single person, to avoid several people running Sonar on the same code only for it to report the same file failing.)

Thank you in advance,
Kristof

