Hi, I’m trialing SonarCloud on a large repository of roughly 1.2 million lines of code. That figure is already after applying exclusions and filtering out all the unwanted files.
When I run Automatic Analysis (the GitHub integration), it always times out due to memory. So I decided to run the initial analysis on my own machine, push the results to SonarCloud, and then turn Automatic Analysis back on (for PRs etc.). The problem is that the next time there is a code change in the repository, it runs the full analysis again and times out.
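For reference, this is roughly how I ran the local analysis — a sketch using the standalone SonarScanner CLI; the organization key, project key, and heap size are placeholders for my actual values:

```shell
# Token generated in my SonarCloud account (placeholder value here).
export SONAR_TOKEN="<my-sonarcloud-token>"
# Raise the scanner JVM heap so the ~1.2M-LOC scan doesn't run out of memory.
export SONAR_SCANNER_OPTS="-Xmx8g"

# Placeholder organization/project keys; sources rooted at the repo top level.
sonar-scanner \
  -Dsonar.host.url=https://sonarcloud.io \
  -Dsonar.organization=my-org \
  -Dsonar.projectKey=my-org_big-repo \
  -Dsonar.sources=.
```

This run completes fine locally with the larger heap, which is why I was hoping Automatic Analysis could somehow pick up from it.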
Is there a way to tell Sonar to reuse the “cache” from the analysis I ran locally?
Or does our current setup mean we cannot use the automatic analysis?