Hey Carsten.
Yes, all these rules are advanced vulnerability detection rules that use a fairly significant amount of memory compared to “normal” rules…
You can continue to increase the amount of memory given to the scanner, but we are also regularly working to make our security analysis more memory-efficient. Some of those improvements are coming in the upcoming LTA release of SonarQube Server.
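For reference, a minimal sketch of how that usually looks, assuming you're running the SonarScanner CLI (the heap size here is just an example value to tune for your project); with the Maven or Gradle scanners, the JVM options go through MAVEN_OPTS or org.gradle.jvmargs instead:

```
# Assuming the SonarScanner CLI; adjust -Xmx to whatever your project needs.
export SONAR_SCANNER_OPTS="-Xmx6g"
sonar-scanner
```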
You can learn more about what this mechanism does here.
It’s not something that adjusts based on how much memory is given to the analysis; it basically makes sure that the analysis will not, as @Malte says, explode exponentially. Of course, an unbounded analysis doesn’t get far when memory is limited (it will simply crash), but I’m not even sure it’s the high simulation cost that is exhausting the available memory.
I think I’ve covered this topic well, but I’ll ping our experts to see if there’s anything worth investigating further. They will probably be more interested in what memory utilization looks like once you’re running the version shipped with our next LTA.