Here is the analysis from SonarCloud failing with a StackOverflowError while analyzing a JFlex-generated Java lexer:
There are previous successful builds, and the lexer has not changed since, so it is unclear why this happened.
I guess increasing the stack size for the SonarQube process could work around this, but I am not sure whether that is even possible, and it feels like SQ should detect and handle such cases gracefully by itself, not just throw an SOE.
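For what it's worth, one way to try the stack-size idea on the scanner side is via `SONAR_SCANNER_OPTS`, the environment variable the sonar-scanner CLI reads for extra JVM options. A sketch (the `16m` value is an arbitrary guess, not a recommendation):

```shell
# Hypothetical workaround: raise the per-thread JVM stack size (-Xss)
# for the scanner process before running the analysis.
export SONAR_SCANNER_OPTS="-Xss16m"
```

Whether this actually helps depends on which process the rule engine runs in, so it may well have no effect.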
Nice catch! This is indeed unexpected. The issue comes from a bug in the implementation of rule java:S1155, which fails in the presence of very big concatenations (or chains of binary operators).
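For context on why a generated lexer triggers this: JFlex packs its DFA transition tables into `String` constants built from long chains of literal concatenations, and each `+` adds one level to the binary-expression tree a rule's visitor may recurse over. A toy illustration (real generated files chain thousands of operands, and the field name is only modeled on JFlex's `ZZ_TRANS_PACKED_0` style):

```java
public class PackedTable {
    // Miniature version of a JFlex packed table: the generated file chains
    // thousands of these octal-escape literals with '+', producing a very
    // deep nested binary expression in the analyzer's syntax tree.
    static final String ZZ_TRANS_PACKED =
        "\1\2\1\3\1\4\1\5" +
        "\1\6\1\7\1\10\1\11" +
        "\1\12\1\13\1\14\1\15";

    public static void main(String[] args) {
        // Each two-character escape is a single char; 3 segments x 8 chars.
        System.out.println(ZZ_TRANS_PACKED.length());
    }
}
```

A recursive visitor descending one stack frame per `+` operand would need thousands of frames on a real lexer, which is consistent with the StackOverflowError seen here.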
As a temporary workaround, you can disable the rule or exclude your (very big) file PerlLexer.java from analysis.
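The file-plus-rule exclusion can be expressed with the standard issue-exclusion properties in `sonar-project.properties` (the `e1` key and the `**/PerlLexer.java` pattern are just example values):

```properties
# Ignore rule java:S1155 only for the oversized generated lexer.
sonar.issue.ignore.multicriteria=e1
sonar.issue.ignore.multicriteria.e1.ruleKey=java:S1155
sonar.issue.ignore.multicriteria.e1.resourceKey=**/PerlLexer.java
```

Alternatively, `sonar.exclusions=**/PerlLexer.java` would skip the file entirely, at the cost of losing all other findings in it.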