Sonar-scanner killed when analyzing large .sql file with 100k+ LOC

Hi @janos, the analysis fails on a .sql file (trulioo.sql, the one from the error logs above) that has 100,000+ LOC. When I exclude that particular file from the analysis, the scanner passes without any memory crashes.

Do you see any way to overcome this?

The languages used in this project are JavaScript, CSS, PHP, and TypeScript.
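
To be concrete, "excluding the particular file" means enabling the sonar.exclusions flag that is commented out in the scan step of the pipeline below; a minimal sketch of the pipe with the exclusion active:

  - pipe: sonarsource/sonarcloud-scan:1.2.0
    variables:
      SONAR_TOKEN: ${SONAR_TOKEN}
      EXTRA_ARGS: -Dsonar.host.url=https://sonarcloud.io -Dsonar.exclusions=**/trulioo.sql

With that flag present the scan completes; without it the scanner process is killed while analyzing trulioo.sql.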

This is my pipeline:

clone:
  depth: full

definitions:
  services:
    docker:
      memory: 3072
  steps:
    - step: &build-test-sonarcloud
        size: 2x
        caches:
          - node
        script:
          - pipe: sonarsource/sonarcloud-scan:1.2.0
            variables:
              DEBUG: 'true'
              SONAR_SCANNER_OPTS: -Xmx7168m
              SONAR_TOKEN: ${SONAR_TOKEN}
              EXTRA_ARGS: -Dsonar.host.url=https://sonarcloud.io # -Dsonar.exclusions=**/trulioo.sql
    - step: &check-quality-gate-sonarcloud
        name: Check the Quality Gate on SonarCloud
        script:
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.4
            variables:
              DEBUG: 'true'
              SONAR_SCANNER_OPTS: -Xmx1024m
              SONAR_TOKEN: ${SONAR_TOKEN}
              EXTRA_ARGS: -Dsonar.host.url=https://sonarcloud.io

pipelines:
  default:
    - step: *build-test-sonarcloud
    - step: *check-quality-gate-sonarcloud
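
If excluding is the only realistic option, a broader variant of the same flag would skip SQL files altogether, which should be safe given that SQL is not among the languages listed above (the **/*.sql glob is an assumption on my side that no analyzed code lives in .sql files; the >- folded scalar just joins the flags into one space-separated line):

      EXTRA_ARGS: >-
        -Dsonar.host.url=https://sonarcloud.io
        -Dsonar.exclusions=**/*.sql

A blanket glob would also spare me from tracking individual oversized files if more large SQL files show up later.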