SonarQube fails the bootstrap max file descriptors check on Google Cloud Run

Hi. I’m trying to set up a SonarQube Developer Edition server. It runs on Google Cloud Run as a Docker image.

SonarQube fails to start due to the bootstrap max file descriptors check:

bootstrap check failure [1] of [1]: max file descriptors [25000] for elasticsearch process is too low, increase to at least [65535]; for more information see [https://www.elastic.co/guide/en/elasticsearch/reference/8.13/_file_descriptor_check.html]

I tried running ulimit -n 65535 and setting nofile to 65535 in /etc/security/limits.conf, but I get the error "cannot modify limit: Operation not permitted". As far as I can tell, Cloud Run doesn’t allow changing the limit on the number of open files.
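For anyone reproducing this, a minimal sketch of how to inspect the limits inside the container (assumes a POSIX shell; the numbers you see will depend on the platform):

```shell
#!/bin/sh
# Show the soft and hard limits on open file descriptors for this process.
soft=$(ulimit -Sn)
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"

# An unprivileged process may raise its soft limit up to the hard limit.
# What fails on Cloud Run is raising the hard limit itself (e.g. to 65535),
# which is where "Operation not permitted" comes from.
ulimit -Sn "$hard" && echo "soft limit raised to $hard"
```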
Currently I have to pass the -Dsonar.es.bootstrap.checks.disable=true flag to disable the bootstrap check.
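For context, this is roughly how the flag gets passed: the official sonarqube image forwards extra arguments to the server JVM, so the property can be appended to the container command. A sketch of a local equivalent (the container name and port mapping are illustrative):

```shell
# Start SonarQube with the Elasticsearch bootstrap checks disabled.
# Arguments after the image name are forwarded to the SonarQube entrypoint.
docker run -d --name sonarqube \
  -p 9000:9000 \
  sonarqube:developer \
  -Dsonar.es.bootstrap.checks.disable=true
```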
The code analysis takes about 4 minutes even for a small number of lines of code, and I suspect the low file descriptor limit may be the reason.

Is there any other way to solve the max file descriptors problem?

Hey there.

It looks like Google Cloud Run indeed has a max file descriptor count of 25000. What that tells me is that it’s not the right environment to run a SonarQube server in.

If I understand it correctly, Google Cloud Run is really meant to run your application code, not vendor-provided software like SonarQube.

Which part is taking a long time? The scanner-side (in your CI/CD), or the server-side processing?

Thank you for the reply.
That’s too bad, if Cloud Run isn’t suitable for SonarQube.

Sonar analysis is slow on the server side: it takes about 2-4 minutes even when the number of lines is small (0 to 50).


Is this the expected time?