I am using the SonarQube Docker image sonarqube:latest and it is generally working well, but I have a project-specific issue I hope someone can help with. I am scanning a .NET project that contains a lot of JavaScript files, and I would like to exclude those files because they are maxing out the Elasticsearch heap, so the project fails to display its report on SonarQube due to
Fail to execute ES refresh request on indices 'rules'
Note that the GitLab CI stage itself passes successfully:
WARN: This may lead to missing/broken features in SonarQube
INFO: CPD Executor 190 files had no CPD blocks
INFO: CPD Executor Calculating CPD for 711 files
INFO: CPD Executor CPD calculation finished (done) | time=331ms
INFO: Analysis report generated in 1855ms, dir size=682.6 MB
INFO: Analysis report compressed in 22877ms, zip size=98.6 MB
INFO: Analysis report uploaded in 119850ms
INFO: ANALYSIS SUCCESSFUL, you can browse http://192.:9000/dashboard?id=sj4rWLW107
INFO: Note that you will be able to access the updated dashboard once the server has processed the submitted analysis report
INFO: More about the report processing at http://192.:9000/api/ce/task?id=Ao9rTu
INFO: Analysis total time: 6:23.726 s
INFO: ------------------------------------------------------------------------
INFO: EXECUTION SUCCESS
INFO: ------------------------------------------------------------------------
INFO: Total time: 7:03.914s
INFO: Final Memory: 16M/100M
INFO: ------------------------------------------------------------------------
The SonarScanner CLI has finished
09:50:47.071 Post-processing succeeded.
Saving cache for successful job 00:01
Creating cache build...
WARNING: .sonar/cache: no matching files
Archive is up to date!
Created cache
Cleaning up file based variables 00:00
Job succeeded
Please advise what I can add to the GitLab CI SonarQube stage to exclude files from the scanning process. I tried adding sonar.exclusions=**/*.Scripts in the script section, but it doesn't work.
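For reference, a minimal sketch of what the exclusion could look like in the GitLab CI job. The job name, image, and project key below are placeholders, and the glob patterns assume the JavaScript files live under a Scripts folder:

```yaml
# Hypothetical GitLab CI stage; job name, image and project key are placeholders.
sonarqube-check:
  image:
    name: sonarsource/sonar-scanner-cli:latest
    entrypoint: [""]
  script:
    # sonar.exclusions takes comma-separated glob patterns relative to the
    # project base directory; either pattern below would skip the .js files.
    - sonar-scanner
      -Dsonar.projectKey=my-project
      -Dsonar.exclusions="**/Scripts/**,**/*.js"
```

Note that `**/*.Scripts` only matches files whose extension is literally `.Scripts`; to skip a `Scripts` directory the pattern needs to be `**/Scripts/**`, and to skip JavaScript files everywhere it needs to be `**/*.js`.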
These things should be entirely unrelated. The first thing to do is get your SonarQube instance up and stable. And then troubleshoot analysis if needed.
Note that my SonarQube instance is up and stable for most of my projects. This issue happens only for a specific project that has numerous .js files, which are causing Elasticsearch to hit its flood stage.
After the SonarScanner CLI finishes successfully on GitLab CI, my project (which has a lot of .js files) produces these errors:
Error Details from Background Tasks on SonarQube UI:
Error Details
java.lang.OutOfMemoryError: Java heap space
SonarQube Logs:
> Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "I/O dispatcher 2"
Exception in thread "I/O dispatcher 5" 2022.06.07 07:20:46 INFO ce[AYE9BC_UKT63][o.s.c.t.s.ComputationStepExecutor] Persist sources | status=FAILED | time=74589ms
java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.Collections$UnmodifiableCollection.iterator(Collections.java:1043)
at org.apache.http.impl.nio.reactor.BaseIOReactor.validate(BaseIOReactor.java:210)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:280)
at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
at java.base/java.lang.Thread.run(Thread.java:829)
2022.06.07 07:20:46 ERROR ce[][o.a.h.i.n.c.InternalHttpAsyncClient] I/O reactor terminated abnormally
org.apache.http.nio.reactor.IOReactorException: I/O dispatch worker terminated abnormally
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:359)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:221)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.OutOfMemoryError: Java heap space
2022.06.07 07:20:46 INFO ce[AYE9BC_UKTu3vgzujD63][o.s.c.t.p.a.p.PostProjectAnalysisTasksExecutor] Webhooks | globalWebhooks=0 | projectWebhooks=0 | status=SUCCESS | time=51ms
2022.06.07 07:20:47 ERROR ce[AYE9BC_UKTjD63][o.s.c.t.CeWorkerImpl] Failed to execute task AYE9BC_UKTjD63
java.lang.OutOfMemoryError: Java heap space
2022.06.07 07:20:54 INFO ce[AYE9BC_UKujD63][o.s.c.t.CeWorkerImpl] Executed task | project=sjD6q | type=REPORT | id=AYE9BC_ujD63 | submitter=admin | status=FAILED | time=356805ms
2022.06.07 07:20:59 WARN es[][o.e.c.r.a.DiskThresholdMonitor] flood stage disk watermark [95%] exceeded on [8iJCQvUx25A][sonarqube][/opt/sonarqube/data/es7/nodes/0] free: 11.9gb[2.7%], all indices on this node will be marked read-only
After debugging this problem, it seems the only solution is to add optional analysis parameters to my project configuration in GitLab CI, per this documentation, but I am unable to find a parameter that excludes certain types of files.
Are you using the model where you spin up a brand new instance of SonarQube for each analysis (not, BTW, what we recommend)?
Because your error indicates that the Elasticsearch process needs more heap memory. For a normal deployment, you can configure that in $SONARQUBE_HOME/conf/sonar.properties, and via environment variables for Docker.
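As a sketch, the relevant key in sonar.properties looks like this (the heap value is illustrative; size it to your host):

```properties
# $SONARQUBE_HOME/conf/sonar.properties -- illustrative heap size.
# Heap for the Elasticsearch (search) process:
sonar.search.javaOpts=-Xmx2g -Xms2g
```

For the official Docker image, the same setting can be passed as an environment variable instead, e.g. `-e SONAR_SEARCH_JAVAOPTS="-Xmx2g -Xms2g"` on `docker run`.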
I launched a SonarQube Docker container on my machine and I am using the configuration provided when creating a project with GitLab CI to scan my .NET projects. Projects that do not contain .js files are displayed successfully in the SonarQube UI, but projects with many .js files fail with this error. While checking the documentation I came across the analysis parameters, hoping to exclude the .js files from the scan, but I could not find a parameter that does this.
The last analysis failed because it would have caused your server-wide lines of code total to exceed your 500000 limit.
I’m confused. Is the complaint that background task processing fails with an OutOfMemoryError or that your license isn’t big enough?
Because the former can be addressed by increasing the memory allocation to the Compute Engine process. The latter by either bumping up your license size or excluding some files.
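For the first option, a sketch of raising the Compute Engine heap (the values are illustrative):

```properties
# $SONARQUBE_HOME/conf/sonar.properties -- heap for the Compute Engine
# process, which is what runs background report processing.
sonar.ce.javaOpts=-Xmx2g -Xms512m
```

With Docker, the equivalent is the `SONAR_CE_JAVAOPTS` environment variable. The second option corresponds to the `sonar.exclusions` analysis parameter.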
Please note that these logs are from the SonarQube UI, which means the excess .js files in my project are causing the OutOfMemoryError. That is why I am asking whether there is a solution along the lines you mentioned above, i.e. excluding these files when running the SonarScanner on GitLab CI. Can you please point me to a way to exclude these files from scanning during the pipeline?