Slowness of SonarQube Analysis on LTS Community Edition 8.9.8

Hi, we are facing a slowness issue with our new SonarQube server running the latest LTS Community version 8.9.8 with PostgreSQL 12.12. The analysis duration has roughly doubled compared to our previous server on Community version 7.9 with PostgreSQL 12.4.

We’ve followed the recommendations about SonarQube memory tuning on this page.

But nothing improved.

The project we are using for this test has 1.6M lines of code; it is a Gradle project using the scanner plugin "org.sonarqube" version "3.2.0".

Here are our VM and Java specifications:

  • 8GB RAM
  • 2 vCPUs
  • sonar.search.javaOpts=-Xmx3G -Xms3G
  • sonar.ce.javaOpts=-Xmx3G -Xms3G
  • sonar.web.javaOpts=-Xmx2G
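
For reference, these values are set in conf/sonar.properties (the path assumes the default installation layout under /opt/sonarqube, which matches the data path seen in the logs below):

  # as currently set in conf/sonar.properties
  sonar.web.javaOpts=-Xmx2G
  sonar.ce.javaOpts=-Xmx3G -Xms3G
  sonar.search.javaOpts=-Xmx3G -Xms3G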

Can you please help and tell us what we might be missing?
Thank you.

Have you done a VACUUM lately?

https://community.sonarsource.com/t/sonarqube-becomes-slow-after-postgresql-upgrade/37925/3

Colin, it’s a fresh install, and yes, a VACUUM was done, but there was no change.
Here are some logs:

2022.09.23 20:19:10 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Extract report | status=SUCCESS | time=4364ms
2022.09.23 20:19:11 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist scanner context | status=SUCCESS | time=714ms
2022.09.23 20:19:12 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Propagate analysis warnings from scanner report | status=SUCCESS | time=503ms
2022.09.23 20:19:12 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Generate analysis UUID | status=SUCCESS | time=0ms
2022.09.23 20:19:13 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load analysis metadata | status=SUCCESS | time=1293ms
2022.09.23 20:19:13 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Initialize | status=SUCCESS | time=0ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Build tree of components | components=4909 | status=SUCCESS | time=2658ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Validate project | status=SUCCESS | time=7ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load quality profiles | status=SUCCESS | time=319ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load Quality gate | status=SUCCESS | time=14ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load new code period | status=SUCCESS | time=7ms
2022.09.23 20:19:16 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Detect file moves | reportFiles=4239 | dbFiles=4239 | addedFiles=0 | status=SUCCESS | time=57ms
2022.09.23 20:19:17 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load duplications | duplications=13311 | status=SUCCESS | time=1439ms
2022.09.23 20:19:17 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute cross project duplications | status=SUCCESS | time=0ms
2022.09.23 20:19:19 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute size measures | status=SUCCESS | time=1592ms
2022.09.23 20:19:33 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute new coverage | status=SUCCESS | time=13817ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute coverage measures | status=SUCCESS | time=852ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute comment measures | status=SUCCESS | time=38ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Copy custom measures | status=SUCCESS | time=10ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute duplication measures | status=SUCCESS | time=80ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute size measures on new code | status=SUCCESS | time=63ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute language distribution | status=SUCCESS | time=74ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute test measures | status=SUCCESS | time=24ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute complexity measures | status=SUCCESS | time=48ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Load measure computers | status=SUCCESS | time=2ms
2022.09.23 20:19:34 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute Quality Profile status | status=SUCCESS | time=24ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Execute component visitors | status=SUCCESS | time=15700ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Checks executed after computation of measures | status=SUCCESS | time=0ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute Quality Gate measures | status=SUCCESS | time=28ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Compute Quality profile measures | status=SUCCESS | time=13ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Generate Quality profile events | status=SUCCESS | time=2ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Generate Quality gate events | status=SUCCESS | time=4ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Check upgrade possibility for not analyzed code files. | status=SUCCESS | time=0ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist components | status=SUCCESS | time=123ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist analysis | status=SUCCESS | time=5ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist analysis properties | status=SUCCESS | time=7ms
2022.09.23 20:19:50 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist measures | inserts=52 | status=SUCCESS | time=45ms
2022.09.23 20:20:19 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist live measures | insertsOrUpdates=213893 | status=SUCCESS | time=28696ms
2022.09.23 20:20:20 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist duplication data | insertsOrUpdates=1818 | status=SUCCESS | time=1106ms
2022.09.23 20:20:20 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist new ad hoc Rules | status=SUCCESS | time=1ms
2022.09.23 20:20:20 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist issues | cacheSize=1 MB | inserts=1 | updates=473 | merged=0 | status=SUCCESS | time=531ms
2022.09.23 20:20:20 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist project links | status=SUCCESS | time=7ms
2022.09.23 20:20:20 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist events | status=SUCCESS | time=17ms
2022.09.23 20:20:42 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist sources | status=SUCCESS | time=21759ms
2022.09.23 20:20:42 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Persist cross project duplications | status=SUCCESS | time=0ms
2022.09.23 20:20:42 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Enable analysis | status=SUCCESS | time=20ms
2022.09.23 20:20:42 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Update last usage date of quality profiles | status=SUCCESS | time=6ms
2022.09.23 20:20:42 INFO  ce[AYNsAOoVzOKwZGL_kd1D][o.s.c.t.s.ComputationStepExecutor] Purge db | status=SUCCESS | time=199ms
2022.09.23 20:21:34 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [5340ms] which is above the warn threshold of [5000ms]
2022.09.23 20:22:17 WARN  es[][o.e.t.ThreadPool] execution of [org.elasticsearch.indices.IndicesService$CacheCleaner@74c64b4d] took [41902ms] which is above the warn threshold of [5000ms]
2022.09.23 20:23:05 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.4s/5453ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:23:05 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.4s/5452517932ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:23:05 WARN  es[][o.e.h.AbstractHttpServerTransport] handling request [null][POST][/_bulk?timeout=1m][Netty4HttpChannel{localAddress=/127.0.0.1:9001, remoteAddress=/127.0.0.1:37610}] took [5452ms] which is above the warn threshold of [5000ms]
2022.09.23 20:24:45 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [18452ms] which is above the warn threshold of [5000ms]
2022.09.23 20:24:44 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.7s/6702ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:24:48 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.7s/6701092611ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:24:51 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16245ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:24:56 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16245460920ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:24:58 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.5s/6549ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:01 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.5s/6549378660ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:05 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.4s/6455ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:08 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.4s/6454787368ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:11 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.5s/6598ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:14 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.5s/6598199376ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:19 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7.5s/7595ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:24 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7.5s/7595124493ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:25 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [20648ms] which is above the warn threshold of [5000ms]
2022.09.23 20:25:26 WARN  es[][o.e.m.f.FsHealthService] health check of [/opt/sonarqube/data/es7/nodes/0] took [14193ms] which is above the warn threshold of [5s]
2022.09.23 20:25:29 WARN  es[][o.e.t.ThreadPool] timer thread slept for [8.1s/8160ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:32 WARN  es[][o.e.t.ThreadPool] timer thread slept for [8.1s/8159246471ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:52 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7.1s/7152ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:25:55 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7.1s/7152529846ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:01 WARN  es[][o.e.t.ThreadPool] timer thread slept for [26.2s/26297ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:08 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [33449ms] which is above the warn threshold of [5000ms]
2022.09.23 20:26:27 WARN  es[][o.e.t.ThreadPool] timer thread slept for [26.2s/26296709489ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:33 WARN  es[][o.e.t.ThreadPool] timer thread slept for [31.4s/31407ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:41 WARN  es[][o.e.t.ThreadPool] timer thread slept for [31.4s/31406992345ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:48 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16237ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:26:55 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16236898920ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:27:27 WARN  es[][o.e.t.ThreadPool] timer thread slept for [38.6s/38608ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:27:38 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [54845ms] which is above the warn threshold of [5000ms]
2022.09.23 20:27:40 WARN  es[][o.e.t.ThreadPool] timer thread slept for [38.6s/38608437063ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:27:52 WARN  es[][o.e.t.ThreadPool] timer thread slept for [25s/25046ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:27:59 WARN  es[][o.e.t.ThreadPool] timer thread slept for [25s/25045258398ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:29:56 WARN  es[][o.e.t.ThreadPool] timer thread slept for [2m/124546ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:33:48 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [124546ms] which is above the warn threshold of [5000ms]
2022.09.23 20:33:56 WARN  es[][o.e.t.ThreadPool] timer thread slept for [2m/124546416037ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:37:02 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7m/425551ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:37:20 WARN  es[][o.e.t.ThreadPool] timer thread slept for [7m/425362632849ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:37:29 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27s/27051ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:37:57 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27.2s/27239239459ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:38:10 WARN  es[][o.e.t.ThreadPool] timer thread slept for [39.9s/39902ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:38:30 WARN  es[][o.e.t.ThreadPool] timer thread slept for [39.9s/39902252507ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:38:36 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [39902ms] which is above the warn threshold of [5000ms]
2022.09.23 20:38:40 WARN  es[][o.e.t.ThreadPool] timer thread slept for [29.8s/29893ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:38:51 WARN  es[][o.e.t.ThreadPool] timer thread slept for [29.8s/29893085865ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:39:07 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27.5s/27500ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:39:20 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27.4s/27499565734ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:39:34 WARN  es[][o.e.t.ThreadPool] timer thread slept for [26.5s/26553ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:41:12 WARN  es[][o.e.t.ThreadPool] timer thread slept for [26.5s/26553757892ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:41:20 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.7m/106050ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:43:28 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.7m/106049266813ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:44:46 WARN  es[][o.e.t.ThreadPool] timer thread slept for [3.4m/206005ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:45:09 WARN  es[][o.e.t.ThreadPool] timer thread slept for [3.4m/206005363879ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:46:36 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.8m/109495ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:47:37 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.8m/109495032247ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:47:35 WARN  es[][o.e.m.f.FsHealthService] health check of [/opt/sonarqube/data/es7/nodes/0] took [315501ms] which is above the warn threshold of [5s]
2022.09.23 20:52:21 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.7m/342251ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:57:16 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.7m/342251066555ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:57:19 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5m/303006ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:57:50 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5m/303005765014ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:57:53 WARN  es[][o.e.t.ThreadPool] timer thread slept for [32.4s/32443ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:57:54 WARN  es[][o.e.t.ThreadPool] timer thread slept for [32.4s/32443411453ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:58:04 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [45139ms] which is above the warn threshold of [5000ms]
2022.09.23 20:58:46 WARN  es[][o.e.t.ThreadPool] timer thread slept for [15.9s/15900ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:58:47 WARN  es[][o.e.t.ThreadPool] timer thread slept for [15.8s/15899980487ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:58:50 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.2s/6254ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:59:01 WARN  es[][o.e.t.ThreadPool] timer thread slept for [6.2s/6254329211ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 20:59:52 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1m/62901ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 20:59:52 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [26391ms] which is above the warn threshold of [5000ms]
2022.09.23 21:00:14 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1m/62901530284ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:01:46 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.3m/79200ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:02:03 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.3m/79199576839ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:02:22 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.1m/68596ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:03:06 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.1m/68595703233ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:03:21 WARN  es[][o.e.t.ThreadPool] timer thread slept for [59.1s/59101ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:03:12 WARN  es[][o.e.m.f.FsHealthService] health check of [/opt/sonarqube/data/es7/nodes/0] took [68596ms] which is above the warn threshold of [5s]
2022.09.23 21:03:34 WARN  es[][o.e.t.ThreadPool] timer thread slept for [59.1s/59100893379ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:03:37 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [127696ms] which is above the warn threshold of [5000ms]
2022.09.23 21:03:48 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27.6s/27608ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:03:49 WARN  es[][o.e.t.ThreadPool] timer thread slept for [27.6s/27608849708ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:04:04 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16292ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:04:27 WARN  es[][o.e.t.ThreadPool] timer thread slept for [16.2s/16291044780ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:05:02 WARN  es[][o.e.t.ThreadPool] timer thread slept for [57.3s/57399ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:05:15 WARN  es[][o.e.t.ThreadPool] timer thread slept for [57.3s/57399485219ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:08:40 WARN  es[][o.e.t.ThreadPool] timer thread slept for [34.6s/34657ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:09:01 WARN  es[][o.e.t.ThreadPool] timer thread slept for [34.6s/34657466697ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:10:48 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.1m/310854ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:13:36 WARN  es[][o.e.t.ThreadPool] timer thread slept for [5.1m/310853647598ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:14:51 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [310853ms] which is above the warn threshold of [5000ms]
2022.09.23 21:14:55 WARN  es[][o.e.t.ThreadPool] timer thread slept for [4.1m/246490ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:16:20 WARN  es[][o.e.t.ThreadPool] timer thread slept for [4.1m/246489387004ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:16:49 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.9m/114863ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:20:32 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.9m/114863141357ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:20:55 WARN  es[][o.e.t.ThreadPool] timer thread slept for [4.1m/246387ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:21:01 WARN  es[][o.e.t.ThreadPool] timer thread slept for [4.1m/246387045607ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:21:07 WARN  es[][o.e.t.ThreadPool] timer thread slept for [8s/8047ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:21:19 WARN  es[][o.e.t.ThreadPool] timer thread slept for [8s/8047481885ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:21:52 WARN  es[][o.e.t.ThreadPool] timer thread slept for [35.9s/35913ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:21:58 WARN  es[][o.e.m.f.FsHealthService] health check of [/opt/sonarqube/data/es7/nodes/0] took [43961ms] which is above the warn threshold of [5s]
2022.09.23 21:22:13 WARN  es[][o.e.t.ThreadPool] timer thread slept for [35.9s/35913171608ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:23:38 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.9m/119690ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:24:08 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1.9m/119689394256ns] on relative clock which is above the warn threshold of [5000ms]
2022.09.23 21:24:42 WARN  es[][o.e.t.ThreadPool] timer thread slept for [1m/63021ms] on absolute clock which is above the warn threshold of [5000ms]
2022.09.23 21:24:35 WARN  es[][o.e.t.ThreadPool] execution of [ReschedulingRunnable{runnable=org.elasticsearch.monitor.jvm.JvmGcMonitorService$1@56f6cbec, interval=1s}] took [119689ms] which is above the warn threshold of [5000ms]
2022.09.23 21:31:57 WARN  web[][o.a.c.d.BasicDataSource] An internal object pool swallowed an Exception.

Nothing here looks… gravely slow for 1.6 million lines of code, at least as far as interactions with the DB server are concerned.

Right after this step, it appears that indexing activities start.

I suspect that something is happening on your application server (the server running SonarQube). In fact, I think it’s choking because no memory is left for the operating system.

3 GB + 3 GB + 2 GB = 8 GB of heap… so there is literally nothing left for your system.

I would recommend lowering the RAM settings for the different JVMs, or bumping up the RAM on the machine.
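
As a rough sketch only (these exact numbers are my assumption, not official guidance, and the right split depends on your workload), settings along these lines would keep total heap at about 5 GB and leave roughly 3 GB for the operating system and filesystem cache on an 8 GB host:

  # Example values only (assumption, not official guidance)
  sonar.web.javaOpts=-Xmx1G
  sonar.ce.javaOpts=-Xmx2G -Xms2G
  sonar.search.javaOpts=-Xmx2G -Xms2G

The Elasticsearch warnings in your log (GC monitor runs and filesystem health checks taking minutes instead of milliseconds) are the kind of symptom you typically see when the node is starved for memory or swapping, so freeing headroom for the OS is the first thing I would try before adding RAM.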