SonarQube does not start after installing version 7.9.1 on CentOS

I am currently installing SonarQube 7.9.1 on CentOS.
Below are the logs I get after starting the service.
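For context, the instance is installed under /opt/sonarqube/sonarqube-7.9.1 (visible in the logs). The exact service unit is not included here, so the assumed equivalent start command with the bundled wrapper script is:

/opt/sonarqube/sonarqube-7.9.1/bin/linux-x86-64/sonar.sh start

The output below is from logs/sonar.log, where the wrapper and the app process write.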

2019.08.22 07:28:38 DEBUG app[][i.netty.util.NetUtil] /proc/sys/net/core/somaxconn: 128
2019.08.22 07:28:38 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.machineId: 0a:46:74:ff:fe:66:b1:18 (auto-detected)
2019.08.22 07:28:38 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
2019.08.22 07:28:38 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
2019.08.22 07:28:38 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
2019.08.22 07:28:38 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.targetRecords: 4
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 8
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 8
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
2019.08.22 07:28:38 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
2019.08.22 07:28:38 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.allocator.type: pooled
2019.08.22 07:28:38 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 0
2019.08.22 07:28:38 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
2019.08.22 07:28:38 INFO  app[][o.e.c.t.TransportClientNodesService] failed to get node info for {#transport#-1}{ax1OxC1ZQmWz1dGLaWuyBQ}{127.0.0.1}{127.0.0.1:9001}, disconnecting...
java.lang.IllegalStateException: Future got interrupted
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:60)
        at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:34)
        at org.elasticsearch.transport.ConnectionManager.internalOpenConnection(ConnectionManager.java:209)
        at org.elasticsearch.transport.ConnectionManager.openConnection(ConnectionManager.java:80)
        at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:367)
        at org.elasticsearch.client.transport.TransportClientNodesService$SimpleNodeSampler.doSample(TransportClientNodesService.java:411)
        at org.elasticsearch.client.transport.TransportClientNodesService$NodeSampler.sample(TransportClientNodesService.java:362)
        at org.elasticsearch.client.transport.TransportClientNodesService.addTransportAddresses(TransportClientNodesService.java:201)
        at org.elasticsearch.client.transport.TransportClient.addTransportAddress(TransportClient.java:342)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:108)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
Caused by: java.lang.InterruptedException: null
        at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1343)
        at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:251)
        at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:94)
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:57)
        ... 17 common frames omitted
2019.08.22 07:28:38 DEBUG app[][o.s.a.e.EsConnectorImpl] Connected to Elasticsearch node: [127.0.0.1:9001]
Thread, Wrapper-Shutdown-Hook, handling the shutdown process.
shutdownJVM(0) Thread:Wrapper-Shutdown-Hook
Send a packet STOPPED : 0
read a packet STOPPED : 0
JVM signalled that it was stopped.
Closing socket.
socket read no code (closed?).
server listening on port 32001.
Wrapper Manager: ShutdownHook complete
Send a packet START_PENDING : 5000
WrapperSimpleApp: start(args) end.  Main Completed=false, exitCode=null
WrapperListener.start runner thread stopped.
returned from WrapperListener.start()
Send a packet STARTED :
Server daemon shut down
JVM exited normally.
Signal trapped.  Details:
  signal number=17 (SIGCHLD), source="unknown"
Received SIGCHLD, checking JVM process status.
JVM process exited with a code of 0, leaving the wrapper exit code set to 0.
<-- Wrapper Stopped
Spawning intermediate process...
Spawning daemon process...
--> Wrapper Started as Daemon
Using tick timer.
server listening on port 32000.
Command[0] : java
Command[1] : -Dsonar.wrapped=true
Command[2] : -Djava.awt.headless=true
Command[3] : -Xms512m
Command[4] : -Xmx3072m
Command[5] : -Djava.library.path=./lib
Command[6] : -classpath
Command[7] : ../../lib/jsw/wrapper-3.2.3.jar:../../lib/common/lucene-misc-7.7.0.jar:../../lib/common/lucene-queries-7.7.0.jar:../../lib/common/lucene-queryparser-7.7.0.jar:../../lib/common/lucene-sandbox-7.7.0.jar:../../lib/common/lucene-spatial-7.7.0.jar:../../lib/common/lucene-spatial-extras-7.7.0.jar:../../lib/common/lucene-spatial3d-7.7.0.jar:../../lib/common/lucene-suggest-7.7.0.jar:../../lib/common/hppc-0.7.1.jar:../../lib/common/joda-time-2.10.1.jar:../../lib/common/t-digest-3.2.jar:../../lib/common/HdrHistogram-2.1.9.jar:../../lib/common/jna-4.5.1.jar:../../lib/common/netty-buffer-4.1.32.Final.jar:../../lib/common/netty-codec-4.1.32.Final.jar:../../lib/common/netty-codec-http-4.1.32.Final.jar:../../lib/common/netty-common-4.1.32.Final.jar:../../lib/common/netty-handler-4.1.32.Final.jar:../../lib/common/netty-resolver-4.1.32.Final.jar:../../lib/common/netty-transport-4.1.32.Final.jar:../../lib/common/commons-email-1.5.jar:../../lib/common/okhttp-3.14.2.jar:../../lib/common/commons-csv-1.4.jar:../../lib/common/sonar-classloader-1.0.jar:../../lib/common/httpcore-4.4.4.jar:../../lib/common/commons-logging-1.2.jar:../../lib/common/tomcat-annotations-api-8.5.38.jar:../../lib/common/commons-pool2-2.6.0.jar:../../lib/common/lz4-1.3.0.jar:../../lib/common/staxmate-2.0.1.jar:../../lib/common/woodstox-core-lgpl-4.4.0.jar:../../lib/common/stax2-api-3.1.4.jar:../../lib/common/jackson-databind-2.9.8.jar:../../lib/common/jackson-core-2.9.8.jar:../../lib/common/jackson-dataformat-smile-2.8.11.jar:../../lib/common/jackson-dataformat-yaml-2.8.11.jar:../../lib/common/jackson-dataformat-cbor-2.8.11.jar:../../lib/common/jopt-simple-5.0.2.jar:../../lib/common/javax.mail-1.5.6.jar:../../lib/common/okio-1.17.2.jar:../../lib/common/mybatis-3.5.1.jar:../../lib/common/sonar-check-api-7.9.1.jar:../../lib/common/jackson-annotations-2.9.8.jar:../../lib/common/activation-1.1.jar:../../lib/common/sonar-main-7.9.1.jar:../../lib/common/sonar-server-7.9.1.jar:../../lib/common/sonar-ce-7.9.1.jar:../../lib/common/sonar-ce-task-7.9.1.jar:../../lib/common/sonar-server-common-7.9.1.jar:../../lib/common/transport-6.8.0.jar:../../lib/common/sonar-ce-common-7.9.1.jar:../../lib/common/sonar-db-dao-7.9.1.jar:../../lib/common/sonar-db-migration-7.9.1.jar:../../lib/common/sonar-db-core-7.9.1.jar:../../lib/common/sonar-process-7.9.1.jar:../../lib/common/elasticsearch-6.8.0.jar:../../lib/common/transport-netty4-client-6.8.0.jar:../../lib/common/percolator-client-6.8.0.jar:../../lib/common/parent-join-client-6.8.0.jar:../../lib/common/sonar-scanner-protocol-7.9.1.jar:../../lib/common/sonar-core-7.9.1.jar:../../lib/common/sonar-ws-7.9.1.jar:../../lib/common/protobuf-java-3.7.0.jar:../../lib/common/nanohttpd-2.3.0.jar:../../lib/common/sonar-ce-task-projectanalysis-7.9.1.jar:../../lib/common/logback-classic-1.2.3.jar:../../lib/common/sonar-update-center-common-1.18.0.487.jar:../../lib/common/sonar-duplications-7.9.1.jar:../../lib/common/sonar-plugin-api-7.9.1-all.jar:../../lib/common/sonar-plugin-api-7.9.1.jar:../../lib/common/guava-18.0.jar:../../lib/common/hazelcast-client-3.8.6.jar:../../lib/common/hazelcast-3.8.6.jar:../../lib/common/log4j-to-slf4j-2.8.2.jar:../../lib/common/log4j-api-2.8.2.jar:../../lib/common/jul-to-slf4j-1.7.25.jar:../../lib/common/sonar-markdown-7.9.1.jar:../../lib/common/sonar-channel-4.1.jar:../../lib/common/slf4j-api-1.7.25.jar:../../lib/common/elasticsearch-x-content-6.8.0.jar:../../lib/common/snakeyaml-1.17.jar:../../lib/common/httpclient-4.5.2.jar:../../lib/common/commons-codec-1.12.jar:../../lib/common/
commons-io-2.6.jar:../../lib/common/commons-lang-2.6.jar:../../lib/common/gson-2.8.4.jar:../../lib/common/logback-access-1.2.3.jar:../../lib/common/logback-core-1.2.3.jar:../../lib/common/diffutils-1.2.jar:../../lib/common/commons-dbutils-1.5.jar:../../lib/common/jjwt-impl-0.10.5.jar:../../lib/common/jjwt-jackson-0.10.5.jar:../../lib/common/jjwt-api-0.10.5.jar:../../lib/common/jaxb-api-2.3.0.jar:../../lib/common/tomcat-embed-core-8.5.38.jar:../../lib/common/commons-dbcp2-2.5.0.jar:../../lib/common/picocontainer-2.15.jar:../../lib/common/jbcrypt-0.4.jar:../../lib/common/elasticsearch-cli-6.8.0.jar:../../lib/common/elasticsearch-core-6.8.0.jar:../../lib/common/elasticsearch-secure-sm-6.8.0.jar:../../lib/common/lucene-core-7.7.0.jar:../../lib/common/lucene-analyzers-common-7.7.0.jar:../../lib/common/lucene-backward-codecs-7.7.0.jar:../../lib/common/lucene-grouping-7.7.0.jar:../../lib/common/lucene-highlighter-7.7.0.jar:../../lib/common/lucene-join-7.7.0.jar:../../lib/common/lucene-memory-7.7.0.jar:../../lib/sonar-application-7.9.1.jar:../../lib/sonar-shutdowner-7.9.1.jar
Command[8] : -Dwrapper.key=kR6KM013beXDomK8
Command[9] : -Dwrapper.port=32000
Command[10] : -Dwrapper.jvm.port.min=31000
Command[11] : -Dwrapper.jvm.port.max=31999
Command[12] : -Dwrapper.debug=TRUE
Command[13] : -Dwrapper.pid=23894
Command[14] : -Dwrapper.version=3.2.3
Command[15] : -Dwrapper.native_library=wrapper
Command[16] : -Dwrapper.service=TRUE
Command[17] : -Dwrapper.cpu.timeout=10
Command[18] : -Dwrapper.jvmid=1
Command[19] : org.tanukisoftware.wrapper.WrapperSimpleApp
Command[20] : org.sonar.application.App
Launching a JVM...
WrapperManager class initialized by thread: main  Using classloader: jdk.internal.loader.ClassLoaders$AppClassLoader@799f7e29
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
  Copyright 1999-2006 Tanuki Software, Inc.  All Rights Reserved.

Wrapper Manager: JVM #1
Running a 64-bit JVM.
Wrapper Manager: Registering shutdown hook
Wrapper Manager: Using wrapper
Load native library.  One or more attempts may fail if platform specific libraries do not exist.
Loading native library failed: libwrapper-linux-x86-64.so  Cause: java.lang.UnsatisfiedLinkError: no wrapper-linux-x86-64 in java.library.path: [./lib]
Loaded native library: libwrapper.so
Calling native initialization method.
Inside native WrapperManager initialization method
Java Version   : 11.0.3+12-LTS Java HotSpot(TM) 64-Bit Server VM
Java VM Vendor : Oracle Corporation

Control event monitor thread started.
Startup runner thread started.
WrapperManager.start(org.tanukisoftware.wrapper.WrapperSimpleApp@a7e666, args[]) called by thread: main
Communications runner thread started.
Open socket to wrapper...Wrapper-Connection
Opened Socket from 31000 to 32000
Send a packet KEY : kR6KM013beXDomK8
handleSocket(Socket[addr=/127.0.0.1,port=32000,localport=31000])
accepted a socket from 127.0.0.1 on port 31000
read a packet KEY : kR6KM013beXDomK8
Got key from JVM: kR6KM013beXDomK8
send a packet LOW_LOG_LEVEL : 1
send a packet PING_TIMEOUT : 0
send a packet PROPERTIES : (Property Values)
Start Application.
send a packet START : start
Received a packet LOW_LOG_LEVEL : 1
Wrapper Manager: LowLogLevel from Wrapper is 1
Received a packet PING_TIMEOUT : 0
PingTimeout from Wrapper is 0
Received a packet PROPERTIES : (Property Values)
Received a packet START : start
calling WrapperListener.start()
Waiting for WrapperListener.start runner thread to complete.
WrapperListener.start runner thread started.
WrapperSimpleApp: start(args) Will wait up to 2 seconds for the main method to complete.
WrapperSimpleApp: invoking main method
2019.08.22 07:39:07 INFO  app[][o.s.a.AppFileSystem] Cleaning or creating temp directory /opt/sonarqube/sonarqube-7.9.1/temp
2019.08.22 07:39:07 INFO  app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2019.08.22 07:39:07 INFO  app[][o.s.a.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [/opt/sonarqube/sonarqube-7.9.1/elasticsearch]: /opt/sonarqube/sonarqube-7.9.1/elasticsearch/bin/elasticsearch
2019.08.22 07:39:07 INFO  app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
2019.08.22 07:39:07 INFO  app[][o.e.p.PluginsService] no modules loaded
2019.08.22 07:39:07 INFO  app[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [force_merge], size [1], queue size [unbounded]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_started], core [1], max [8], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [listener], size [2], queue size [unbounded]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [index], size [4], queue size [200]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [refresh], core [1], max [2], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [generic], core [4], max [128], keep alive [30s]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [warmer], core [1], max [2], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search] will adjust queue by [50] when determining automatic queue size
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search], size [7], queue size [1k]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [flush], core [1], max [2], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_store], core [1], max [8], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [management], core [1], max [5], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [get], size [4], queue size [1k]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [analyze], size [1], queue size [16]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [write], size [4], queue size [200]
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [snapshot], core [1], max [2], keep alive [5m]
2019.08.22 07:39:07 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search_throttled] will adjust queue by [50] when determining automatic queue size
2019.08.22 07:39:07 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search_throttled], size [1], queue size [100]
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Error occurred during initialization of VM
Initial heap size set to a larger value than the maximum heap size
2019.08.22 07:39:07 WARN  app[][o.s.a.p.AbstractManagedProcess] Process exited with exit value [es]: 1
2019.08.22 07:39:07 INFO  app[][o.s.a.SchedulerImpl] Process[es] is stopped
2019.08.22 07:39:07 INFO  app[][o.s.a.SchedulerImpl] SonarQube is stopped
Wrapper Manager: ShutdownHook started
2019.08.22 07:39:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [ce]...
WrapperManager.stop(0) called by thread: Wrapper-Shutdown-Hook
Send a packet STOP : 0
2019.08.22 07:39:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [web]...
2019.08.22 07:39:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [es]...
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] -Dio.netty.noUnsafe: false
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] Java version: 11
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Buffer.address: available
Startup runner thread stopped.
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
        at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
        at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:224)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:218)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
        at io.netty.util.ConstantPool.<init>(ConstantPool.java:32)
        at io.netty.util.AttributeKey$1.<init>(AttributeKey.java:27)
        at io.netty.util.AttributeKey.<clinit>(AttributeKey.java:27)
        at org.elasticsearch.transport.netty4.Netty4Transport.<clinit>(Netty4Transport.java:219)
        at org.elasticsearch.transport.Netty4Plugin.getSettings(Netty4Plugin.java:57)
        at org.elasticsearch.plugins.PluginsService.lambda$getPluginSettings$0(PluginsService.java:89)
        at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:271)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
        at org.elasticsearch.plugins.PluginsService.getPluginSettings(PluginsService.java:89)
        at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:147)
        at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:277)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:103)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Bits.unaligned: available, true
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @13deb50e
        at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:361)
        at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:591)
        at java.base/java.lang.reflect.Method.invoke(Method.java:558)
        at io.netty.util.internal.PlatformDependent0$6.run(PlatformDependent0.java:334)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:325)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
        at io.netty.util.ConstantPool.<init>(ConstantPool.java:32)
        at io.netty.util.AttributeKey$1.<init>(AttributeKey.java:27)
        at io.netty.util.AttributeKey.<clinit>(AttributeKey.java:27)
        at org.elasticsearch.transport.netty4.Netty4Transport.<clinit>(Netty4Transport.java:219)
        at org.elasticsearch.transport.Netty4Plugin.getSettings(Netty4Plugin.java:57)
        at org.elasticsearch.plugins.PluginsService.lambda$getPluginSettings$0(PluginsService.java:89)
        at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:271)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
        at org.elasticsearch.plugins.PluginsService.getPluginSettings(PluginsService.java:89)
        at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:147)
        at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:277)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:103)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): unavailable
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] sun.misc.Unsafe: available
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] maxDirectMemory: 3221225472 bytes (maybe)
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.maxDirectMemory: -1 bytes
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.CleanerJava9] java.nio.ByteBuffer.cleaner(): available
2019.08.22 07:39:07 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.noPreferDirect: false
read a packet STOP : 0
JVM requested a shutdown. (0)
wrapperStopProcess(0) called.
Sending stop signal to JVM
send a packet STOP : NULL
Send a packet START_PENDING : 5000
read a packet START_PENDING : 5000
JVM signalled a start pending with waitHint of 5000 millis.
2019.08.22 07:39:08 DEBUG app[][o.e.c.t.TransportClientNodesService] node_sampler_interval[5s]
2019.08.22 07:39:08 DEBUG app[][i.n.c.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 8
2019.08.22 07:39:08 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.noKeySetOptimization: false
2019.08.22 07:39:08 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
2019.08.22 07:39:08 DEBUG app[][i.n.u.i.PlatformDependent] org.jctools-core.MpscChunkedArrayQueue: available
2019.08.22 07:39:08 DEBUG app[][o.e.c.t.TransportClientNodesService] adding address [{#transport#-1}{c4qhrVqEQnKsSGa1RHH0hw}{127.0.0.1}{127.0.0.1:9001}]
2019.08.22 07:39:08 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.processId: 23896 (auto-detected)
2019.08.22 07:39:08 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv4Stack: false
2019.08.22 07:39:08 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv6Addresses: false
2019.08.22 07:39:08 DEBUG app[][i.netty.util.NetUtil] Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
2019.08.22 07:39:08 DEBUG app[][i.netty.util.NetUtil] /proc/sys/net/core/somaxconn: 128
2019.08.22 07:39:08 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.machineId: 0a:46:74:ff:fe:66:b1:18 (auto-detected)
2019.08.22 07:39:08 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
2019.08.22 07:39:08 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
2019.08.22 07:39:08 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
2019.08.22 07:39:08 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.targetRecords: 4
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 8
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 8
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
2019.08.22 07:39:08 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
2019.08.22 07:39:08 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.allocator.type: pooled
2019.08.22 07:39:08 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 0
2019.08.22 07:39:08 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
2019.08.22 07:39:08 INFO  app[][o.e.c.t.TransportClientNodesService] failed to get node info for {#transport#-1}{c4qhrVqEQnKsSGa1RHH0hw}{127.0.0.1}{127.0.0.1:9001}, disconnecting...
java.lang.IllegalStateException: Future got interrupted
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:60)
        at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:34)
        at org.elasticsearch.transport.ConnectionManager.internalOpenConnection(ConnectionManager.java:209)
        at org.elasticsearch.transport.ConnectionManager.openConnection(ConnectionManager.java:80)
        at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:367)
        at org.elasticsearch.client.transport.TransportClientNodesService$SimpleNodeSampler.doSample(TransportClientNodesService.java:411)
        at org.elasticsearch.client.transport.TransportClientNodesService$NodeSampler.sample(TransportClientNodesService.java:362)
        at org.elasticsearch.client.transport.TransportClientNodesService.addTransportAddresses(TransportClientNodesService.java:201)
        at org.elasticsearch.client.transport.TransportClient.addTransportAddress(TransportClient.java:342)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:108)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
Caused by: java.lang.InterruptedException: null
        at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1343)
        at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:251)
        at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:94)
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:57)
        ... 17 common frames omitted
2019.08.22 07:39:08 DEBUG app[][o.s.a.e.EsConnectorImpl] Connected to Elasticsearch node: [127.0.0.1:9001]
Thread, Wrapper-Shutdown-Hook, handling the shutdown process.
shutdownJVM(0) Thread:Wrapper-Shutdown-Hook
Send a packet STOPPED : 0
read a packet STOPPED : 0
JVM signalled that it was stopped.
Closing socket.

Please suggest what changes I need to make to get the application started.
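The Elasticsearch child process exits with "Error occurred during initialization of VM / Initial heap size set to a larger value than the maximum heap size". For reference, that process's heap is configured through sonar.search.javaOpts in conf/sonar.properties; a default-style value (shown only as a sketch for comparison, not a copy of my actual file) is:

sonar.search.javaOpts=-Xms512m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError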

Hi,

Welcome to the community!

It’s not clear which log we’re looking at here; there seems to be some non-SonarQube stuff going on. Can you check your server logs and see if you find errors that are more focused?
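For reference (paths assumed from the install location visible in your log), the per-process server logs are:

/opt/sonarqube/sonarqube-7.9.1/logs/es.log
/opt/sonarqube/sonarqube-7.9.1/logs/web.log
/opt/sonarqube/sonarqube-7.9.1/logs/ce.log
/opt/sonarqube/sonarqube-7.9.1/logs/sonar.log

Errors in es.log and web.log are usually more focused than the wrapper output.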

 
Ann

Hi Ann,
These are all of the SonarQube logs, captured with debug logging enabled; the setting used for that is shown below for reference.
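Debug logging was turned on via the standard property in conf/sonar.properties (shown here as a reference snippet, not a full copy of the file):

sonar.log.level=DEBUG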

Another set of logs, from a Windows instance, is below:

Wrapper Started as Console
Launching a JVM...
WrapperManager class initialized by thread: main Using classloader: jdk.internal.loader.ClassLoaders$AppClassLoader@2cdf8d8a
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.

Wrapper Manager: JVM #1
Running a 64-bit JVM.
Wrapper Manager: Registering shutdown hook
Wrapper Manager: Using wrapper
Load native library. One or more attempts may fail if platform specific libraries do not exist.
Loading native library failed: wrapper-windows-x86-64.dll Cause: java.lang.UnsatisfiedLinkError: no wrapper-windows-x86-64 in java.library.path: [./lib]
Loaded native library: wrapper.dll
Calling native initialization method.
Initializing WrapperManager native library.
Java Executable: C:\Program Files\Java\jdk-11.0.4\bin\Java.exe
Windows version: 10.0.14393
Java Version : 11.0.4+10-LTS Java HotSpot(TM) 64-Bit Server VM
Java VM Vendor : Oracle Corporation

Control event monitor thread started.
Startup runner thread started.
WrapperManager.start(org.tanukisoftware.wrapper.WrapperSimpleApp@64729b1e, args) called by thread: main
Communications runner thread started.
Open socket to wrapper...Wrapper-Connection
Opened Socket from 31000 to 32000
Send a packet KEY : jKcKiW1ls0G0AZvf
handleSocket(Socket[addr=/127.0.0.1,port=32000,localport=31000])
Received a packet LOW_LOG_LEVEL : 1
Wrapper Manager: LowLogLevel from Wrapper is 1
Received a packet PING_TIMEOUT : 0
PingTimeout from Wrapper is 0
Received a packet PROPERTIES : (Property Values)
Received a packet START : start
calling WrapperListener.start()
Waiting for WrapperListener.start runner thread to complete.
WrapperListener.start runner thread started.
WrapperSimpleApp: start(args) Will wait up to 2 seconds for the main method to complete.
WrapperSimpleApp: invoking main method
2019.08.22 18:02:42 INFO app[o.s.a.AppFileSystem] Cleaning or creating temp directory D:\SonarQube\sonarqube-7.9.1\temp
2019.08.22 18:02:42 INFO app[o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2019.08.22 18:02:42 INFO app[o.s.a.ProcessLauncherImpl] Launch process[[key=‘es’, ipcIndex=1, logFilenamePrefix=es]] from [D:\SonarQube\sonarqube-7.9.1\elasticsearch]: C:\Program Files\Java\jdk-11.0.4\bin\java -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.io.tmpdir=D:\SonarQube\sonarqube-7.9.1\temp -XX:ErrorFile=…/logs/es_hs_err_pid%p.log -Des.enforce.bootstrap.checks=true -Xms512m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -Delasticsearch -Des.path.home=D:\SonarQube\sonarqube-7.9.1\elasticsearch -Des.path.conf=D:\SonarQube\sonarqube-7.9.1\temp\conf\es -cp lib/* org.elasticsearch.bootstrap.Elasticsearch
2019.08.22 18:02:42 INFO app[o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
2019.08.22 18:02:43 INFO app[o.e.p.PluginsService] no modules loaded
2019.08.22 18:02:43 INFO app[o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
Send a packet START_PENDING : 5000
Send a packet START_PENDING : 5000
WrapperSimpleApp: start(args) end. Main Completed=false, exitCode=null
WrapperListener.start runner thread stopped.
returned from WrapperListener.start()
Send a packet STARTED :
Startup runner thread stopped.
Received a packet PING : ping
Send a packet PING : ok
Received a packet PING : ping
Send a packet PING : ok
2019.08.22 18:02:56 INFO app[o.s.a.SchedulerImpl] Process[es] is up
2019.08.22 18:02:56 INFO app[o.s.a.ProcessLauncherImpl] Launch process[[key=‘web’, ipcIndex=2, logFilenamePrefix=web]] from [D:\SonarQube\sonarqube-7.9.1]: C:\Program Files\Java\jdk-11.0.4\bin\java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djava.io.tmpdir=D:\SonarQube\sonarqube-7.9.1\temp --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.rmi/sun.rmi.transport=ALL-UNNAMED -Xms512m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -Dhttp.nonProxyHosts=localhost|127.|[::1] -cp ./lib/common/;D:\SonarQube\sonarqube-7.9.1\lib\jdbc\postgresql\postgresql-42.2.5.jar org.sonar.server.app.WebServer D:\SonarQube\sonarqube-7.9.1\temp\sq-process3612210359605852939properties
2019.08.22 18:02:58 INFO app[o.s.a.SchedulerImpl] Process[web] is stopped
2019.08.22 18:02:58 INFO app[o.s.a.SchedulerImpl] Process[es] is stopped
2019.08.22 18:02:58 INFO app[o.s.a.SchedulerImpl] SonarQube is stopped
Wrapper Manager: ShutdownHook started
WrapperManager.stop(0) called by thread: Wrapper-Shutdown-Hook
Send a packet STOP : 0
Received a packet STOP :
Thread, Wrapper-Shutdown-Hook, handling the shutdown process.
calling listener.stop()
WrapperSimpleApp: stop(0)
returned from listener.stop() -> 0
shutdownJVM(0) Thread:Wrapper-Shutdown-Hook
Send a packet STOPPED : 0
Closing socket.
Wrapper Manager: ShutdownHook complete
Server daemon shut down
<-- Wrapper Stopped