Hi,
While restarting SonarQube 7.7, I am again getting the following error. Can you suggest what is wrong?
2019.06.18 16:41:54 INFO app[][o.s.a.AppFileSystem] Cleaning or creating temp directory /utxsonar/app/sonarqube-7.7/temp
2019.06.18 16:41:54 INFO app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2019.06.18 16:41:54 INFO app[][o.s.a.p.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [/utxsonar/app/sonarqube-7.7/elasticsearch]: /utxsonar/app/sonarqube-7.7/elasticsearch/bin/elasticsearch
2019.06.18 16:41:54 INFO app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
2019.06.18 16:41:54 INFO app[][o.e.p.PluginsService] no modules loaded
2019.06.18 16:41:54 INFO app[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [force_merge], size [1], queue size [unbounded]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_started], core [1], max [16], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [listener], size [4], queue size [unbounded]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [index], size [8], queue size [200]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [refresh], core [1], max [4], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [generic], core [4], max [128], keep alive [30s]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [warmer], core [1], max [4], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search] will adjust queue by [50] when determining automatic queue size
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search], size [13], queue size [1k]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [flush], core [1], max [4], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_store], core [1], max [16], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [management], core [1], max [5], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [get], size [8], queue size [1k]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [analyze], size [1], queue size [16]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [write], size [8], queue size [200]
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [snapshot], core [1], max [4], keep alive [5m]
2019.06.18 16:41:54 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search_throttled] will adjust queue by [50] when determining automatic queue size
2019.06.18 16:41:54 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search_throttled], size [1], queue size [100]
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] -Dio.netty.noUnsafe: false
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] Java version: 8
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Buffer.address: available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] direct buffer constructor: available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Bits.unaligned: available, true
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] sun.misc.Unsafe: available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.maxDirectMemory: 29884416 bytes
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.CleanerJava6] java.nio.ByteBuffer.cleaner(): available
2019.06.18 16:41:54 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.noPreferDirect: false
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Module execution: 69ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] TypeListeners creation: 3ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Scopes creation: 4ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Converters creation: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Binding creation: 3ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Private environment creation: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Injector construction: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Binding initialization: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Binding indexing: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Collecting injection requests: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Binding validation: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Static validation: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Instance member validation: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Provider verification: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Static member injection: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Instance injection: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.i.i.Stopwatch] Preloading singletons: 0ms
2019.06.18 16:41:55 DEBUG app[][o.e.c.t.TransportClientNodesService] node_sampler_interval[5s]
2019.06.18 16:41:55 DEBUG app[][i.n.c.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 16
2019.06.18 16:41:55 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.noKeySetOptimization: false
2019.06.18 16:41:55 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
2019.06.18 16:41:55 DEBUG app[][i.n.u.i.PlatformDependent] org.jctools-core.MpscChunkedArrayQueue: available
2019.06.18 16:41:56 DEBUG app[][o.e.c.t.TransportClientNodesService] adding address [{#transport#-1}{lYNybPrLRA2v4w72cloQZg}{127.0.0.1}{127.0.0.1:9001}]
2019.06.18 16:41:56 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.processId: 27638 (auto-detected)
2019.06.18 16:41:56 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv4Stack: false
2019.06.18 16:41:56 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv6Addresses: false
2019.06.18 16:41:56 DEBUG app[][i.netty.util.NetUtil] Loopback interface: lo (lo, 127.0.0.1)
2019.06.18 16:41:56 DEBUG app[][i.netty.util.NetUtil] /proc/sys/net/core/somaxconn: 128
2019.06.18 16:41:56 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.machineId: 00:50:56:ff:fe:bf:73:23 (auto-detected)
2019.06.18 16:41:56 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
2019.06.18 16:41:56 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
2019.06.18 16:41:56 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
2019.06.18 16:41:56 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.targetRecords: 4
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 0
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 0
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
2019.06.18 16:41:56 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
2019.06.18 16:41:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.allocator.type: pooled
2019.06.18 16:41:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 0
2019.06.18 16:41:56 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
2019.06.18 16:41:56 DEBUG app[][o.e.c.t.TransportClientNodesService] failed to connect to node [{#transport#-1}{lYNybPrLRA2v4w72cloQZg}{127.0.0.1}{127.0.0.1:9001}], ignoring...
org.elasticsearch.transport.ConnectTransportException: [][127.0.0.1:9001] connect_exception
at org.elasticsearch.transport.TcpTransport$ChannelsConnectedListener.onFailure(TcpTransport.java:1570)
at org.elasticsearch.action.ActionListener.lambda$toBiConsumer$2(ActionListener.java:99)
at org.elasticsearch.common.concurrent.CompletableContext.lambda$addListener$0(CompletableContext.java:42)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
at org.elasticsearch.common.concurrent.CompletableContext.completeExceptionally(CompletableContext.java:57)
at org.elasticsearch.transport.netty4.Netty4TcpChannel.lambda$new$1(Netty4TcpChannel.java:72)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:511)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:504)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:483)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:424)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:121)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:327)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:343)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:591)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:508)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:470)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
at java.lang.Thread.run(Thread.java:748)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /127.0.0.1:9001
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
... 6 common frames omitted
Caused by: java.net.ConnectException: Connection refused
... 10 common frames omitted
2019.06.18 16:41:56 DEBUG app[][o.s.a.e.EsConnectorImpl] Connected to Elasticsearch node: [127.0.0.1:9001]
2019.06.18 16:42:00 DEBUG app[][i.n.b.AbstractByteBuf] -Dio.netty.buffer.checkAccessible: true
2019.06.18 16:42:00 DEBUG app[][i.n.b.AbstractByteBuf] -Dio.netty.buffer.checkBounds: true
2019.06.18 16:42:00 DEBUG app[][i.n.u.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@4d926874
2019.06.18 16:42:00 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxCapacityPerThread: 4096
2019.06.18 16:42:00 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxSharedCapacityFactor: 2
2019.06.18 16:42:00 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.linkCapacity: 16
2019.06.18 16:42:00 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.ratio: 8
2019.06.18 16:42:01 DEBUG app[][o.e.t.ConnectionManager] connected to node [{sonarqube}{IQ_2bVpMRrevvPw2bxZiBg}{hrtzPKPfQVWTw0-HNGB3OQ}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}]
2019.06.18 16:42:03 INFO app[][o.s.a.SchedulerImpl] Process[es] is up
2019.06.18 16:42:03 INFO app[][o.s.a.p.ProcessLauncherImpl] Launch process[[key='web', ipcIndex=2, logFilenamePrefix=web]] from [/utxsonar/app/sonarqube-7.7]: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.212.b04-0.el6_10.x86_64/jre/bin/java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djava.io.tmpdir=/utxsonar/app/sonarqube-7.7/temp -Xmx512m -Xms128m -XX:+HeapDumpOnOutOfMemoryError -cp ./lib/common/*:/utxsonar/app/sonarqube-7.7/lib/jdbc/postgresql/postgresql-42.2.5.jar org.sonar.server.app.WebServer /utxsonar/app/sonarqube-7.7/temp/sq-process2992775761946795477properties
2019.06.18 16:42:06 DEBUG app[][o.s.a.p.AbstractProcessMonitor] Process exited with exit value [web]: 0
2019.06.18 16:42:06 INFO app[][o.s.a.SchedulerImpl] Process [web] is stopped
2019.06.18 16:42:07 INFO app[][o.s.a.SchedulerImpl] Process [es] is stopped
2019.06.18 16:42:07 WARN app[][o.s.a.p.AbstractProcessMonitor] Process exited with exit value [es]: 143
2019.06.18 16:42:07 INFO app[][o.s.a.SchedulerImpl] SonarQube is stopped
<-- Wrapper Stopped