Issues when upgrading to 7.9

Hi,
I was upgrading a SonarQube instance from 7.3 to 7.9 (Linux version), and the database I am using is Microsoft SQL Server. (I use port 8080 to bring the instance up.)

I followed the upgrade notes (upgraded Java to JDK 11).
I am using the plugins that come with this version.

I am seeing the issues below in the logs:

web.log :

2019.07.03 08:21:23 DEBUG web[][j.m.mbeanserver] JMX.mbean.registered Tomcat:type=GlobalRequestProcessor,name="http-nio-10.x.x.x-8080"
2019.07.03 08:21:23 ERROR web[][o.a.c.c.StandardService] Failed to initialize connector [Connector[HTTP/1.1-8080]]
org.apache.catalina.LifecycleException: Failed to initialize component [Connector[HTTP/1.1-8080]]
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:112)
        at org.apache.catalina.core.StandardService.initInternal(StandardService.java:552)
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:107)
2019.07.03 08:21:23 WARN  web[][o.s.p.ProcessEntryPoint] Fail to start web
java.lang.RuntimeException: org.apache.catalina.LifecycleException: Failed to initialize component [StandardServer[-1]]
        at com.google.common.base.Throwables.propagate(Throwables.java:160)
Caused by: org.apache.catalina.LifecycleException: Failed to initialize connector [Connector[HTTP/1.1-8080]]
        at org.apache.catalina.core.StandardService.initInternal(StandardService.java:559)
        at org.apache.catalina.util.LifecycleBase.init(LifecycleBase.java:107)
        ... 9 common frames omitted
2019.07.03 08:21:23 TRACE web[][o.s.p.Lifecycle] tryToMoveTo from STARTING to HARD_STOPPING => true
2019.07.03 08:21:23 INFO  web[][o.s.p.ProcessEntryPoint] Hard stopping process

sonar.log:

2019.07.03 08:21:24 TRACE app[][o.e.t.TcpTransport] Tcp transport client channel opened: Netty4TcpChannel{localAddress=null, remoteAddress=null}
2019.07.03 08:21:24 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x4c843bbe] REGISTERED
2019.07.03 08:21:24 TRACE app[][o.e.t.n.ESLoggingHandler] [id: 0x4c843bbe] CONNECT: /127.0.0.1:9001
2019.07.03 08:21:24 TRACE app[][o.e.i.b.in_flight_requests] [in_flight_requests] Adjusted breaker by [16440] bytes, now [16440]
2019.07.03 08:21:24 TRACE app[][o.e.t.TransportLogger] an exception occurred formatting a WRITE trace message
java.io.EOFException: tried to read: 105 bytes but only 27 remaining
        at org.elasticsearch.common.bytes.BytesReferenceStreamInput.ensureCanReadBytes(BytesReferenceStreamInput.java:121)
        at org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper.ensureCanReadBytes(BytesReference.java:283)
        at org.elasticsearch.common.io.stream.StreamInput.readArraySize(StreamInput.java:1057)

es.log:

2019.07.03 08:21:22 DEBUG es[][i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2019.07.03 08:21:22 DEBUG es[][i.n.u.i.CleanerJava9] java.nio.ByteBuffer.cleaner(): unavailable
java.lang.UnsupportedOperationException: sun.misc.Unsafe unavailable
       at io.netty.util.internal.CleanerJava9.<clinit>(CleanerJava9.java:68) [netty-common-4.1.32.Final.jar:4.1.32.Final]
       at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:172) [netty-common-4.1.32.Final.jar:4.1.32.Final]
       at io.netty.util.ConstantPool.<init>(ConstantPool.java:32) [netty-common-4.1.32.Final.jar:4.1.32.Final]
       at io.netty.util.AttributeKey$1.<init>(AttributeKey.jav

I would appreciate it if someone has a fix or more details on this. I can provide more details if needed.

Hi,

Do the logs really end there, or are the stacktraces longer? Also, I guess you omitted ce.log because there’s nothing remarkable in it?

And just to be clear: this means you’ve configured SonarQube to run on 8080 instead of 9000? (Funny, when I see 8080, I think Jenkins :slight_smile:.) You confirm that the port is/was actually available, right?
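For anyone checking this themselves, here is a minimal sketch of how to verify nothing else holds the port (8080 is the port from this thread; substitute your own):

```shell
# List any TCP listener bound to port 8080.
# If grep finds nothing, the fallback message prints and the port is free.
ss -tlnp 2>/dev/null | grep ':8080' || echo "port 8080 is free"
```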

 
Thx,
Ann

Hi again :slight_smile:,

Could you be really specific about your Java version and vendor, please? After leaving my initial answer, I saw that another user also has a log saying Unsafe isn’t available.

 
Thx,
Ann

Hi,

  1. The app didn’t generate ce.log.
  2. I am using java-11-openjdk.x86_64 (/usr/lib/jvm/java-11-openjdk-11.0.3.7-0.el7_6.x86_64/bin/java …
  3. I am using port 8080 even in my current instance, since nothing else on my server occupies that port.
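To report the exact runtime (vendor and version) in one step, a quick sketch, assuming SonarQube picks up the default `java` on the PATH:

```shell
# Print vendor and version lines of the default java on PATH.
# (java -version writes to stderr, hence the redirection.)
java -version 2>&1 | head -n 3
```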

Hi Ann,

Not sure about that, but hasn’t sun.misc.Unsafe been removed in JDK 11!?
I had no problems upgrading from SQ Enterprise 7.8 running on Oracle JDK 8 to 7.9
on Windows Server 2012 and Windows Server 2016 with OpenJDK 11 today.

Gilbert

Hello,

Indeed, it’s been removed.
This message is not an error. Look at the log level: it’s TRACE.
It is just Netty reporting the exception it encountered when it tried to access sun.misc.Unsafe to check whether it could use it.


Hello @jvishnu066,

Please provide the whole content of the web.log and sonar.log from the moment you start SQ. In your initial post, they clearly are truncated.

Hello,

Here are the logs for the SQ 7.9 instance (sonar.log). Though now I don’t see a web.log being generated.

Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
  Copyright 1999-2006 Tanuki Software, Inc.  All Rights Reserved.

2019.07.08 07:59:39 INFO  app[][o.s.a.AppFileSystem] Cleaning or creating temp directory /tmp/sonarqube-7.9/temp
2019.07.08 07:59:39 INFO  app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
2019.07.08 07:59:39 INFO  app[][o.s.a.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [/tmp/sonarqube-7.9/elasticsearch]: /tmp/sonarqube-7.9/elasticsearch/bin/elasticsearch
2019.07.08 07:59:39 INFO  app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
2019.07.08 07:59:40 INFO  app[][o.e.p.PluginsService] no modules loaded
2019.07.08 07:59:40 INFO  app[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [force_merge], size [1], queue size [unbounded]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_started], core [1], max [4], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [listener], size [1], queue size [unbounded]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [index], size [2], queue size [200]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [refresh], core [1], max [1], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [generic], core [4], max [128], keep alive [30s]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [warmer], core [1], max [1], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search] will adjust queue by [50] when determining automatic queue size
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search], size [4], queue size [1k]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [flush], core [1], max [1], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [fetch_shard_store], core [1], max [4], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [management], core [1], max [5], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [get], size [2], queue size [1k]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [analyze], size [1], queue size [16]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [write], size [2], queue size [200]
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [snapshot], core [1], max [1], keep alive [5m]
2019.07.08 07:59:40 DEBUG app[][o.e.c.u.c.QueueResizingEsThreadPoolExecutor] thread pool [_client_/search_throttled] will adjust queue by [50] when determining automatic queue size
2019.07.08 07:59:40 DEBUG app[][o.e.t.ThreadPool] created thread pool: name [search_throttled], size [1], queue size [100]
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] -Dio.netty.noUnsafe: false
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] Java version: 11
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Buffer.address: available
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
        at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
        at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:224)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:218)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
        at io.netty.util.ConstantPool.<init>(ConstantPool.java:32)
        at io.netty.util.AttributeKey$1.<init>(AttributeKey.java:27)
        at io.netty.util.AttributeKey.<clinit>(AttributeKey.java:27)
        at org.elasticsearch.transport.netty4.Netty4Transport.<clinit>(Netty4Transport.java:219)
        at org.elasticsearch.transport.Netty4Plugin.getSettings(Netty4Plugin.java:57)
        at org.elasticsearch.plugins.PluginsService.lambda$getPluginSettings$0(PluginsService.java:89)
        at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:271)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
        at org.elasticsearch.plugins.PluginsService.getPluginSettings(PluginsService.java:89)
        at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:147)
        at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:277)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:103)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.Bits.unaligned: available, true
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @7e0b37bc
        at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:361)
        at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:591)
        at java.base/java.lang.reflect.Method.invoke(Method.java:558)
        at io.netty.util.internal.PlatformDependent0$6.run(PlatformDependent0.java:334)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:325)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
        at io.netty.util.ConstantPool.<init>(ConstantPool.java:32)
        at io.netty.util.AttributeKey$1.<init>(AttributeKey.java:27)
        at io.netty.util.AttributeKey.<clinit>(AttributeKey.java:27)
        at org.elasticsearch.transport.netty4.Netty4Transport.<clinit>(Netty4Transport.java:219)
        at org.elasticsearch.transport.Netty4Plugin.getSettings(Netty4Plugin.java:57)
        at org.elasticsearch.plugins.PluginsService.lambda$getPluginSettings$0(PluginsService.java:89)
        at java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:271)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1654)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
        at org.elasticsearch.plugins.PluginsService.getPluginSettings(PluginsService.java:89)
        at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:147)
        at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:277)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:103)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): unavailable
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] sun.misc.Unsafe: available
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] maxDirectMemory: 33554432 bytes (maybe)
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.maxDirectMemory: -1 bytes
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.CleanerJava9] java.nio.ByteBuffer.cleaner(): available
2019.07.08 07:59:40 DEBUG app[][i.n.u.i.PlatformDependent] -Dio.netty.noPreferDirect: false
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Module execution: 202ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] TypeListeners creation: 2ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Scopes creation: 5ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Converters creation: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Binding creation: 7ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Private environment creation: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Injector construction: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Binding initialization: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Binding indexing: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Collecting injection requests: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Binding validation: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Static validation: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Instance member validation: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Provider verification: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Static member injection: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Instance injection: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.i.i.Stopwatch] Preloading singletons: 0ms
2019.07.08 07:59:42 DEBUG app[][o.e.c.t.TransportClientNodesService] node_sampler_interval[5s]
2019.07.08 07:59:42 DEBUG app[][i.n.c.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 4
2019.07.08 07:59:43 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.noKeySetOptimization: false
2019.07.08 07:59:43 DEBUG app[][i.n.c.n.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
2019.07.08 07:59:43 DEBUG app[][i.n.u.i.PlatformDependent] org.jctools-core.MpscChunkedArrayQueue: available
2019.07.08 07:59:43 DEBUG app[][o.e.c.t.TransportClientNodesService] adding address [{#transport#-1}{NgXJNsgMSwW_EMKaAqATsw}{127.0.0.1}{127.0.0.1:9001}]
2019.07.08 07:59:43 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.processId: 8237 (auto-detected)
2019.07.08 07:59:43 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv4Stack: false
2019.07.08 07:59:43 DEBUG app[][i.netty.util.NetUtil] -Djava.net.preferIPv6Addresses: false
2019.07.08 07:59:43 DEBUG app[][i.netty.util.NetUtil] Loopback interface: lo (lo, 127.0.0.1)
2019.07.08 07:59:43 DEBUG app[][i.netty.util.NetUtil] /proc/sys/net/core/somaxconn: 128
2019.07.08 07:59:43 DEBUG app[][i.n.c.DefaultChannelId] -Dio.netty.machineId: 00:50:56:ff:fe:80:ff:87 (auto-detected)
2019.07.08 07:59:43 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
2019.07.08 07:59:43 DEBUG app[][i.n.u.i.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
2019.07.08 07:59:43 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
2019.07.08 07:59:43 DEBUG app[][i.n.u.ResourceLeakDetector] -Dio.netty.leakDetection.targetRecords: 4
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 0
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 0
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
2019.07.08 07:59:43 DEBUG app[][i.n.b.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
2019.07.08 07:59:43 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.allocator.type: pooled
2019.07.08 07:59:43 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 0
2019.07.08 07:59:43 DEBUG app[][i.n.b.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
2019.07.08 07:59:43 WARN  app[][o.s.a.p.AbstractManagedProcess] Process exited with exit value [es]: 1
2019.07.08 07:59:43 INFO  app[][o.s.a.SchedulerImpl] Process[es] is stopped
2019.07.08 07:59:43 INFO  app[][o.e.c.t.TransportClientNodesService] failed to get node info for {#transport#-1}{NgXJNsgMSwW_EMKaAqATsw}{127.0.0.1}{127.0.0.1:9001}, disconnecting...
java.lang.IllegalStateException: Future got interrupted
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:60)
        at org.elasticsearch.action.support.AdapterActionFuture.actionGet(AdapterActionFuture.java:34)
        at org.elasticsearch.transport.ConnectionManager.internalOpenConnection(ConnectionManager.java:209)
        at org.elasticsearch.transport.ConnectionManager.openConnection(ConnectionManager.java:80)
        at org.elasticsearch.transport.TransportService.openConnection(TransportService.java:367)
        at org.elasticsearch.client.transport.TransportClientNodesService$SimpleNodeSampler.doSample(TransportClientNodesService.java:411)
        at org.elasticsearch.client.transport.TransportClientNodesService$NodeSampler.sample(TransportClientNodesService.java:362)
        at org.elasticsearch.client.transport.TransportClientNodesService.addTransportAddresses(TransportClientNodesService.java:201)
        at org.elasticsearch.client.transport.TransportClient.addTransportAddress(TransportClient.java:342)
        at org.sonar.application.es.EsConnectorImpl$MinimalTransportClient.<init>(EsConnectorImpl.java:108)
        at org.sonar.application.es.EsConnectorImpl.buildTransportClient(EsConnectorImpl.java:89)
        at org.sonar.application.es.EsConnectorImpl.getTransportClient(EsConnectorImpl.java:74)
        at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:61)
        at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:88)
        at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:73)
        at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:58)
        at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:201)
        at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:258)
Caused by: java.lang.InterruptedException: null
        at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1343)
        at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:251)
        at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:94)
        at org.elasticsearch.common.util.concurrent.FutureUtils.get(FutureUtils.java:57)
        ... 17 common frames omitted
2019.07.08 07:59:43 DEBUG app[][o.s.a.e.EsConnectorImpl] Connected to Elasticsearch node: [127.0.0.1:9001]
2019.07.08 07:59:43 INFO  app[][o.s.a.SchedulerImpl] SonarQube is stopped
2019.07.08 07:59:43 DEBUG app[][o.s.a.SchedulerImpl] Stopping [ce]...
2019.07.08 07:59:43 DEBUG app[][o.s.a.SchedulerImpl] Stopping [web]...
2019.07.08 07:59:43 DEBUG app[][o.s.a.SchedulerImpl] Stopping [es]...
<-- Wrapper Stopped



WrapperSimpleApp: Encountered an error running main: java.lang.IllegalStateException: Cannot write Elasticsearch yml settings file
java.lang.IllegalStateException: Cannot write Elasticsearch yml settings file
        at org.sonar.application.es.EsYmlSettings.writeToYmlSettingsFile(EsYmlSettings.java:53)
        at org.sonar.application.ProcessLauncherImpl.writeConfFiles(ProcessLauncherImpl.java:152)
        at org.sonar.application.ProcessLauncherImpl.launch(ProcessLauncherImpl.java:84)
        at org.sonar.application.SchedulerImpl.lambda$tryToStartProcess$2(SchedulerImpl.java:192)

Hello @jvishnu066,

From

java.lang.IllegalStateException: Cannot write Elasticsearch yml settings file

I would invite you to make sure the SonarQube process can write to the temp directory you specified. See the sonar.path.temp property in sonar.properties.
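A minimal sketch of that check, run as the OS user that launches SonarQube (the path below is the one from the logs above; adjust it if you changed sonar.path.temp):

```shell
# Try to create and delete a file in the configured temp directory.
SONAR_TEMP=/tmp/sonarqube-7.9/temp
ls -ld "$SONAR_TEMP"                 # show owner and mode
if touch "$SONAR_TEMP/.write-test" 2>/dev/null; then
  rm -f "$SONAR_TEMP/.write-test"
  echo "temp dir is writable"
else
  echo "cannot write to $SONAR_TEMP: fix ownership or permissions"
fi
```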

I fixed that issue; please ignore the previous logs!
Here are the latest logs.

I am using:

  1. Microsoft SQL Server
  2. sonar.web.host set to my server IP and port 8080 (at this time no other process is using that port)
  3. the default JDK: /usr/lib/jvm/java-11-openjdk-11.0.3.7-0.el7_6.x86_64 ...

Hi @jvishnu066,

If you have a new error, close this thread as fixed, create a new one, and remove the useless posts from this thread.

Also, please clean up your post. If it’s hard to read, you have little chance of getting help.

Some suggestions:

  • use formatting options
  • if you already know what the error is, state it in your post
  • if they do not bring any value, do not enable DEBUG logs
  • share what investigations you made yourself and what you tried

Cheers,

Hello. Could you explain how you solved this? I have exactly the same problem.

I had a similar issue today, and I believe it was caused by running ./sonar.sh with sudo (usually I didn’t use sudo, but I wanted to try port 80, which is restricted in my Linux distro). That changed the permissions of files within the /temp directory.
In my case, I renamed the existing temp directory and could successfully start the process again.
So in a nutshell, it was caused by not having permission to write to files within the temp folder.
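For reference, the recovery described above could look like this. The install path /opt/sonarqube and service user sonar are assumptions; adjust both to your layout:

```shell
# Set the broken temp directory aside and recreate it with the
# right owner, so the SonarQube process can write to it again.
sudo mv /opt/sonarqube/temp /opt/sonarqube/temp.bak
sudo -u sonar mkdir /opt/sonarqube/temp
# Alternatively, keep the directory and just restore ownership:
# sudo chown -R sonar:sonar /opt/sonarqube/temp
```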


This solved my problem.

I updated my SonarQube from 7.2 to 7.9 and am having the exact same problem. Attaching sonar.log (250.1 KB).

Hi @vmadaan,

It’s not clear to me what problem you’re having. After you check your server logs (you’ll find that turning off TRACE logging makes that easier), if you still have a problem please open a new thread with the details of what you’re experiencing.

 
Ann