Must-share information (formatted with Markdown):
- which versions are you using: sonarqube-enterprise-10.1.0.73491.zip
- how is SonarQube deployed: zip
- java version: jdk-17.0.6_linux-x64_bin.rpm (oracle)
- OS Version: centos stream 9
- what are you trying to achieve: get SonarQube up and running after a fresh install
- what have you tried so far to achieve this: see below

I installed SonarQube following the documentation and started the application via systemd, but Elasticsearch is not able to start. I turned on debug logging to find the error: `java.lang.UnsupportedOperationException: sun.misc.Unsafe unavailable`. The relevant part of the error log is included below. How can I resolve this? Is the Java version incompatible, or is it the OS?
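For reference, the service is started via a systemd unit roughly like the one below. This is only a sketch of the setup: the unit file content, user/group, and limits are assumptions based on a default `/opt/sonarqube` zip install, not the exact file in use.

```ini
# Sketch of the systemd unit used to start SonarQube (assumed content).
[Unit]
Description=SonarQube service
After=network.target

[Service]
Type=simple
# Assumed dedicated service account; SonarQube must not run as root.
User=sonarqube
Group=sonarqube
# Assumed install path, consistent with /opt/sonarqube/temp seen in the log.
ExecStart=/opt/sonarqube/bin/linux-x86-64/sonar.sh console
# Limits recommended for the embedded Elasticsearch node.
LimitNOFILE=131072
LimitNPROC=8192

[Install]
WantedBy=multi-user.target
```

The Elasticsearch debug log then shows: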
2023.07.31 17:59:32 DEBUG es[o.e.t.ThreadPool] created thread pool: name [snapshot], core [1], max [10], keep alive [5m]
2023.07.31 17:59:32 DEBUG es[o.e.t.ThreadPool] created thread pool: name [search_throttled], size [1], queue size [100]
2023.07.31 17:59:32 DEBUG es[i.n.u.i.l.InternalLoggerFactory] Using Log4J2 as the default logging framework
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent0] -Dio.netty.noUnsafe: true
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent0] sun.misc.Unsafe: unavailable (io.netty.noUnsafe)
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent0] Java version: 17
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent0] java.nio.DirectByteBuffer.(long, int): unavailable
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] maxDirectMemory: 2147483648 bytes (maybe)
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] -Dio.netty.tmpdir: /opt/sonarqube/temp (java.io.tmpdir)
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] -Dio.netty.maxDirectMemory: -1 bytes
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
2023.07.31 17:59:32 DEBUG es[i.n.u.i.CleanerJava9] java.nio.ByteBuffer.cleaner(): unavailable
java.lang.UnsupportedOperationException: sun.misc.Unsafe unavailable
at io.netty.util.internal.CleanerJava9.(CleanerJava9.java:68) ~[?:?]
at io.netty.util.internal.PlatformDependent.(PlatformDependent.java:193) ~[?:?]
at io.netty.util.ConstantPool.(ConstantPool.java:34) ~[?:?]
at io.netty.util.AttributeKey$1.(AttributeKey.java:27) ~[?:?]
at io.netty.util.AttributeKey.(AttributeKey.java:27) ~[?:?]
at org.elasticsearch.http.netty4.Netty4HttpServerTransport.(Netty4HttpServerTransport.java:329) ~[?:?]
at org.elasticsearch.transport.netty4.Netty4Plugin.getSettings(Netty4Plugin.java:46) ~[?:?]
at org.elasticsearch.plugins.PluginsService.lambda$flatMap$0(PluginsService.java:254) ~[elasticsearch-8.7.0.jar:?]
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:273) ~[?:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?]
at java.util.AbstractList$RandomAccessSpliterator.forEachRemaining(AbstractList.java:720) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:575) ~[?:?]
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260) ~[?:?]
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:616) ~[?:?]
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:622) ~[?:?]
at java.util.stream.ReferencePipeline.toList(ReferencePipeline.java:627) ~[?:?]
at org.elasticsearch.node.Node.(Node.java:447) ~[elasticsearch-8.7.0.jar:?]
at org.elasticsearch.node.Node.(Node.java:324) ~[elasticsearch-8.7.0.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch$2.(Elasticsearch.java:216) ~[elasticsearch-8.7.0.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch.initPhase3(Elasticsearch.java:216) ~[elasticsearch-8.7.0.jar:?]
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:67) ~[elasticsearch-8.7.0.jar:?]
2023.07.31 17:59:32 DEBUG es[i.n.u.i.PlatformDependent] -Dio.netty.noPreferDirect: true