Trouble upgrading to 7.9.2

I’m following this guide to upgrade to SonarQube 7.9.2 on Linux. My previous version was 7.1, which used a MySQL database and worked fine. I’ve switched to PostgreSQL for this upgrade and I’m getting the following errors.

/logs/web.log:

2020.01.22 12:15:05 INFO  web[][o.s.s.p.d.m.h.MigrationHistoryTableImpl] Creating table schema_migrations
2020.01.22 12:15:05 ERROR web[][o.s.s.p.Platform] Web server startup failed
java.lang.IllegalStateException: Failed to create table schema_migrations
        at org.sonar.server.platform.db.migration.history.MigrationHistoryTableImpl.start(MigrationHistoryTableImpl.java:48)
        at java.base/java.util.Optional.ifPresent(Optional.java:183)
        at org.sonar.server.platform.platformlevel.PlatformLevel2.start(PlatformLevel2.java:107)
        at org.sonar.server.platform.Platform.start(Platform.java:211)
        at org.sonar.server.platform.Platform.startLevel2Container(Platform.java:177)
        at org.sonar.server.platform.Platform.init(Platform.java:87)
        at org.sonar.server.platform.web.PlatformServletContextListener.contextInitialized(PlatformServletContextListener.java:43)
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4817)
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5283)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1423)
        at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1413)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.postgresql.util.PSQLException: ERROR: no schema has been selected to create in
  Position: 14
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2440)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2183)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:308)
        at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:307)
        at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:293)
        at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:270)
        at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:266)
        at org.apache.commons.dbcp2.DelegatingStatement.execute(DelegatingStatement.java:175)
        at org.apache.commons.dbcp2.DelegatingStatement.execute(DelegatingStatement.java:175)
        at org.sonar.server.platform.db.migration.history.MigrationHistoryTableImpl.execute(MigrationHistoryTableImpl.java:71)
        at org.sonar.server.platform.db.migration.history.MigrationHistoryTableImpl.createTable(MigrationHistoryTableImpl.java:59)
        at org.sonar.server.platform.db.migration.history.MigrationHistoryTableImpl.start(MigrationHistoryTableImpl.java:45)
        ... 15 common frames omitted
2020.01.22 12:15:06 WARN  web[][o.a.c.u.SessionIdGeneratorBase] Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [546] milliseconds.
2020.01.22 12:15:06 DEBUG web[][o.s.s.a.TomcatAccessLog] Tomcat is started
2020.01.22 12:15:06 INFO  web[][o.s.s.a.EmbeddedTomcat] HTTP connector enabled on port 9000
2020.01.22 12:15:06 INFO  web[][o.s.p.ProcessEntryPoint] Hard stopping process
2020.01.22 12:15:06 DEBUG web[][o.s.s.a.TomcatAccessLog] Tomcat is stopped


/logs/sonar.log:
2020.01.22 12:14:53 DEBUG app[][o.e.c.t.TransportClientNodesService] failed to connect to node [{#transport#-1}{Axxt5FnNRMOy904Wr1blZQ}{127.0.0.1}{127.0.0.1:9001}], ignoring...
org.elasticsearch.transport.ConnectTransportException: [][127.0.0.1:9001] connect_exception
        at org.elasticsearch.transport.TcpTransport$ChannelsConnectedListener.onFailure(TcpTransport.java:1309)
        at org.elasticsearch.action.ActionListener.lambda$toBiConsumer$2(ActionListener.java:100)
        at org.elasticsearch.common.concurrent.CompletableContext.lambda$addListener$0(CompletableContext.java:42)
        at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
        at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
        at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
        at java.base/java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:2088)
        at org.elasticsearch.common.concurrent.CompletableContext.completeExceptionally(CompletableContext.java:57)
        at org.elasticsearch.transport.netty4.Netty4TcpChannel.lambda$new$1(Netty4TcpChannel.java:72)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:511)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:504)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:483)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:424)
        at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:121)
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:327)
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:343)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:591)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:508)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:470)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
        at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /127.0.0.1:9001
        at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779)
        at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
        at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
        ... 6 common frames omitted
Caused by: java.net.ConnectException: Connection refused
        ... 10 common frames omitted
2020.01.22 12:14:53 DEBUG app[][o.s.a.e.EsConnectorImpl] Connected to Elasticsearch node: [127.0.0.1:9001]
2020.01.22 12:14:58 DEBUG app[][i.n.b.AbstractByteBuf] -Dio.netty.buffer.checkAccessible: true
2020.01.22 12:14:58 DEBUG app[][i.n.b.AbstractByteBuf] -Dio.netty.buffer.checkBounds: true
2020.01.22 12:14:58 DEBUG app[][i.n.u.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@564c2a71
2020.01.22 12:14:58 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxCapacityPerThread: 4096
2020.01.22 12:14:58 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.maxSharedCapacityFactor: 2
2020.01.22 12:14:58 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.linkCapacity: 16
2020.01.22 12:14:58 DEBUG app[][i.n.util.Recycler] -Dio.netty.recycler.ratio: 8
2020.01.22 12:14:58 DEBUG app[][o.e.t.ConnectionManager] connected to node [{sonarqube}{-XfviAAkS8ajCuSgCkrINQ}{910FxiHSTgSI29LP2oO5_w}{127.0.0.1}{127.0.0.1:9001}{rack_id=sonarqube}]
2020.01.22 12:15:00 INFO  app[][o.s.a.SchedulerImpl] Process[es] is up
2020.01.22 12:15:00 INFO  app[][o.s.a.ProcessLauncherImpl] Launch process[[key='web', ipcIndex=2, logFilenamePrefix=web]] from [/opt/sonarqube]: /usr/lib/jvm/java-11-openjdk-11.0.6.10-1.0.1.el7_7.x86_64/bin/java -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djava.io.tmpdir=/opt/sonarqube/temp --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.rmi/sun.rmi.transport=ALL-UNNAMED -Xmx512m -Xms128m -XX:+HeapDumpOnOutOfMemoryError -Dhttp.proxyHost=http://proxy.intra.bt.com -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts=localhost|127.*|[::1] -Dhttps.proxyHost=http://proxy.intra.bt.com -Dhttps.proxyPort=8080 -cp ./lib/common/*:/opt/sonarqube/lib/jdbc/postgresql/postgresql-42.2.5.jar org.sonar.server.app.WebServer /opt/sonarqube/temp/sq-process4269030973614447572properties
2020.01.22 12:15:06 DEBUG app[][o.s.a.p.AbstractManagedProcess] Process exited with exit value [web]: 0
2020.01.22 12:15:06 INFO  app[][o.s.a.SchedulerImpl] Process[web] is stopped
2020.01.22 12:15:07 INFO  app[][o.s.a.SchedulerImpl] Process[es] is stopped
2020.01.22 12:15:07 WARN  app[][o.s.a.p.AbstractManagedProcess] Process exited with exit value [es]: 143
2020.01.22 12:15:07 INFO  app[][o.s.a.SchedulerImpl] SonarQube is stopped
2020.01.22 12:15:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [ce]...
2020.01.22 12:15:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [web]...
2020.01.22 12:15:07 DEBUG app[][o.s.a.SchedulerImpl] Stopping [es]...

Any ideas on this? Thanks.

I’ve managed to fix the error in web.log by changing sonar.jdbc.url in /conf/sonar.properties to sonar.jdbc.url=jdbc:postgresql://localhost/sonarqube.
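For reference, the database block of /conf/sonar.properties now looks roughly like this (the username and password values below are placeholders, not my real credentials):

sonar.jdbc.url=jdbc:postgresql://localhost/sonarqube
sonar.jdbc.username=sonarqube
sonar.jdbc.password=secret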

Now SonarQube stays running, but nothing appears on localhost:9000.
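To see what the server is actually doing while the UI shows nothing, the system status web service should report whether it is still starting or waiting for a database migration (this assumes the default port 9000 and that curl is available; after an upgrade the status can stay at DB_MIGRATION_NEEDED until http://localhost:9000/setup is visited):

curl http://localhost:9000/api/system/status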

The schema errors in web.log were caused by my uncommenting sonar.jdbc.url in /conf/sonar.properties without changing its value. I had to remove the currentSchema parameter and replace sonarqube with my database name.

From:

sonar.jdbc.url=jdbc:postgresql://localhost/sonarqube?currentSchema=my_schema

To:

sonar.jdbc.url=jdbc:postgresql://localhost/DataBaseName
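In case it helps anyone else hitting “no schema has been selected to create in”: the same error can also appear when the schema named in currentSchema doesn’t exist, or when the JDBC user isn’t allowed to create tables in it. A rough sketch of creating the schema on the PostgreSQL side (the role name sonarqube and schema name my_schema are placeholders, not taken from my setup):

psql -U postgres -d sonarqube -c "CREATE SCHEMA my_schema AUTHORIZATION sonarqube;"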