SonarQube 6.7 LTS to 7.6 migration: Execution of migration step #1907 'Populate table live_measures' failed

We have SonarQube Community 6.7.6 LTS with about 4600 projects (we scan a lot of branches). The Developer Edition provides a better branching mechanism, so we decided to migrate to it. We got a trial license, installed SonarQube 7.6 and started the upgrade process. Unfortunately, after 3 days (yes, very long) the migration process hit the following issue:

2019.02.08 12:29:18 ERROR web[][o.s.s.p.d.m.DatabaseMigrationImpl] DB migration ended with an exception
org.sonar.server.platform.db.migration.step.MigrationStepExecutionException: Execution of migration step #1907 'Populate table live_measures' failed
		at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(
		at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(
		at java.lang.Iterable.forEach(
		at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(
		at org.sonar.server.platform.db.migration.engine.MigrationEngineImpl.execute(
		at org.sonar.server.platform.db.migration.DatabaseMigrationImpl.doUpgradeDb(
		at org.sonar.server.platform.db.migration.DatabaseMigrationImpl.doDatabaseMigration(
		at java.util.concurrent.ThreadPoolExecutor.runWorker(
		at java.util.concurrent.ThreadPoolExecutor$
Caused by: java.lang.IllegalStateException: Error during processing of row: [Unavailable: Bad format for BigDecimal ';112=2427;118=2427;119=2427;124=2427;129=809ûû123=0;12' in column 6.]
		at org.sonar.server.platform.db.migration.step.SelectImpl.newExceptionWithRowDetails(
		at org.sonar.server.platform.db.migration.step.SelectImpl.scroll(
		at org.sonar.server.platform.db.migration.step.MassUpdate.execute(
		at org.sonar.server.platform.db.migration.version.v70.PopulateLiveMeasures.execute(
		at org.sonar.server.platform.db.migration.step.DataChange.execute(
		at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(
		... 9 common frames omitted
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Application was streaming results when the connection failed. Consider raising value of 'net_write_timeout' on the server.
		at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
		at sun.reflect.NativeConstructorAccessorImpl.newInstance(
		at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
		at java.lang.reflect.Constructor.newInstance(
		at com.mysql.jdbc.Util.handleNewInstance(
		at com.mysql.jdbc.SQLError.createCommunicationsException(
		at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(
		at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(
		at com.mysql.jdbc.MysqlIO.checkErrorPacket(
		at com.mysql.jdbc.MysqlIO.checkErrorPacket(
		at com.mysql.jdbc.MysqlIO.nextRow(
		at com.mysql.jdbc.RowDataDynamic.nextRecord(
		at org.sonar.server.platform.db.migration.step.SelectImpl.scroll(
		... 13 common frames omitted
Caused by: Connection reset
		at com.mysql.jdbc.util.ReadAheadInputStream.fill(
		at com.mysql.jdbc.util.ReadAheadInputStream.readFromUnderlyingStreamIfNecessary(
		at com.mysql.jdbc.MysqlIO.readFully(
		at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(
		... 23 common frames omitted
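The innermost "Caused by" suggests raising `net_write_timeout` on the MySQL server before retrying. A minimal sketch of checking and raising it (values are assumptions, not official SonarQube guidance; the change needs the SUPER privilege and does not survive a server restart):

```sql
-- Check the current value (MySQL's default is 60 seconds)
SHOW VARIABLES LIKE 'net_write_timeout';

-- Raise it for the duration of the migration; value is in seconds
SET GLOBAL net_write_timeout = 600;
```

Only new connections pick up the global value, so the SonarQube migration would have to be restarted after the change.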

On the page "SonarQube Update Database for Version 7 Fails" somebody suggested removing the live_measures_component index before the database upgrade:

DROP INDEX "live_measures_component";
CREATE INDEX live_measures_component ON live_measures (component_uuid,metric_id);
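Since we are on MySQL, the quoted-identifier form above may not run as-is: MySQL's `DROP INDEX` requires the table name. A MySQL-flavored sketch of the same change (same index and columns as above; still unverified that it is safe):

```sql
-- MySQL syntax: DROP INDEX needs the table name
DROP INDEX live_measures_component ON live_measures;

-- Recreate the index without the UNIQUE constraint
CREATE INDEX live_measures_component ON live_measures (component_uuid, metric_id);
```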

Could somebody confirm that it is safe?


Hello Adam,

Unfortunately, after 3 days (yes, very long)

What DB engine are you using?

On the page SonarQube Update Database for Version 7 Fails somebody suggested to remove live_measures_component index before the database upgrade:

The advice shared there was not to completely remove live_measures_component.
The response says to first drop the index and then recreate it without the unique constraint.
We do not authorize this DB schema change at the moment; however, it seems we need to decide what to do with these kinds of cases (probably caused by third-party/community plugins).

There is also a SQL SELECT statement that should tell you how many records/projects are affected. Can you run it and share some feedback on the output?
Here is the query:

SELECT p.uuid, p.kee, p.created_at, p.long_name, p.language, pm.metric_id, COUNT(1)
FROM project_measures pm
INNER JOIN projects p ON p.uuid = pm.component_uuid
INNER JOIN snapshots s ON s.uuid = pm.analysis_uuid
WHERE s.islast = TRUE AND pm.person_id IS NULL
GROUP BY p.uuid, p.kee, p.created_at, p.long_name, p.language, pm.metric_id
HAVING COUNT(1) > 1;
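Each row returned is a (project, metric) pair with more than one measure on the last analysis, i.e. data that would violate the new unique index. To get just the number of affected pairs, the same query can be wrapped in a count (a sketch over the same tables and columns as above):

```sql
SELECT COUNT(*)
FROM (
  SELECT p.uuid
  FROM project_measures pm
  INNER JOIN projects p ON p.uuid = pm.component_uuid
  INNER JOIN snapshots s ON s.uuid = pm.analysis_uuid
  WHERE s.islast = TRUE AND pm.person_id IS NULL
  GROUP BY p.uuid, p.kee, p.created_at, p.long_name, p.language, pm.metric_id
  HAVING COUNT(1) > 1
) dup;  -- MySQL requires an alias for the derived table
```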

Awaiting your feedback.

Hello Krzysztof,
I apologize for the delay, but I didn’t have access to our SonarQube instance for a while. The drop-index solution did not work. Luckily our infrastructure team was able to “fix” the problem: they increased the amount of RAM (from 16 to 48 GB) and after that the upgrade process finished successfully.

At the moment we use a MySQL database and the following community plugins:

I’ll ask our infrastructure team if they can execute the SQL query which you wrote.


Hi Adam,

I’m glad you got this worked out.

Tangentially, I feel the need to point out that SonarCSS is maintained. Also, badges are now natively available.



Hi Ann,

Thank you, I didn’t know 🙂

Yes, but it has fewer rules and forces us to install Node.js on CI (Node.js is one big security hole).


A post was merged into an existing topic: Upgrade from 6.7.3 to 7.9 (LTS) -> Fail to execute CREATE UNIQUE INDEX live_measures_component