Migrate SonarQube history from one server to another

Hello everyone,

I know this is a very common topic, but perhaps some new answers on the subject will help me.
I have deployed a SonarQube server with a PostgreSQL database in an OpenShift 3 cluster. The versions I use are 8.4.2-community for SonarQube and 9.5 for the database.

I am currently migrating my project to a new OpenShift 4 cluster. I have deployed a new SonarQube server with a PostgreSQL database, using the exact same versions as before. Now comes the step where I have to migrate the project history from the old SonarQube instance to the new one.

  1. I’ve seen the Sonar DB tool, but will it migrate the project history, or only the plugins added to my old SonarQube instance? Also, since I am using a Community Edition of SonarQube, do I even have access to this service?
    Moreover, since I have deployed it in an OpenShift cluster, the DB tool steps might be different.
    I’m open to solutions from anyone who wants to help.

  2. Is there a way to migrate the data manually, or a way to specifically copy and paste the project history data? And finally, where is the project history data located?

What I tried first was to dump my old database and restore the dump into the new one, but my project history was not migrated; only a few things got transferred.
Perhaps I need to target a specific directory or transfer the plugins as well?


Hi Jason, welcome to the SonarSource Community!

First things first, this should have worked. I am not sure why it did not in your case, but a full migration of all data should have preserved, well, all the data. :slight_smile: I’ve done this successfully quite a few times using pg_dump for the export and psql for the import. Maybe, if you started up the new server before importing that data, you already have stale info in the SonarQube Elasticsearch indexes? I’d try again, this time first deleting the contents of data/es6 on the new SonarQube server so the indexes are recreated at next startup.
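
For what it’s worth, here is a rough sketch of those steps, assuming the databases are reachable from your workstation. The host names, user, database name, and SonarQube home path are placeholders; on OpenShift you would typically run these via oc rsh or oc port-forward.

# Export the old database in plain-SQL format (placeholder connection details).
pg_dump -h old-db-host -U sonar -d sonarqube > sonarqube.sql

# Stop the new SonarQube server, then clear the Elasticsearch indexes so
# they are rebuilt from the restored database at the next startup.
rm -rf /opt/sonarqube/data/es6/*

# Import into the new database, which should still be completely empty.
psql -h new-db-host -U sonar -d sonarqube < sonarqube.sql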

We have a feature for this called Project Move, but it’s only available in our Enterprise Edition.

Hi Jeff,

Thank you very much for your responsiveness and quick answer!

Concerning the dump: is dumping and restoring from one database to the other sufficient, or do I also need to manually transfer the plugins from the old SonarQube server to the new one? (I have quite a few more plugins on the old one.)

I’d certainly suggest migrating the plugins as well.
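
For the plugins themselves, here is a minimal sketch, assuming the default image layout where plugin JARs live under /opt/sonarqube/extensions/plugins; the pod names are placeholders for your cluster.

# Copy the plugins out of the old SonarQube pod (OpenShift 3 cluster)...
oc cp old-sonarqube-pod:/opt/sonarqube/extensions/plugins ./plugins

# ...then into the new pod (OpenShift 4 cluster), and restart SonarQube
# so it picks them up.
oc cp ./plugins new-sonarqube-pod:/opt/sonarqube/extensions/plugins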

I did try a new dump with the plugins migrated. It seems my database cannot restore all the data because of some permission issues. Here is an extract of my logs:

ERROR:  multiple primary keys for table "active_rule_parameters" are not allowed
STATEMENT:  ALTER TABLE ONLY public.active_rule_parameters
	    ADD CONSTRAINT pk_active_rule_parameters PRIMARY KEY (uuid);

ERROR:  multiple primary keys for table "active_rules" are not allowed
STATEMENT:  ALTER TABLE ONLY public.active_rules
	    ADD CONSTRAINT pk_active_rules PRIMARY KEY (uuid);

ERROR:  multiple primary keys for table "alm_app_installs" are not allowed
STATEMENT:  ALTER TABLE ONLY public.alm_app_installs
	    ADD CONSTRAINT pk_alm_app_installs PRIMARY KEY (uuid);

Or, in the case of already-existing data:

ERROR:  relation "issues" already exists
STATEMENT:  CREATE TABLE public.issues (
	    kee character varying(50) NOT NULL,
	    rule_uuid character varying(40),
	    severity character varying(10),
	    manual_severity boolean NOT NULL,
	    message character varying(4000),
	    line integer,
	    gap numeric(30,20),
	    status character varying(20),
	    resolution character varying(20),
	    checksum character varying(1000),
	    reporter character varying(255),
	    assignee character varying(255),
	    author_login character varying(255),
	    action_plan_key character varying(50),
	    issue_attributes character varying(4000),
	    effort integer,
	    created_at bigint,
	    updated_at bigint,
	    issue_creation_date bigint,
	    issue_update_date bigint,
	    issue_close_date bigint,
	    tags character varying(4000),
	    component_uuid character varying(50),
	    project_uuid character varying(50),
	    locations bytea,
	    issue_type smallint,
	    from_hotspot boolean
	);

I also got some “incomplete packet” messages from the database, but they might not be directly related to the issue.

That leads to PostgreSQL going into recovery mode:
FATAL: the database system is in recovery mode

Any idea how to get past that?

It seems like you already have a populated database on the destination PostgreSQL instance: the “multiple primary keys” and “relation already exists” errors mean the restore is colliding with schema objects that are already there. Start fresh with nothing yet in place.
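
For illustration, one minimal way to reset the destination database before re-running the restore; the connection details, database name, and owner are placeholders, and SonarQube should be stopped first.

# Drop and recreate the destination database so the restore starts empty
# (requires a superuser or the database owner; placeholder names throughout).
psql -h new-db-host -U postgres -c 'DROP DATABASE IF EXISTS sonarqube;'
psql -h new-db-host -U postgres -c 'CREATE DATABASE sonarqube OWNER sonar;'

# Re-run the import into the now-empty database.
psql -h new-db-host -U sonar -d sonarqube < sonarqube.sql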