SonarQube Database size is huge

I am using SonarQube Community Edition Version 8.9.8 (build 54436).

  • I am having an issue where the database is getting bigger and bigger, so I want to reduce its size.
  • I tried to delete old projects. However, this didn’t help, because the cleanup runs only when a project is analyzed. Unfortunately, these old projects are no longer analyzed, and a large number of them are still residing on the server.
    Is there a way to clean up the DB manually or with API calls?


Project deletion does purge data, so I’m not sure I understand this: “However, this didn’t help because the cleanup works only if I have an analysis”.

What database are you using? (Postgres, SQL Server, Oracle). Which version?

How big is your database (in MB/GB)? How many projects do you have? How many issues?

Hi Pierre,
Thank you for the reply.

I researched when database clean-up happens, and most of the answers I found look like the one in this screenshot:

from this link: DB Housekeeping

So, according to you, deleting all unwanted projects through the SonarQube web UI will purge all of those projects’ data from the database tables.

We use SQL Server, database version 13.00.7016, with Microsoft JDBC Driver 9.2 for SQL Server.

The database size is approximately 870 GB. And we have a lot of old projects (going back many years) that we don’t need; I’d say hundreds of them.

So I am planning to write a script that will use APIs to delete all projects that are not needed anymore. Is this going to reduce the database size?
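A minimal sketch of such a script against the SonarQube 8.9 Web API, using only the Python standard library. It pages through `api/projects/search` with the `analyzedBefore` filter and deletes matches in batches via `api/projects/bulk_delete`; both endpoints require administer permission. The server URL, token, and cutoff date are placeholders to replace with your own values:

```python
import base64
import json
import urllib.parse
import urllib.request

SONAR_URL = "https://sonarqube.example.com"  # placeholder: your server URL
TOKEN = "squ_..."                            # placeholder: an admin user token
CUTOFF = "2020-01-01"                        # delete projects last analyzed before this date

def chunked(items, size):
    """Split a list into batches, since bulk_delete accepts a limited key list."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def api_call(path, params, method="GET"):
    """Call a SonarQube Web API endpoint with token authentication."""
    url = f"{SONAR_URL}{path}?{urllib.parse.urlencode(params)}"
    req = urllib.request.Request(url, method=method)
    # Tokens are sent as the Basic-auth username with an empty password.
    auth = base64.b64encode(f"{TOKEN}:".encode()).decode()
    req.add_header("Authorization", f"Basic {auth}")
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        return json.loads(body) if body else {}

def find_old_project_keys():
    """Page through api/projects/search using the analyzedBefore filter."""
    keys, page = [], 1
    while True:
        data = api_call("/api/projects/search",
                        {"analyzedBefore": CUTOFF, "ps": 500, "p": page})
        keys += [c["key"] for c in data["components"]]
        if len(data["components"]) < 500:  # last page reached
            return keys
        page += 1

def delete_old_projects():
    """Delete stale projects in batches via api/projects/bulk_delete."""
    for batch in chunked(find_old_project_keys(), 50):
        api_call("/api/projects/bulk_delete",
                 {"projects": ",".join(batch)}, method="POST")

if __name__ == "__main__":
    delete_old_projects()
```

Deleting a project through this endpoint triggers the same purge as deletion through the web UI.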

Thanks for getting back with more details :+1:

You can indeed do this at any time to clean things up. Everything related to the projects will be deleted.

This is a tricky question, as it depends mostly on how your data are structured. In some extreme situations, you could have one project with 1 million branches, one project with very large files, or one issue with 2 billion changes.

Could you please have a look at your SonarQube database, and share with me the sizes (in MB/GB) of the different tables? I’m specifically interested in the issues, live_measures, issue_changes, and file_sources tables.
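If it helps, `sp_spaceused` reports those numbers per table on SQL Server (this sketch assumes the tables live in the default `dbo` schema):

```sql
-- Run inside the SonarQube database; each call returns rows, reserved,
-- data, and index_size for one table.
EXEC sp_spaceused 'dbo.issues';
EXEC sp_spaceused 'dbo.live_measures';
EXEC sp_spaceused 'dbo.issue_changes';
EXEC sp_spaceused 'dbo.file_sources';
```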

| Table | Data space | Index space | Row count |
| --- | --- | --- | --- |
| issues | 9,171.039 MB | 10,576.789 MB | 11,163,539 |
| live_measures | 300,516.914 MB | 260,996.336 MB | 817,288,712 |
| issue_changes | 2,263.515 MB | 2,221.664 MB | 5,096,153 |
| file_sources | 234,694.703 MB | 9,316.258 MB | 21,441,585 |

If I delete the old projects, the tables’ sizes are reduced significantly, but the disk space is not freed. Should I use the shrink database command to free up the disk space?

Thanks for sharing the data. I don’t see any unusual pattern, so cleaning “big” projects should clear some space as expected.

I think so, yes. If you have a test/staging environment, make sure you try it there first, as it’s a really heavy operation that will impact SonarQube performance. Also, shrink operations are known to cause fragmentation, which is something to take into consideration depending on your hardware.
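For reference, the shrink itself is a one-liner in T-SQL (the database name `sonarqube` here is an assumption; use your actual database name):

```sql
-- Heavy operation: test in staging first and run off-hours.
-- The second argument is the target percent of free space to leave
-- in the database files after shrinking.
DBCC SHRINKDATABASE (sonarqube, 10);
```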

I thought it would cause only index fragmentation, which can be solved by reindexing. Please help me understand how to fix disk fragmentation when it happens. Are there any links about this issue?

I already shrank the database, and that freed up the disk space. Do I need to worry about performance?

If performance is an issue with the shrink operation, how can I free the disk space without shrinking the database?

It really depends on your hardware. With an HDD you would need to run a defragmentation command; on an SSD you should be fine.

You don’t have to, but you can monitor it closely if you feel the need.

Shrinking the database is the only way I know to effectively free the disk space with SQL Server. And the fragmentation is manageable, so you can end up in a stable situation.
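To check how much index fragmentation the shrink left behind and fix it, something like this T-SQL sketch works (it assumes you are connected to the SonarQube database; `dbo.issues` in the rebuild line is just one example table):

```sql
-- List indexes in the current database with more than 30 % fragmentation.
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Common rule of thumb: REORGANIZE between 5 and 30 %, REBUILD above 30 %.
ALTER INDEX ALL ON dbo.issues REBUILD;
```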

We are getting a bit out of the scope of SonarQube here. If you have a DBA or database expert in your company, I suggest you reach out to them for help with database management :slight_smile:


That makes sense.

I really appreciate all your help.
Thank you very much.