Extremely slow analysis


I’m currently running SonarQube CE, version 8.9.5, and I sometimes see extremely slow analyses.
I managed to extract these execution times from ce.log:

Line   39: 2022.09.13 09:30:56 INFO  ce[AYM2CUAYbSX6o6UkLjpA][o.s.c.t.s.ComputationStepExecutor] Persist live measures | insertsOrUpdates=412570 | status=SUCCESS | time=1175250ms
Line   50: 2022.09.13 10:43:08 INFO  ce[AYM2CUAYbSX6o6UkLjpA][o.s.c.t.s.ComputationStepExecutor] Purge db | status=SUCCESS | time=4195068ms
Line 1020: 2022.09.13 10:48:20 INFO  ce[AYM2CUAYbSX6o6UkLjpA][o.s.c.t.CeWorkerImpl] Executed task | project=legacy:module:my-project-name | type=REPORT | branch=FEATURE-VIA-14828-3 | branchType=BRANCH | id=AYM2CUAYbSX6o6UkLjpA | submitter=gitlab_integration | status=SUCCESS | time=6942242ms

I contacted my SQL Server DBA, who tells me that some SonarQube queries often last more than 10 seconds. That is not great on its own, but not terrible either, unless they are run a huge number of times, as the log above could suggest.
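In case it helps others, here is roughly how one can look for the slow statements on the SQL Server side using the query-stats DMV. This is just a sketch (it requires the VIEW SERVER STATE permission, and the cache only reflects statements since the last restart/eviction):

```sql
-- Sketch: top cached statements by average elapsed time.
-- Times reported by this DMV are in microseconds.
SELECT TOP 20
    qs.execution_count,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
    qs.total_elapsed_time                      AS total_elapsed_us,
    SUBSTRING(st.text, 1, 200)                 AS statement_start
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY avg_elapsed_us DESC;
```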

What can I do to improve such slow execution times?
Thank you.

Just in case, here is the whole execution timeline for this analysis:

Extract report | status=SUCCESS | time=9811ms
Persist scanner context | status=SUCCESS | time=50ms
Propagate analysis warnings from scanner report | status=SUCCESS | time=45ms
Generate analysis UUID | status=SUCCESS | time=0ms
Load analysis metadata | status=SUCCESS | time=71ms
Initialize | status=SUCCESS | time=0ms
Build tree of components | components=10799 | status=SUCCESS | time=930ms
Validate project | status=SUCCESS | time=70ms
Load quality profiles | status=SUCCESS | time=222ms
Load Quality gate | status=SUCCESS | time=26ms
Load new code period | status=SUCCESS | time=25ms
Detect file moves | reportFiles=9425 | dbFiles=9425 | addedFiles=0 | status=SUCCESS | time=487ms
Load duplications | duplications=14903 | status=SUCCESS | time=594ms
Compute cross project duplications | status=SUCCESS | time=0ms
Compute size measures | status=SUCCESS | time=798ms
Compute new coverage | status=SUCCESS | time=452616ms
Compute coverage measures | status=SUCCESS | time=895ms
Compute comment measures | status=SUCCESS | time=28ms
Copy custom measures | status=SUCCESS | time=30ms
Compute duplication measures | status=SUCCESS | time=51ms
Compute size measures on new code | status=SUCCESS | time=34ms
Compute language distribution | status=SUCCESS | time=18ms
Compute test measures | status=SUCCESS | time=15ms
Compute complexity measures | status=SUCCESS | time=32ms
Load measure computers | status=SUCCESS | time=0ms
Compute Quality Profile status | status=SUCCESS | time=37ms
Execute component visitors | status=SUCCESS | time=650754ms
Checks executed after computation of measures | status=SUCCESS | time=0ms
Compute Quality Gate measures | status=SUCCESS | time=44ms
Compute Quality profile measures | status=SUCCESS | time=4ms
Generate Quality profile events | status=SUCCESS | time=31ms
Generate Quality gate events | status=SUCCESS | time=118ms
Check upgrade possibility for not analyzed code files. | status=SUCCESS | time=0ms
Persist components | status=SUCCESS | time=709ms
Persist analysis | status=SUCCESS | time=172ms
Persist analysis properties | status=SUCCESS | time=290ms
Persist measures | inserts=61 | status=SUCCESS | time=3412ms
Persist live measures | insertsOrUpdates=412570 | status=SUCCESS | time=1175250ms
Persist duplication data | insertsOrUpdates=3030 | status=SUCCESS | time=111970ms
Persist new ad hoc Rules | status=SUCCESS | time=0ms
Persist issues | cacheSize=328 KB | inserts=6 | updates=554 | merged=0 | status=SUCCESS | time=9569ms
Persist project links | status=SUCCESS | time=0ms
Persist events | status=SUCCESS | time=504ms
Persist sources | status=SUCCESS | time=14070ms
Persist cross project duplications | status=SUCCESS | time=0ms
Enable analysis | status=SUCCESS | time=320ms
Update last usage date of quality profiles | status=SUCCESS | time=411ms
time=17ms | sql=select p.prop_key as "key", p.is_empty as empty, p.text_value as textValue, p.clob_value as clobValue, p.component_uuid as componentUuid, p.user_uuid as userUuid from properties p where p.prop_key=? and p.component_uuid is null and p.user_uuid is null | params=sonar.showProfiling
Purge db | status=SUCCESS | time=4195068ms
time=13ms | sql=SELECT p.uuid, p.kee, p.name, s.created_at, p.tags, p.qualifier FROM projects p LEFT OUTER JOIN snapshots s ON s.component_uuid=p.uuid AND s.islast=? WHERE p.qualifier in (?, ?) AND p.uuid=? | params=true, TRK, APP, AYM2BLmXbSX6o6UkLjo8
ES refresh request on indices 'projectmeasures' | time=25ms
time=1560ms | sql=select p.uuid as uuid, p.uuid_path as uuidPath, p.project_uuid as projectUuid, p.module_uuid as moduleUuid, p.module_uuid_path as moduleUuidPath, p.main_branch_project_uuid as mainBranchProjectUuid, p.kee as kee, p.name as name, p.long_name as longName, p.description as description, p.qualifier as qualifier, p.scope as scope, p.language as language, p.root_uuid as rootUuid, p.path as path, p.enabled as enabled, p.copy_component_uuid as copyComponentUuid, p.private as isPrivate, p.created_at as createdAt from components p where p.enabled=1 and p.copy_component_uuid is null and p.main_branch_project_uuid is null and p.scope = 'PRJ' and p.qualifier in ('TRK','VW','SVW','APP') and p.project_uuid = ? | params=AYM2BLmXbSX6o6UkLjo8
ES refresh request on indices 'components' | time=6ms
time=61644ms | sql=select i.kee,i.assignee,i.line,i.resolution,i.severity,i.status,i.effort,i.author_login,i.issue_close_date,i.issue_creation_date,i.issue_update_date,r.uuid,r.language,c.uuid,c.module_uuid_path,c.path,c.scope,c.project_uuid,c.main_branch_project_uuid,i.tags,i.issue_type,r.security_standards,c.qualifier from issues i inner join rules r on r.uuid = i.rule_uuid inner join components c on c.uuid = i.component_uuid and c.project_uuid = ? and i.project_uuid = ?  | params=AYM2BLmXbSX6o6UkLjo8, AYM2BLmXbSX6o6UkLjo8
ES refresh request on indices 'issues' | time=1226ms
Index analysis | status=SUCCESS | time=89329ms
time=8ms | sql=update project_branches set need_issue_sync = ?, updated_at = ? where uuid = ? | params=false, 1663065878134, AYM2BLmXbSX6o6UkLjo8
Update need issue sync for branch | status=SUCCESS | time=629ms
time=28ms | sql=SELECT count(1) FROM properties pp left outer join components pj on pp.component_uuid = pj.uuid where pp.user_uuid is not null and (pp.component_uuid is null or pj.uuid=?) and (pp.prop_key like ? or pp.prop_key like ? or pp.prop_key like ? or pp.prop_key like ?) | params=AX6-6hQsppfvE64_SHmb, notification.NewIssues.%, notification.NewFalsePositiveIssue.%, notification.SQ-MyNewIssues.%, notification.ChangesOnMyIssue.%
time=180ms | sql=SELECT u.uuid as uuid, u.login as login, u.name as name, u.email as email, u.active as "active", u.scm_accounts as "scmAccounts", u.salt as "salt", u.crypted_password as "cryptedPassword", u.hash_method as "hashMethod", u.external_id as "externalId", u.external_login as "externalLogin", u.external_identity_provider as "externalIdentityProvider", u.user_local as "local", u.is_root as "root", u.onboarded as "onboarded", u.reset_password as "resetPassword", u.homepage_type as "homepageType", u.homepage_parameter as "homepageParameter", u.last_connection_date as "lastConnectionDate", u.last_sonarlint_connection as "lastSonarlintConnectionDate", u.created_at as "createdAt", u.updated_at as "updatedAt" FROM users u WHERE u.uuid in ( ? , ? , ? , ? ) | params=AX6w8Jo-FCelE9T__prn, AX6wcdXsFCelE9T__prk, AX_UsPXEquyHVjbuQw_k, AYGpcnujKqdpu3J1DRku
time=13ms | sql=SELECT u.login as "login", 1 as "global", u.email as "email" FROM users u INNER JOIN properties p ON p.user_uuid = u.uuid and p.prop_key = ? and p.text_value = 'true' and p.component_uuid IS NULL WHERE u.email is not null UNION SELECT u.login as "login", 0 as "global", u.email as "email" FROM users u INNER JOIN components c on c.kee = ? INNER JOIN properties p ON p.user_uuid = u.uuid and p.prop_key = ? and p.text_value = 'true' and p.component_uuid = c.uuid WHERE u.email is not null | params=notification.NewIssues.EmailNotificationChannel, legacy:module:my-project-name, notification.NewIssues.EmailNotificationChannel
Send issue notifications | newIssuesNotifs=1 | newIssuesDeliveries=0 | myNewIssuesNotifs=0 | myNewIssuesDeliveries=0 | changesNotifs=1 | changesDeliveries=0 | status=SUCCESS | time=1260ms
Publish task results | status=SUCCESS | time=0ms
Trigger refresh of Portfolios and Applications | status=SUCCESS | time=0ms
time=8ms | sql=SELECT p.uuid as uuid, p.kee as kee, p.qualifier as qualifier, p.name as name, p.description as description, p.tags as tagsString, p.private as isPrivate, p.created_at as createdAt, p.updated_at as updatedAt FROM projects p where p.uuid=? | params=AX6-6hQsppfvE64_SHmb
time=4ms | sql=select uuid, name, url, project_uuid as projectUuid, secret, created_at as createdAt, updated_at as updatedAt from webhooks where project_uuid = ? order by name asc | params=AX6-6hQsppfvE64_SHmb
time=8ms | sql=select uuid, name, url, project_uuid as projectUuid, secret, created_at as createdAt, updated_at as updatedAt from webhooks where project_uuid is null order by name asc
Webhooks | globalWebhooks=0 | projectWebhooks=0 | status=SUCCESS | time=170ms
Pull Request Decoration | status=SUCCESS | time=0ms
time=8ms | sql=select cq.uuid, cq.task_type as taskType, cq.component_uuid as componentUuid, cq.main_component_uuid as mainComponentUuid, cq.status as status, cq.submitter_uuid as submitterUuid, cq.worker_uuid as workerUuid, cq.started_at as startedAt, cq.created_at as createdAt, cq.updated_at as updatedAt from ce_queue cq where cq.uuid=? | params=AYM2CUAYbSX6o6UkLjpA
time=349ms | sql=update ce_activity set is_last=0, updated_at=? where is_last=1 and is_last_key=? | params=1663065887332, REPORTAYM2BLmXbSX6o6UkLjo8
time=42ms | sql=update ce_activity set main_is_last=0, updated_at=? where main_is_last=1 and main_is_last_key=? | params=1663065887332, REPORTAX6-6hQsppfvE64_SHmb
time=184ms | sql=insert into ce_activity ( uuid, component_uuid, main_component_uuid, analysis_uuid, status, task_type, is_last, is_last_key, main_is_last, main_is_last_key, submitter_uuid, submitted_at, worker_uuid, execution_count, started_at, executed_at, created_at, updated_at, execution_time_ms, error_message, error_stacktrace, error_type ) values ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 0, ?, ?, ?, ?, ?, ?, ?, ? ) | params=AYM2CUAYbSX6o6UkLjpA, AYM2BLmXbSX6o6UkLjo8, AX6-6hQsppfvE64_SHmb, AYM2DQ2hQQsOMcHf_c9_, SUCCESS, REPORT, true, REPORTAYM2BLmXbSX6o6UkLjo8, true, REPORTAX6-6hQsppfvE64_SHmb, AX6wOXdZFCelE9T__prB, 1663058920054, AYKIRPqiQQsOMcHfBPFO, 1663059158705, 1663065887332, 1663065887332, 1663065887332, 6728627, 12, 2005, 12
time=212838ms | sql=delete from ce_task_input where task_uuid in ( ? ) | params=AYM2CUAYbSX6o6UkLjpA
time=16ms | sql=delete from ce_queue where uuid=? and status = ? | params=AYM2CUAYbSX6o6UkLjpA, IN_PROGRESS
Executed task | project=legacy:module:my-project-name | type=REPORT | branch=FEATURE-VIA-14828-3 | branchType=BRANCH | id=AYM2CUAYbSX6o6UkLjpA | submitter=gitlab_integration | status=SUCCESS | time=6942242ms


I notice that persisting the duplication data also takes a while. How big is the code base in question?


The whole instance is 1.6M LOC.
This single project is 1.3M. It is of course the main one, and the most frequently analyzed each day.

Here is a report from SQL Server for a simulated
delete from live_measures where project_uuid = xxx, where the table currently contains ~300k rows for that project.

This statement alone lasts 13 s for a single delete, and during the DB purge and data-persistence steps there are lots of deletes before reinsertion.
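For reference, a delete like this can be timed without actually destroying data by wrapping it in a transaction and rolling it back. A sketch (the uuid value and the column length are placeholders; substitute real ones):

```sql
-- Sketch: measure the cost of the delete, then undo it.
-- N'xxx' is a placeholder; substitute a real project uuid.
DECLARE @project_uuid nvarchar(50) = N'xxx';

SET STATISTICS TIME ON;  -- report CPU and elapsed time
SET STATISTICS IO ON;    -- report logical/physical reads per table

BEGIN TRANSACTION;
DELETE FROM live_measures WHERE project_uuid = @project_uuid;
ROLLBACK TRANSACTION;    -- undo the delete; we only wanted the timings
```

The STATISTICS IO output is useful here: a very high logical-read count for a delete filtered on project_uuid would suggest the delete is scanning rather than seeking on an index.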

By the way, not related to the performance issue: I saw that the data type of the project_uuid column is nvarchar, which is quite inefficient in SQL Server. Nvarchar stores Unicode characters, which is useless here because project_uuid only contains plain ASCII characters. Each character is stored in 2 bytes, so every id takes twice the size it needs. This is for project_uuid, but all the other PK/FK columns are probably in the same case.
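The storage overhead is easy to see with DATALENGTH (just a quick sanity check, nothing SonarQube-specific, using one of the uuids from the log above):

```sql
-- nvarchar stores 2 bytes per character (UCS-2/UTF-16);
-- varchar stores 1 byte per character for plain ASCII.
SELECT
    DATALENGTH(N'AYM2CUAYbSX6o6UkLjpA') AS nvarchar_bytes,  -- 40
    DATALENGTH('AYM2CUAYbSX6o6UkLjpA')  AS varchar_bytes;   -- 20
```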


For a 1.3M LOC project, I’m not surprised that server-side processing of the analysis report takes a while.

And I see that the task overall takes ~115 minutes (with the DB purge alone at ~70), and the live measures took ~20. It’s not clear to me that those numbers are unreasonable for a project of that size.

At the same time, we have made some improvements in the 9-series that would speed up analysis report processing, including:

  • SONAR-15321 Large number of entries in ‘issue_changes’
  • SONAR-17044 Optimize Compute Engine issue tracking and persisting of measures where file is marked as unchanged

Would it be possible for you to upgrade to 9.6? (I recognize that large instances are often stuck on the LTS by corporate policy.)

And thanks for the DB structure feedback. I’ll make sure the product team sees it.


I’ll try this as soon as possible.
I’ll give you feedback when I can.


Hey @Dominique_Jean-Prost ,

Indeed, we are using NVARCHAR, which can be inefficient for UUIDs, though I’m unsure whether that change of data type would improve performance significantly.

Nevertheless, you are making a good point, and I have created a ticket for it to be addressed in future versions of SQ: [SONAR-17525] - Jira.

