Sonar scan intermittently slow on "Detect file moves"

Hello Sonar Community,
We host a SonarQube Community instance (v8.4.1.35646) for several customers, and lately one project’s scan will sometimes take up to 40 minutes in the “Detect file moves” step. We haven’t been able to identify why some of this customer’s scans run in 3 seconds while others take much longer. I’ve attached the ce.log entries from a normal Sonar scan and a slow Sonar scan.
We checked RDS database health and did not see any outlier metrics. Hoping someone has run into this before or can provide insight into the long-running task.
Detect file moves | reportFiles=44 | dbFiles=45 | addedFiles=1 | movedFiles=1 | status=SUCCESS | time=2364708ms
normal_sonar_scan.txt (18.3 KB)
slow_sonar_scan.txt (18.3 KB)


Preliminarily (after cleaning out what I guess is Splunk noise from the line) I see this:

2021.06.04 12:34:59 INFO ce[AXnXvKEqHUaH5OaviXD2][o.s.c.t.s.ComputationStepExecutor] Detect file moves | reportFiles=44 | dbFiles=45 | addedFiles=1 | movedFiles=1 | status=SUCCESS | time=2364708ms

2021.06.03 16:35:06 INFO ce[AXnTli2RHUaH5OaviWMJ][o.s.c.t.s.ComputationStepExecutor] Detect file moves | reportFiles=45 | dbFiles=45 | addedFiles=1 | movedFiles=0 | status=SUCCESS | time=7ms

So in the slow case, one file was moved. What, if anything, can you tell us about that file? E.g. its size, original depth in the tree vs. new depth, how “far” it was moved, …
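To give a sense of why the file itself matters here: my rough mental model of this step (a simplified sketch, not the actual SonarQube implementation — the real matching algorithm and threshold differ) is that each file present in the database but missing from the report is compared, content against content, with each newly added file to score how similar they are, and a high-scoring pair is treated as a move:

```python
from difflib import SequenceMatcher


def similarity(old_lines, new_lines):
    # LCS-style match ratio between the two files' lines;
    # 1.0 means identical content. Cost grows with file size.
    return SequenceMatcher(None, old_lines, new_lines).ratio()


def detect_moves(removed, added, threshold=0.85):
    # removed/added: dicts of path -> list of source lines.
    # Every removed file is scored against every added file, so the
    # work grows with both file size and the number of candidates.
    # The 0.85 threshold is illustrative, not SonarQube's real value.
    moves = {}
    for old_path, old_lines in removed.items():
        best_path, best_score = None, 0.0
        for new_path, new_lines in added.items():
            score = similarity(old_lines, new_lines)
            if score > best_score:
                best_path, best_score = new_path, score
        if best_path is not None and best_score >= threshold:
            moves[old_path] = best_path
    return moves
```

With reportFiles=44 and dbFiles=45 the number of candidate pairs is tiny, which is why a single very large (or oddly matched) file can dominate the step's time.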


Hey Ann,
Thanks for getting back. If I look at the commit related to that slow Sonar scan, the only change to a file itself is a single file deletion; I do not see any files being moved. So maybe a deletion counts as a move? Is there a rule of thumb that deleting or moving a file in a codebase will make the next Sonar scan take considerably longer? Thank you for your help.


No. I don’t think so.

Since I don’t know the intricacies of the interface you’re in, I’m going to ask the dumb question: are you sure that handler.js and consts.js existed in the analysis before this one? (I.e. that crime-batch-mocked-response.js wasn’t moved to one of these new names?)


Hi Ann, that’s correct: those listed files previously existed, and it was crime-batch-mocked-response.js that was removed. Is there anything we should be considering when moving files? For example, say a large file were moved to a very deep level of the tree. Are there any actions we could take there, or is it just for awareness?
We worked with the customer to tweak their analysis configuration to be more accurate and tightly scoped, and we’re hoping we do not see this issue again.
Let us know if there’s anything else we can look into.
Thanks for your help,

Hi James,

I believe there can be performance issues with file-move detection for very large files, although I’m surprised that you’ve encountered a problem with a deletion. I have to admit at this point to being out of my depth. I’ve flagged your post for more expert attention.


We can go ahead and close this ticket. The customer has changed their analysis configuration to only include the folders/files they really care about, and that seems to have helped with task times by reducing the number of file-move candidates to examine. Thank you for taking a look.
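For anyone who finds this thread later: narrowing what the scanner indexes is typically done with the `sonar.inclusions` / `sonar.exclusions` analysis parameters in `sonar-project.properties`. An illustrative sketch — the paths below are hypothetical, not the customer’s actual settings:

```properties
# sonar-project.properties — hypothetical paths, shown only to illustrate scoping
sonar.sources=src
# Skip generated and vendored code entirely
sonar.exclusions=**/node_modules/**,**/generated/**
# Only index the application sources we care about
sonar.inclusions=src/**/*.js
```

Fewer indexed files means fewer added/removed candidates for the “Detect file moves” step to compare.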