Sonarsource/sonarqube-scan-action@master failed to install sonar-scanner-cli

Hi Support team.
We are currently experiencing an issue after upgrading from sonarcloud-github-action to sonarqube-scan-action. When the sonar-scanner-cli cache cannot be found and the CLI needs to be reinstalled on our self-hosted runner, we receive the error mv: cannot overwrite ‘/home/runner/work/_temp/sonar-scanner-cli-6.2.1.4610-Linux-X64/sonar-scanner-6.2.1.4610-linux-x64’: Directory not empty. Could you please let me know why we are seeing this issue and how to fix it? Is it safe to remove /home/runner/work/_temp/sonar-scanner-cli-6.2.1.4610-Linux-X64 before running the Sonar scan, and would that cause any issues? We are using a self-hosted runner provisioned on AWS EC2.
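For reference, the kind of pre-clean step we have in mind would be something like the following, placed right before the scan step (just a sketch, not tested yet, using the path from the error above):

# hypothetical workaround: pre-clean before the existing scan step
- name: Clean up stale sonar-scanner directory
  run: rm -rf /home/runner/work/_temp/sonar-scanner-cli-6.2.1.4610-Linux-X64
- uses: sonarsource/sonarqube-scan-action@master
  # ...existing with:/env: configuration unchanged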
Logs:

Run sonarsource/sonarqube-scan-action@master
  with:
    projectBaseDir: .
    args: -Dsonar.organization=aamc-org-github -Dsonar.sources=src -Dsonar.projectKey=aamc-org_ps-iam-oauth-client-secrets-nodejs-cdk -Dsonar.test.inclusions="**/testing/**,**/*.spec.ts,**/*.test.ts" -Dsonar.javascript.lcov.reportPaths=./coverage/lcov.info -Dsonar.verbose=true -Dsonar.coverage.exclusions="**/*spec.ts" -Dsonar.tests=src
    scannerVersion: 6.2.1.4610
    scannerBinariesUrl: https://binaries.sonarsource.com/Distribution/sonar-scanner-cli
  env:
    ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION: true
    GITHUB_TOKEN: ***
    SONAR_TOKEN: ***

Run ${GITHUB_ACTION_PATH}/sanity-checks.sh
Run actions/cache@v4.0.2
Cache not found for input keys: sonar-scanner-cli-6.2.1.4610-Linux-X64
Run ${GITHUB_ACTION_PATH}/install-sonar-scanner-cli.sh
+ mkdir -p /home/runner/work/_temp/sonarscanner
+ cd /home/runner/work/_temp/sonarscanner
+ SCANNER_FILE_NAME=sonar-scanner-cli-6.2.1.4610-linux-x64.zip
+ SCANNER_URI=https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-6.2.1.4610-linux-x64.zip
+ command -v wget
+ wget --no-verbose --user-agent=sonarqube-scan-action https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-6.2.1.4610-linux-x64.zip
2024-12-09 13:39:48 URL:https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-6.2.1.4610-linux-x64.zip [57996219/57996219] -> "sonar-scanner-cli-6.2.1.4610-linux-x64.zip" [1]
+ unzip -q sonar-scanner-cli-6.2.1.4610-linux-x64.zip
+ mv sonar-scanner-6.2.1.4610-linux-x64 /home/runner/work/_temp/sonar-scanner-cli-6.2.1.4610-Linux-X64
mv: cannot overwrite ‘/home/runner/work/_temp/sonar-scanner-cli-6.2.1.4610-Linux-X64/sonar-scanner-6.2.1.4610-linux-x64’: Directory not empty


We’re also having this issue and it’s blocking our pipeline. Any help would be appreciated.

Hi @fjiang-aamc and @ianpogi5,

Thanks for your report and apologies for the issue caused by this update.

If the update is blocking your pipeline, as an immediate action, you can pin the action to sonarcloud-github-action@v3.1.0, which, while deprecated, is still fully functional. That should unblock you right away.

In general, we suggest always using a fixed version of the GitHub action instead of pointing to master to avoid pipeline outages in case of issues like this one. See this post, as well as the “Upgrade when using the master version of the action” section of this post, for more details.
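Concretely, pinning means referencing a released tag in your workflow instead of master. A minimal sketch, using the v3.1.0 tag mentioned above (keep your existing with:/env: configuration as it is):

# pin to a released tag instead of master
- uses: sonarsource/sonarcloud-github-action@v3.1.0
  env:
    SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}  # secret name assumed; reuse whatever you pass today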

Let me know if that unblocks you.

In the meantime, I will try to understand how you ended up with a non-empty scanner folder despite a cache miss.

Best regards,
Antonio

hi Antonio
Thank you for your quick reply. Switching back to the sonarcloud action fixed this issue, and we are currently monitoring all our workflows.
We have 12 self-hosted runners, and jobs are assigned to them randomly. How does the Sonar action manage the cache? After a successful run, is the cache cleaned up or retained on the runner, and when is it cleaned up? Is there any difference between the new action and the old one? Thank you.

Hi @antonio.aversa, we were using sonarqube-scan-action@v4.1.0 when we got the issue. We reverted to sonarcloud-github-action@v3.1.0 and it is working again, but we get the deprecation warning.

Hi @fjiang-aamc,

Thank you for your quick reply. Switching back to the sonarcloud action fixed this issue, and we are currently monitoring all our workflows.

That’s good news. sonarcloud-github-action is deprecated, but still fully functional. You can stay on v3.1.0 until we sort out this problem with the cache.

We have 12 self-hosted runners, and jobs are assigned to them randomly. How does the Sonar action manage the cache? After a successful run, is the cache cleaned up or retained on the runner, and when is it cleaned up?

The scanner cache is managed via actions/cache@v4.0.2: it is restored before the scan runs, and it is saved in the Post <scan job> step only on a cache miss (that is, when steps.sonar-scanner-cli.outputs.cache-hit != 'true'). After a cache miss, the Post job logs look like this:

Post job cleanup.
Post job cleanup.
/usr/bin/tar --posix -cf cache.tzst --exclude cache.tzst -P -C /home/runner/work/dart-tools-test1/dart-tools-test1 --files-from manifest.txt --use-compress-program zstdmt
Cache Size: ~50 MB (52600382 B)
Cache saved successfully
Cache saved with key: sonar-scanner-cli-6.2.1.4610-Linux-X64

When there is a cache hit, you should see a message like this in the logs of your job:

Post job cleanup.
Post job cleanup.
Cache hit occurred on the primary key sonar-scanner-cli-6.2.1.4610-Linux-X64, not saving cache.

Is there any difference between the new action and the old one?

Yes, there is a fundamental difference between v3 and v4 (both for sonarqube-scan-action and sonarcloud-github-action): v3 and below were based on Docker, and the Sonar Scanner CLI was baked into the sonar-scanner-cli-docker image, which was used as the base for the Docker image of the action.

v4 removes the Docker dependency and introduces a dependency on actions/cache, which is used to store the Sonar Scanner CLI in the GitHub cache and avoid downloading it on every execution of the pipeline. So the actions/cache dependency is now required.
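Schematically, the caching v4 performs is roughly equivalent to the following steps (a simplified sketch, not the actual action code; the key and path are the ones visible in your logs):

# simplified sketch of the v4 caching logic, not the actual implementation
- uses: actions/cache@v4.0.2
  id: sonar-scanner-cli
  with:
    path: ${{ runner.temp }}/sonar-scanner-cli-6.2.1.4610-Linux-X64
    key: sonar-scanner-cli-6.2.1.4610-Linux-X64

# the CLI is downloaded and unzipped only when nothing was restored from the cache
- if: steps.sonar-scanner-cli.outputs.cache-hit != 'true'
  run: ${GITHUB_ACTION_PATH}/install-sonar-scanner-cli.sh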

Could you share the log of your Post <scan job>? I am trying to understand whether, after a cache miss, anything in your setup prevents the cache from being saved. I think it may be related to how the cache is handled on self-hosted runners.

It may be that RUNNER_TEMP, the temporary directory we use to store the Sonar Scanner CLI, is not cleaned across job executions.
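If you want to verify that on one of your runners, a throwaway diagnostic step placed right before the scan could look like this (just a sketch, the step name is hypothetical):

# hypothetical diagnostic step: see what is left in the runner temp directory before the scan
- name: Inspect RUNNER_TEMP
  run: ls -la "$RUNNER_TEMP" || true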

Thanks,
Antonio

Hi @ianpogi5,

Yes, v3 is deprecated, but still functional. We are keeping it up to date with bug fixes and security patches until v4 is up to speed.

As with @fjiang-aamc, it would be useful if you could share your logs, in particular the ones related to actions/cache. Part of those logs appears in the main job, something like the following:

Run sonarsource/sonarqube-scan-action@v4
Run ${GITHUB_ACTION_PATH}/sanity-checks.sh
Run actions/cache@v4.0.2
<interesting part here>

The other interesting part is in the post job:

Post job cleanup.
Post job cleanup.
/usr/bin/tar --posix -cf cache.tzst --exclude cache.tzst -P -C /home/runner/work/dart-tools-test1/dart-tools-test1 --files-from manifest.txt --use-compress-program zstdmt
Cache Size: ~100 MB (105233834 B)
Cache saved successfully
Cache saved with key: sonar-scanner-cli-6.2.1.4610-Linux-X64

Thanks,
Antonio


Hi Antonio,
Below is the Post job log after a successful scan:

Post job cleanup.
Post job cleanup.
Cache hit occurred on the primary key sonar-scanner-cli-6.2.1.4610-Linux-X64, not saving cache.


Hi Antonio,
Another question: why is the Sonar cache that is being saved so large?

Hi @fjiang-aamc,

Has the cache size been growing since the first few executions?
That may be an indication that the runner's temporary directory is not being cleaned after each run.
To be on the safe side, we are going to check the target directory before running mv and clean it if necessary. We are also going to add a warning, so we can tell whether the temporary directory is indeed not empty on self-hosted runners.
We plan to do a new release early next week, and this change should be in it. I will ping you here once we have the new version.
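The change to install-sonar-scanner-cli.sh will be along these lines (a sketch of the intent, not the final patch; the SCANNER_DIR variable and the hard-coded version are only illustrative, taken from your logs):

# sketch only: remove a leftover scanner directory before moving the fresh one in place
SCANNER_DIR="$RUNNER_TEMP/sonar-scanner-cli-6.2.1.4610-Linux-X64"
if [ -d "$SCANNER_DIR" ]; then
  echo "::warning::$SCANNER_DIR already exists, removing it before installing the scanner"
  rm -rf "$SCANNER_DIR"
fi
mv sonar-scanner-6.2.1.4610-linux-x64 "$SCANNER_DIR"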

Best regards,
Antonio


Hi @fjiang-aamc,

We have just released a new version of the GitHub action - v4.2.
You can read more about the new release in this community announcement or on GitHub.

We now clean up the scanner directory in the runner's temporary folder before proceeding with the mv.

Let us know if that helps with your issue,
Thanks
Antonio


Thank you! I’ll try this after Christmas as I don’t want to introduce new things while most people are on vacation :grinning:


Hi @fjiang-aamc,

All tests and user reports in the last month indicate that the problem should be solved.
So I am un-assigning myself from this issue.

Feel free to update this thread if you still encounter issues with the latest release of sonarqube-scan-action (v4.2.1), or to create a new post in the community if you reproduce any issue after this thread is closed.

Best regards,
Antonio

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.