Version: sonarsource/sonarcloud-scan:1.0.1 (via BitBucket pipelines)
Error:
INFO: 21 source files to be analyzed
/usr/bin/run-scanner.sh: line 24: 11 Killed sonar-scanner "${ALL_ARGS[@]}" 2>&1
12 Done | tee "${SCANNER_REPORT}"
SonarCloud analysis failed.
Build a C# project with Bitbucket Pipelines configured to reproduce. The error shows the session being killed, causing the analysis to fail.
Build Data.txt (4.8 KB)
Hello @technophobe ,
A few questions:
Did your scans succeed in the past?
Is this error consistently reproducible?
If it is reproducible: does it always end on the exact same last log message?
It seems that the scan was killed from the “outside”; this could happen if the scan was using too much memory.
It seems to stop while analyzing XML files, do you happen to have any big XML files in your repository?
You could try to configure a memory limit for the scanner; see the -Xmx parameter in this post: Bitbucket memory limit exceeded.
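As a concrete illustration of that suggestion, here is a minimal sketch of capping the scanner JVM heap from the pipeline configuration. It assumes the pipe passes the standard SONAR_SCANNER_OPTS environment variable through to sonar-scanner; the 512m value is only an example, not a recommendation:

```yaml
# bitbucket-pipelines.yml fragment (sketch)
pipelines:
  default:
    - step:
        script:
          - pipe: sonarsource/sonarcloud-scan:1.0.1
            variables:
              # Cap the scanner JVM max heap (assumption: the pipe
              # forwards this variable to the sonar-scanner process)
              SONAR_SCANNER_OPTS: "-Xmx512m"
```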
It has never worked before
I can reproduce it each time. Tried 3-5 times.
It does seem to end in the same error message.
I do have a lot of XML files as part of my WPF/UI.
Memory has been changed using this value; it still seems to crash.
janos (Janos Gyerik) — July 17, 2020, 4:35pm — #5
Can you please share the relevant lines from the pipeline configuration showing how you increased the memory?
janos (Janos Gyerik) — July 31, 2020, 1:35pm — #8
The technique you use for increasing memory for the pipe looks like the one in this doc. I have a feeling this is specific to running docker commands in pipelines, and I’m not sure it applies to pipes.
I’m wondering if the technique in this other doc might be more effective:
options:
size: 2x # all steps in this repo get 8GB memory
pipelines:
default:
- step:
size: 2x # or configure at step level
script:
- ...
Can you please give this a try?
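For completeness, the snippet above can be combined with the sonarcloud-scan pipe from this thread into one configuration; this is only a sketch, with the step layout assumed rather than taken from your actual pipeline:

```yaml
options:
  size: 2x          # all steps in this repo get 8 GB memory
pipelines:
  default:
    - step:
        size: 2x    # or configure at the step level instead
        script:
          - pipe: sonarsource/sonarcloud-scan:1.0.1
```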