I was recently given responsibility for a C++ project previously assigned to a coworker who is leaving the company. The project was submitted to Sonar in November. There were accepted issues, but they’re only about comments, so the project has the status “Passed”.
When I open the solution in Visual Studio and use Sonar in connected mode, I get a great many Sonar errors. And yes, the code is pretty ugly…
How can these differences between SonarQube for IDE and SonarQube Cloud be explained?
You see the status as “Passed” because there are no violations/issues in the New Code (new analysis), or because only one analysis has been run on the project. The status is based on the Quality Gate that is set; by default, it applies only to New Code.
Also, the issues in SonarQube Cloud and VS Code should match. Please navigate to the Code tab in SonarQube Cloud, take an example file, check the violations reported in it, then open the same file in VS Code and compare: the issues should be the same.
I read the doc behind the link, and I think there was only one analysis (on the main branch). But then, how can we make sure the code on the main branch is clean, if Sonar is used in the IDE?
I followed the procedure you describe in SonarQube Cloud, on the main branch, but for a given file there was no error, while there were many in the IDE…
Hi,
When you run a second analysis on the main code, the Quality Gate will show as “Failed” if the analysis has issues or otherwise doesn’t comply with the Quality Gate conditions; it will pass if it does comply.
Please check the quality profile set for that project and the profile of the project you connected to in VS Code. There may be a difference in the number of activated rules between the quality profiles you are using.
If the project connected in the IDE and the project you are viewing are the same, then that’s an issue; however, I just checked the same scenario on my end, and it works as expected.
Please provide the SQ version and also check for and inform us about any additional plugins you are using in the IDE.
Well, I ran a SonarQube Server analysis on my develop branch, and it still passes.
I don’t have access to the Server configuration.
I don’t see how to check from Visual Studio which quality profile is used. The errors raised in Visual Studio should also be raised on the server; I know the corresponding rules are in the quality profile the server is supposed to use.
I think it’s more a problem related to the new code status.
If you’re in Connected Mode then the same profiles are used in Visual Studio and SonarQube Cloud. You can see which profiles those are on the project’s Information page.
Unless we’re talking about PR analysis or a short-lived branch, then which issues are raised has little to do with new code detection.
How did this project go from 400 errors to 0, with only 71 accepted issues? And why do I have these 400 errors locally in Visual Studio (and my VS project is connected)?
The analysis / scanner log is what’s output from the analysis command. Hopefully, the log you provide - redacted as necessary - will include that command as well.
Well, despite the link you provided, I’m not sure which log you’re asking for.
We use SonarCloud through Jenkins.
Should I give you the log created by Jenkins, which mentions Sonar (as “Sonar execution required, running in sonar build wrapper”) ?
I noticed some weird things in this log:
**17:53:03** [2025-03-04T16:53:03.714Z] Branch already analyzed = false
**17:53:03** [2025-03-04T16:53:03.729Z] The Sonar analysis will have to run twice.
Yet the branch has already been analyzed.
Also, some timeout stuff:
[Pipeline] // withSonarQubeEnv
[Pipeline] timeout
17:54:01 [2025-03-04T16:54:01.005Z] Timeout set to expire in 10 sec
[Pipeline] {
[Pipeline] waitForQualityGate
17:54:01 [2025-03-04T16:54:01.236Z] Checking status of SonarQube task 'MyToken' on server 'SonarCloud'
17:54:01 [2025-03-04T16:54:01.614Z] SonarQube task 'MyToken' status is 'IN_PROGRESS'
17:54:11 [2025-03-04T16:54:11.006Z] Cancelling nested steps due to timeout
[Pipeline] }
[Pipeline] // timeout
[Pipeline] timeout
17:54:11 [2025-03-04T16:54:11.177Z] Timeout set to expire in 10 sec
[Pipeline] {
[Pipeline] waitForQualityGate
17:54:11 [2025-03-04T16:54:11.280Z] Checking status of SonarQube task 'MyToken' on server 'SonarCloud'
17:54:11 [2025-03-04T16:54:11.567Z] SonarQube task 'MyToken' status is 'SUCCESS'
17:54:11 [2025-03-04T16:54:11.701Z] SonarQube task 'MyToken' completed. Quality gate is 'OK'
So first, I observe that this has been orchestrated with an elaborate set of scripts rather than a typical GitLab pipeline YAML file. You stated in the OP that you inherited this, so clearly you can’t answer for why it was done this way. You should consider making this a bit more standard; then you can easily follow the docs for analysis, and it will be easier to help you in the future.
Research tells me that & runs the called process in the background. I wonder if that’s why the logs seem to be lost. It’s been years since I gleefully left Windows behind, so I really don’t know. But can you try executing the two commands separately & see if we get all the logs?
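To illustrate the difference in a generic POSIX shell (these are placeholder commands, not your actual build steps, which I can’t see): a single `&` backgrounds the first command, so the script moves on immediately and that command’s output can be interleaved or lost, whereas `&&` runs the commands sequentially and keeps each log complete.

```shell
# 'cmd1 & cmd2' backgrounds cmd1: both run concurrently, output may
# interleave, and the script won't wait for cmd1 unless told to.
sleep 1 & echo "second runs immediately"
wait   # explicitly wait for the backgrounded job to finish

# 'cmd1 && cmd2' runs cmd1 to completion, then cmd2 only if cmd1
# succeeded, so each command's output stays in order and intact.
echo "first" && echo "second"
```

Splitting the two analysis commands onto separate lines (or joining them with `&&`) should make the full log visible.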
Something in your files made me think you were using GitLab. Here’s the starting point for Jenkins, which also has standardized pipeline configuration.
Just search the forum for “analysis logs”. You should find hundreds of hits.
Actually, we use GitLab as the repository, for merge requests, code review… and Jenkins for CI. I looked at the documentation about Jenkins, and it seems we do what we should.
I just sent you another log. The end is odd:
16:37:58.768 INFO 33/63 files marked as unchanged
16:37:58.768 INFO Analysis cache: 0/0 hits, 125 bytes
16:37:59.757 INFO EXECUTION FAILURE
16:37:59.759 INFO Total time: 16.211
Sonar detected some changes, but doesn’t report anything.
16:37:50.803 INFO Branch name: refs/heads/develop, type: short-lived
So first, this is being analyzed as a short-lived branch. That means that only the files changed in the branch will be included in the analysis. That will easily limit what you see in the UI.
But the real problem is here:
Analysis failed. That explains why you don’t always see results on the server.
Now, from this log it’s not completely obvious to me that the failure was caused by using the wrong version of the build-wrapper, but that’s the first thing I’d address.
And if you update the build-wrapper (you should probably redownload it every time, or at least make sure what you’ve got locally is the version on the server) and analysis still fails, then debug logging might help. To get that, add -Dsonar.verbose=true to the analysis command line.
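For reference, here’s a minimal sketch of what re-downloading the build-wrapper and running a verbose analysis could look like, assuming a Linux agent and a make-based build (the download URL is the standard SonarCloud one; your build command, paths, and scanner setup will differ):

```shell
# Re-download the build-wrapper each run so it stays in sync with the
# server-side analyzer version.
curl -sSLo build-wrapper.zip "https://sonarcloud.io/static/cpp/build-wrapper-linux-x86.zip"
unzip -o build-wrapper.zip

# Wrap a clean, full build so the wrapper can capture every compile command.
build-wrapper-linux-x86/build-wrapper-linux-x86-64 --out-dir bw-output make clean all

# Point the scanner at the wrapper output and enable debug logging.
sonar-scanner \
  -Dsonar.cfamily.build-wrapper-output=bw-output \
  -Dsonar.verbose=true
```

A full (non-incremental) build inside the wrapper matters: if the build skips unchanged files, the wrapper won’t capture their compile commands.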