Sonar "Run Code Analysis" AzDo pipeline task started erroring for nodejs project

We started getting errors on the “Run Code Analysis” task (Version: 1.29.1) for one of our AzDo CI pipeline builds. We have not changed program code. The latest PR updated a Kubernetes yaml file that's external to the program being analyzed.

This is the beginning of the errors in the build log:

20:41:05.501 DEBUG: Launching command C:\hostedtoolcache\windows\node\12.16.1\x64\node.exe D:\a\1\s\.scannerwork\.sonartmp\eslint-bridge-bundle\package\bin\server 50094 127.0.0.1 D:\a\1\s\.scannerwork true false D:\a\1\s\.scannerwork\.sonartmp\eslint-bridge-bundle\package\custom-rules17181302863441391164\package
##[error]20:41:06.274 ERROR: internal/modules/cjs/loader.js:1174
20:41:06.274 ERROR:       throw new ERR_REQUIRE_ESM(filename, parentPath, packageJsonPath);
20:41:06.274 ERROR:       ^
20:41:06.274 ERROR: internal/modules/cjs/loader.js:1174
20:41:06.274 ERROR:       throw new ERR_REQUIRE_ESM(filename, parentPath, packageJsonPath);
20:41:06.274 ERROR:       ^
##[error]20:41:06.274 ERROR: 

This continues for another hundred lines or so. The build summary page shows 11 errors, yet the pipeline still passes. The task is not set to Continue on Error, and no tasks are set to Always Run.
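For context, the analysis steps in the pipeline are set up roughly like this (a simplified sketch, not our exact YAML; I'm assuming the SonarCloud extension here, where “Run Code Analysis” corresponds to SonarCloudAnalyze@1, and the connection/key names are placeholders):

```yaml
steps:
  # "Prepare Analysis Configuration" runs before the build
  - task: SonarCloudPrepare@1
    inputs:
      SonarCloud: 'sonarcloud-connection'   # placeholder service connection name
      organization: 'our-org'               # placeholder
      scannerMode: 'CLI'
      configMode: 'manual'
      cliProjectKey: 'our-project'          # placeholder

  # ... build and test steps ...

  # "Run Code Analysis" - continueOnError is not set, so it defaults to false
  # and an error here was expected to fail the job
  - task: SonarCloudAnalyze@1
```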

What can be done to fix this?

Hey @StingyJack,

Can you try moving the version to Node 14? (Node 12 is deprecated for analysis. I’m not saying we shouldn’t try to understand the error, since it is still technically supported, but this might help us understand where the issue is.)

The program is running on Node 12. I can ask if a task can be inserted after the tests are all run but before the Sonar task runs that switches to Node 14, but it’s probably going to be faster for you to revert/address the problem that just started appearing with the tool in the last week or so.
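If we do go that route, my understanding is it would just be a NodeTool step dropped in between the test steps and the Sonar task, something along these lines (a sketch only, not something I have run):

```yaml
steps:
  # ... existing build and test steps keep running under Node 12 ...

  # Switch the agent's active Node version to 14 just before the analysis
  - task: NodeTool@0
    inputs:
      versionSpec: '14.x'

  # "Run Code Analysis" should then launch the eslint-bridge under Node 14
  - task: SonarCloudAnalyze@1
```

As I understand it, the scanner picks up whatever node is first on the PATH (unless sonar.nodejs.executable is set), so swapping the version right before the analysis task should be enough.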

We have a build for a different program (also running on Node 12) that built without these errors on the 16th. It does log an ERROR: Failed to delete temp folder java.nio.file.AccessDeniedException: D:\a\1\s\.scannerwork\.sonartmp\7638323707350841477\jni.dll, which also does not fail the build, but it still seems to have been able to publish the results.

If these tools are throwing errors, or are unable to perform or report the analysis results, they need to be failing the entire build because they could be letting code that does not meet the quality standard into the codebase.
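In the meantime, the workaround I am looking at is a script step at the end of the job that reads this build’s timeline through the Azure DevOps REST API and fails the build if any earlier task logged errors. A rough sketch (untested, and it assumes the job’s access token is allowed to read the build):

```yaml
steps:
  # ... existing steps, including "Run Code Analysis" ...

  - pwsh: |
      # Pull this build's timeline and sum the errors recorded against each task
      $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/builds/$(Build.BuildId)/timeline?api-version=6.0"
      $timeline = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
      $errorTotal = ($timeline.records | Where-Object { $_.errorCount -gt 0 } |
                     Measure-Object -Property errorCount -Sum).Sum
      if ($errorTotal -gt 0) {
        Write-Host "##vso[task.logissue type=error]$errorTotal error(s) were logged by earlier tasks; failing the build."
        exit 1
      }
    displayName: 'Fail the build if any task logged errors'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
```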

Hey,

In fact, there was a fix in April that requires Node 12.22 as the minimum; that’s probably why your analysis was failing (with Node 12.16). (issue FP S4325 (`no-unnecessary-type-assertion`): `Object.values()` result · Issue #3235 · SonarSource/SonarJS · GitHub).

@Lena - Please don’t miss this statement. It’s a bug regardless of Node version.

Hey Andrew,

I see your point, and I’m sorry for overlooking it.
In fact, that was the behavior several years ago, but we decided that we can’t block the whole analysis if one sensor is unable to perform its analysis. There are warnings in the UI which we use to let the user know if something went wrong during analysis, but we use them only if we are sure the user can fix it (e.g., a wrong report path for coverage).

So unfortunately there is not much I can do here. But thanks for sharing.

For about two months, SonarCloud was reporting there were 0 lines of code in a repository because the scanner was swallowing errors.

The build pipeline was reporting errors but not failing the job.

If this kind of thing could be scanned for, I’m sure Sonar would have rules reporting both of these as Blocker by default.

EDIT: @Lena That last part is a reference to the irony of a company that makes code quality tools, is sometimes very opinionated on these topics, and in this case is intentionally doing something that is widely considered bad design. Programs that are not operating correctly should Fail Fast, and fail visibly. Hiding errors that can corrupt the state or results has, in this case, led to a poorer user experience than if the tool had just failed when it could not do the job expected of it.
