False-"new" - sonarqube 10.0 local code-tree

I’m using SonarQube on a local directory which periodically gets completely overwritten by the latest sources (and .class files). A tar file gets unpacked, so technically all files receive a new ctime, but almost all of them still have the same mtime and content. (This is necessary because the actual build system cannot run the SonarQube scanner.)
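
To illustrate the timestamp effect, here is a minimal Python sketch (the archive and directory names are made up): after extraction, each file’s mtime matches the value stored in the archive, while the ctime reflects the moment of unpacking.

```python
import os
import tarfile
import time

# Minimal sketch: "sources.tar" and "worktree" are hypothetical names.
# Extracting a tar restores each member's stored mtime, but the filesystem
# assigns a fresh ctime (inode change time) -- tar cannot set ctime.
with tarfile.open("sources.tar") as tf:
    tf.extractall("worktree")
    for member in tf.getmembers():
        if not member.isfile():
            continue
        st = os.stat(os.path.join("worktree", member.name))
        assert int(st.st_mtime) == int(member.mtime)  # mtime carried over from the archive
        print(member.name,
              "mtime:", time.ctime(st.st_mtime),   # original timestamp, preserved
              "ctime:", time.ctime(st.st_ctime))   # time of unpacking, always "now"
```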

With this setup I’m seeing the same symptoms others have already described, except that in my case it has nothing to do with GitHub branches: whenever I update the sources (Java, C++, SQL) and let the scanner run on the updated tree, I get a couple of “new” bugs, smells, debt, etc., but on examining them I see that many are in code entirely unrelated to the modules that were actually updated.

The “new code” mode is set to “Previous version”…

Am I too far off the beaten path, or is this a problem worth looking further into?

I’d rather not upgrade, unless the problem is explicitly known and fixed in a newer version.

Hi,

What I think you’re saying is not that SonarQube itself gets overwritten, but that you’re performing analysis in a directory where the project gets overwritten because it’s obtained by unpacking an archive.

If that’s correct, then I’m not surprised you’re getting specious “new” issues: analysis relies on SCM data to understand what code is new. You should be running analysis in the checkout directory, after compile but before the tar file is created.

The actual build system can’t run Java? Really?

As a side note, you should consider upgrading regardless of this question. Non-LTS versions reach EOL as soon as the next version is released, so there will be no patches or fixes for 10.0. You should upgrade to 10.2 at your earliest convenience and plan to keep up with the roughly two-month release cycle.

HTH,
Ann

Right, the copy of the project’s source tree (on that extra Linux machine, which can run the SonarQube scanner for all our sources) gets updated to the new sources by unpacking a tar file.

The original sources are in CVS, and the CVS server is not even reachable (firewall) from that one extra machine running the scanner.

If the “new” detection is strictly tied to SCM access, then all I can do is tell my folks that they have no choice but to ignore the “new issues” feature completely. Well, that’s what I’ve already told them in the meantime.

Otherwise, if it can in principle detect which sources changed and which are the same as before, then there might be a bug in that detection, which I’d have some hope of getting fixed eventually.

The actual build system can run Java, but only up to Java 8. (Java 11 would be available but isn’t used on that machine; anything higher, such as the Java 17 required by recent SonarQube versions, can only be dreamt of: there is no Java newer than 11 for Oracle SPARC Solaris.) Our source also contains C++, which cannot be scanned on Solaris at all (shrug). Therefore we transfer the sources to a Linux machine and have them scanned there.

Hi,

In the absence of SCM integration, analysis makes a “best effort” attempt to determine what’s new based, I believe, on file dates. So even that is stymied by your “unpack the archive” methodology.
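
Purely as an illustration of the idea (this is not the scanner’s actual code), a naive date-based check might look like this, and you can see why it’s fragile whenever the whole tree is re-created from an archive:

```python
from pathlib import Path

def changed_since(root: str, last_analysis_epoch: float) -> list[Path]:
    """Flag files whose mtime is newer than the last analysis.

    Illustrative only -- not SonarQube's real algorithm. Whatever timestamp
    a heuristic like this keys on has to survive the tree being rebuilt,
    otherwise unchanged files look "new" (or changed files look old).
    """
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.stat().st_mtime > last_analysis_epoch]
```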

I think it’s worth noting, in case you decide to move where analysis runs, that while we don’t natively support CVS, there is an unmaintained (:frowning:) community plugin that provides that integration. No idea whether it still works.

C++ generally needs to be analyzed where it’s built, because the data the build-wrapper collects contains the build paths. If the paths analysis sees don’t match what the build-wrapper collected… :boom:

Ann

On unpacking a tar file, the tar utility sets each file’s timestamp (mtime) to the original time stored along with the file in the archive.

But even if a particular file were “changed” (i.e. has a new timestamp), the scanner probably shouldn’t present a random selection of the findings in it as “new”. Ideally it would recognize which findings of the latest scan were not already found previously (or, alternatively, just call all findings in that file “new”).
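
Roughly what I’d expect, as an illustrative Python sketch (the fingerprint scheme here is my own invention for the example, not SonarQube’s actual issue tracking, which is surely more involved):

```python
import hashlib

def fingerprint(rule_key: str, file_path: str, line_text: str) -> str:
    """Hypothetical issue identity: rule + file + hash of the offending
    line's content, so an issue survives mere line-number shifts."""
    digest = hashlib.sha1(line_text.strip().encode()).hexdigest()
    return f"{rule_key}:{file_path}:{digest}"

def new_issues(previous: set[str], current: set[str]) -> set[str]:
    # Only findings absent from the previous scan should be reported as "new".
    return current - previous
```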

Regarding C++ … it wasn’t trivial, but I got it working by creating a compilation-database JSON file that contains all the compiler invocations as they happened on the real production machine. Also, sonar-scanner “sees” a script named gcc that reports the relevant settings just like its Solaris original. The scanner generally does a good job on the C++ files, except for the few strange things I’ve been reporting here.
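
For anyone trying the same thing, here is a simplified sketch of that kind of converter. The build-log format, file names, and paths are assumptions for the example; the resulting compile_commands.json can be handed to the analyzer via the sonar.cfamily.compile-commands property.

```python
import json
import shlex

def build_compile_db(log_path: str, src_root: str, out_path: str) -> None:
    """Turn a captured build log (assumed format: one compiler invocation per
    line, as recorded on the production machine) into a JSON compilation
    database usable with sonar.cfamily.compile-commands."""
    entries = []
    with open(log_path, encoding="utf-8") as log:
        for line in log:
            argv = shlex.split(line)
            # Assume any .c/.cc/.cpp argument is a source file of this invocation.
            for src in (a for a in argv if a.endswith((".c", ".cc", ".cpp"))):
                entries.append({
                    "directory": src_root,    # where the compiler was invoked
                    "command": line.strip(),  # the original invocation, verbatim
                    "file": src,
                })
    with open(out_path, "w", encoding="utf-8") as out:
        json.dump(entries, out, indent=2)

# Hypothetical paths, for illustration:
build_compile_db("build.log", "/work/src", "compile_commands.json")
```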
