[java] A false negative in rule RSPEC-4143 (S4143)

Versions used

  • sonarqube-9.2.2.50622
  • sonar-scanner-cli-4.6.2.2472-linux

Error observed

  • RSPEC-4143 raises no issue on the case below (false negative)

Steps to reproduce

  1. Analyze the minimized test case below with SonarQube and Sonar-Scanner:
import java.util.*;
public class Test {
    public static void foo() {
        HashMap<String, String> letters = new HashMap<>();
        letters.put("a", "Apple");
        int a = 1; // initialized so the snippet compiles
        int b = 2;
        int c = a + b;
        letters.put("a", "Boy"); // line 9: expected S4143 issue, none is raised
    }
}
  2. Then, observe that no warning is raised at line 9 (the second letters.put("a", "Boy") call). A minimal scanner setup is sketched below for reference.
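
A minimal setup along these lines can be used to run the scan (the project key, paths, and server URL are placeholders, adjust them to your environment):

# sonar-project.properties -- illustrative values only
sonar.projectKey=s4143-fn-repro
sonar.sources=.
# compiled classes are required for Java analysis; run javac first
sonar.java.binaries=.
sonar.host.url=http://localhost:9000

Then run sonar-scanner from the project directory.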

Potential workaround
I think the rule should expand its scope of analysis instead of only looking at consecutive statements.

Hey @Belle,

Thanks for the feedback. You are right, this is a False Negative (FN). However, this one is very unlikely to be fixed anytime soon. The rule was initially written to target lists of assignments that repetitively set values over many statements in a row, as typically found in configuration classes:

import java.util.HashMap;
import java.util.Map;

class Config {

  Map<String, String> animals = new HashMap<>();

  void zoo() {
    animals.put("otter", "mammal");
    animals.put("platypus", "mammal");
    animals.put("beaver", "mammal");
    // ...
    animals.put("duck", "bird");
    animals.put("platypus", "bird"); // issue S4143: "platypus" is overwritten
  }
}

Indeed, trying to handle what happens in between two assignments with the same key is extremely difficult with a static analysis approach, since we cannot easily infer the consequences of the operations occurring there. Example:

import java.util.Map;

class A {
  void foo(Map<String, String> map) {
    map.put("a", "start");
    bar(); // what if the map is legitimately modified for key "a",
           // or if there is some asynchronous stuff happening?
    map.put("a", "done");
  }

  void bar() {
    // ...
  }
}
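
To make that risk concrete, here is a hypothetical variant (the class and method names are purely illustrative) where the value written by the first put is legitimately consumed in between, so an expanded rule flagging the second put would be a false positive:

import java.util.HashMap;
import java.util.Map;

class ProgressReporter {

  private final Map<String, String> status = new HashMap<>();

  void run() {
    status.put("job", "started");
    publish();                 // the "started" value is read and reported here,
                               // so overwriting it below is intentional
    status.put("job", "done"); // an expanded S4143 would wrongly flag this
  }

  void publish() {
    status.forEach((k, v) -> System.out.println(k + " = " + v));
  }
}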

We believe that if we try to update the rule to cover more cases, such as the one you describe, it will ultimately introduce too many new False Positives (FPs), winning only a few new True Positives (TPs) in the battle, at the cost of a much more complex rule to maintain on our end.

At SonarSource, we try to maximize the rate of TPs over FPs… at the cost of some known FNs. Of course, we only do that to make sure that the issues our analyzers raise always make sense to our users, and we try to fix as many reported FPs as possible. We believe that in the case of S4143, the ratio won’t be good if we try to cover more complex cases… and if users stop trusting the rule, we all lose.

Cheers,
Michael
