Allow marking duplications as "false-positives"/"won't fix"

(Michel Albert) #1

I’ve recently run into a situation where a code block was duplicated by design. Now our integration pipeline is failing because new commits trigger a quality-gate failure due to this duplication.

We write code which is used to interact with network devices, and one of the vendors recently split the running OS on those devices; the two variants now have separate life-cycles. Part of our code was duplicated to represent this split, so we can evolve both code-bases independently.

It would be nice to be able to flag that duplication as “won’t fix”.

We could remove the duplication, but duplicating it was an internal design decision. Since that decision is now blocking the integration pipeline, we will remove the duplication again, even though it’s not quite what we want. At least this way the pipeline will pass.

It would be nice to be able to flag duplications as “won’t fix” for some of these corner cases.

(G Ann Campbell) #2


Could you elaborate on the QG condition that’s failing? Is it using one of the duplications metrics, or is it a condition related to issues?

In general, there’s no way to mark anything metric-related as FP/WF. The only way to do that is to challenge the counting algorithm & persuade us to change it. :slight_smile:


(Peter Connor) #3

This is something we would like implemented as well…

The only type of example that keeps popping up for us is around JSON objects in an array, for example:

var arr = [
	{"foo": 1, "bar": 2},
	{"foo": 3, "bar": 4},
	{"foo": 5, "bar": 6}
];
The views within our development team are pretty split about this: half say it is duplication, half say it isn’t. The objection to flagging this as duplication is that refactoring the code makes it more complex and harder to read. The benefit, though, is that you know all properties are consistent across all objects.

var arr = [];

function pushObject(foo, bar) {
	arr.push({"foo": foo, "bar": bar});
}

But, like the OP says, we have no choice but to use the second method, because otherwise the quality gate fails and there is no way around it apart from excluding the whole file from analysis, which is not what we want to do.
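
As a partial workaround (a sketch, assuming a standard SonarQube scanner setup), files can be excluded from copy-paste detection only, rather than from all analysis, via the `sonar.cpd.exclusions` analysis parameter; the paths below are hypothetical examples:

```properties
# sonar-project.properties
# Exclude specific files from copy-paste detection (CPD) only;
# other rules, issues, and metrics still apply to these files.
# Paths are illustrative placeholders.
sonar.cpd.exclusions=src/vendor_split/**,src/data/arrays.js
```

Note this is still all-or-nothing per file: it removes the whole file from the duplication metric and does not allow marking an individual duplication as “won’t fix”.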