How to fix a 'The "Rule-of-Zero" should be followed' issue?

Hello,

I have a code smell identified by SonarCFamily that I don't know how to deal with.
Initially, I had the following code:

class A {
public:
	A() {}
	~A() { /* Important things to do (no resource management) */ }
};

One bug (S3624) and one code smell (S4963) are then identified. I understand I have a choice to make for performance reasons (according to About sonar cloud C++11 constructor rule Is it BUG?).
So I tried to solve them with the following code:

class A {
public:
	A() {}
	~A() { /* Important things to do (no resource management) */ }

	A(const A&) = delete;
	A& operator=(const A&) = delete;
	A(A&&) = delete;
	A& operator=(A&&) = delete;
};

Despite this modification, I still have one code smell (S4963).

Is there a way, or a general strategy, to resolve this kind of code smell? Or is it just a warning to draw my attention to a performance trade-off that I need to think about and accept?
Thanks.

Developer Edition Version 8.0 (build 29455)
Code Analyzer for C, C++, Objective-C : SonarCFamily 6.7.0 (build 15300)

Hello @Ferris,

I believe S3624 is a good rule to always follow. I don't believe that is the case for S4963.

S4963 isn't always about performance. It can be about simplicity and maintainability: by always using RAII objects as class fields (like unique_ptr) instead of defining special member functions. Not all classes should or can follow it; there are exceptions. This rule helps you find violations and make sure that they are all valid exceptions. In short, this rule should be enabled in codebases where the default is to always follow the rule of zero.
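To illustrate, here is a minimal sketch of a class that follows the rule of zero because every resource it uses is held in a self-managing RAII member. The class name `Document` and its members are hypothetical, chosen just for this example:

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Hypothetical example of the rule of zero: every member manages
// itself, so no user-declared destructor, copy, or move is needed.
class Document {
public:
    explicit Document(std::string name) : name_(std::move(name)) {}

    void addPage(int page) { pages_.push_back(page); }
    std::size_t pageCount() const { return pages_.size(); }
    const std::string& name() const { return name_; }

private:
    std::string name_;                // RAII: frees its buffer itself
    std::vector<int> pages_;          // RAII: frees its storage itself
    std::unique_ptr<int> cache_;      // heap resource, still no ~Document() needed
    // Note: the unique_ptr member automatically makes Document
    // move-only; copying is implicitly deleted, which is usually
    // the right default for a class holding a unique resource.
};
```

The compiler-generated special members then do the right thing for free, which is exactly what S4963 pushes for.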

We have already added this exception section to the rule description that should show up in the next release:

Exceptions

There are several cases where this rule should not be followed: for instance, if your class manually handles a resource, logs when being constructed/copied, maintains some kind of counter, or has non-transient data that should not be copied (like capacity for std::vector)… In that case, it should still follow the rule of 5 (S3624). And you should consider whether you can isolate this specific behavior in a base class or a dedicated data member, which would allow you to still follow the rule of 0.
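The "isolate this specific behavior in a dedicated data member" suggestion could look like the following sketch. The names `LifetimeLogger` and `Widget` are hypothetical; the idea is that the logging done in the destructor moves into a small dedicated type, so the enclosing class declares no special members at all:

```cpp
#include <iostream>

// Hypothetical helper that isolates the "log on construction and
// destruction" behavior. It follows the rule of five itself
// (destructor defined, copies deleted, which suppresses moves).
class LifetimeLogger {
public:
    LifetimeLogger()  { std::cout << "constructed\n"; }
    ~LifetimeLogger() { std::cout << "destroyed\n"; }
    LifetimeLogger(const LifetimeLogger&) = delete;
    LifetimeLogger& operator=(const LifetimeLogger&) = delete;
};

// The enclosing class declares no special members: it follows the
// rule of zero, and the logging happens through its member.
class Widget {
public:
    int value() const { return 42; }

private:
    LifetimeLogger logger_;
};
```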

Thanks,
Abbas

In addition to the answer from @Abbas, I'd like to know, out of curiosity, what is the nature of these important things you need to do. Apart from logging and resource management, I can't think of a really good thing to do in a destructor, so I'd like to understand your case better.

Thank you!

Thanks a lot for these precisions.

In my case, to make it simple, I have an object that notifies another object that it is being destroyed. And indeed, the other object manages resources and provides logs. So it can be seen as an implicit resource-management and logging system. But I was wondering what happens in general cases, including the examples given as exceptions.

Actually, I was thinking that a class should be defined with either the rule of 0 or the rule of five. As soon as that is not the case, I understand that the two options are proposed (S3624 & S4963).
In my case, I chose the rule of five by explicitly deleting the copy and move operations, which is not the default behavior. So I don't understand why the rule of 0 is still proposed.

@Ferris

That is what S3624 is doing on its own:
When the "Rule-of-Zero" is not applicable, the "Rule-of-Five" should be followed

S4963 is more like "Always follow the rule of zero": if you don't want to actively push to always use the rule of zero in your codebase (even when the rule of five is followed), S4963 should not be enabled. The value of S4963 is to push toward design changes that avoid any sort of manual resource management, like the one you mentioned:

an object that notifies another object that it is being destroyed. And indeed, the other object manages resources and provides logs
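For that specific "notify on destruction" case, one possible design change (a sketch only; `DestructionNotifier` and `Session` are hypothetical names) is to isolate the notification in a small RAII member, so the enclosing class can go back to the rule of zero:

```cpp
#include <functional>
#include <utility>

// Hypothetical helper: invokes a callback when it is destroyed.
// It follows the rule of five itself (destructor defined, copies
// deleted, which also suppresses the implicit moves).
class DestructionNotifier {
public:
    explicit DestructionNotifier(std::function<void()> onDestroy)
        : onDestroy_(std::move(onDestroy)) {}
    ~DestructionNotifier() { if (onDestroy_) onDestroy_(); }
    DestructionNotifier(const DestructionNotifier&) = delete;
    DestructionNotifier& operator=(const DestructionNotifier&) = delete;

private:
    std::function<void()> onDestroy_;
};

// The enclosing class declares no special members: the notification
// happens through its member, so it follows the rule of zero.
class Session {
public:
    explicit Session(std::function<void()> onDestroy)
        : notifier_(std::move(onDestroy)) {}

private:
    DestructionNotifier notifier_;
};
```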

Thanks,

There are many reasons for people to use the rule of five instead of the rule of 0:

  • The good reasons (being a RAII wrapper around some resource, for instance)
  • The bad reasons
    • because we are used to writing those functions without really understanding the consequences,
    • because the class used to manually handle resources in the past, has since been improved, but still contains leftover code,
    • because our code editor has a badly designed wizard that automatically adds those functions when writing a new class.

From my experience, the cases with bad reasons vastly outnumber the cases with good reasons. And we don't know how to differentiate them automatically (we have some ideas for future improvements, but are waiting for user feedback before going forward). Therefore, we decided to still report a violation of the rule of 0 each time the rule of 5 is used. It does not mean that the code is always bad, just that it often is, and as such deserves more attention.

If after reviewing the code, you consider that you have a good reason not to use the rule of 0, you can just mark the violation as “won’t fix” (maybe with a comment).

Hope this clarifies our intent with those rules, as well as a course of action you can take when encountering a violation.

Thanks,

Thanks a lot for all this information. It helped us decide how to deal with this code smell. :ok_hand:

Found this topic after I’ve had a similar issue.

General question:

When a destructor is defined, and all other special member functions are explicitly deleted with = delete, shouldn't this be considered adhering to the rule of five? That is, isn't SonarQube wrong in stating that only the destructor is defined?

In my opinion, the rule of five is about giving explicit thought to the five special member functions once you have defined at least one of them, which is exactly what you are doing when marking them as = delete.

Note that, for example, clang-tidy considers special member functions that are explicitly deleted as defined, rather than not defined.

See also: C++ Core Guidelines

C.21: If you define or =delete any copy, move, or destructor function, define or =delete them all

Hence, =delete is considered ‘defined’.

Hello @Tohnmeister,

Your reasoning is valid. We consider deleted special members as defined. I tried to reproduce what you are describing, but I failed. For example, the C++ analyzer doesn't raise an issue on this:

class A
{
public:
    A(const A&) = delete;
    A& operator=(const A&) = delete;
    ~A()
    {
        f(); // …
    }
};

In principle, this rule should raise an issue when at least one special member function is user-defined and another one is implicitly provided.
Can you share the example where the analyzer is raising the unexpected issue so we can investigate what is happening?

Thanks,

Dear @Abbas,

The following code:

#include <iostream>

class Person
{
public:
    ~Person()
    {
        std::cout << "destructor" << std::endl;
    }

    Person(const Person&) = delete;
    Person& operator=(const Person&) = delete;
    Person(Person&&) = delete;
    Person& operator=(Person&&) = delete;
};

raises

cpp:S4963 Remove this class’ destructor so that the class follows the rule of zero.

Hello @Tohnmeister,

You are mixing two rules: the rule of zero (S4963) and the rule of five (S3624).

They both take =delete and =default into consideration.
It is up to you to decide whether you want to follow the rule of zero in your codebase.
In your case, you are violating the rule of zero, since you are defining the destructor explicitly.
On the other hand, you are following the rule of five, since you are providing all five special members when you explicitly provide one (the destructor in your case).

Let me know if you have further questions,

Dear @Abbas,

Thanks again for your answer.

I'm aware of the difference between the two rules. But the thing is that they are both considered critical rules. Consider the following example:

class FileWriter
{
public:
    explicit FileWriter(FileHandle* handle) :
        mFileHandle(handle)
    {
    }

    ~FileWriter()
    {
        mFileHandle->close();
    }

private:
    FileHandle* mFileHandle;

}; 

For this piece of code the C++ analyzer triggers both S4963 (rule-of-zero should be followed), and S3624 (rule-of-five should be followed when rule-of-zero cannot be followed).

Of course I understand that in this situation we could’ve used a unique_ptr with a custom deleter, but assume that there are less trivial examples in our codebase, where doing so is not possible.

If I explicitly delete the other special functions, like so:

class FileWriter
{
public:
    explicit FileWriter(FileHandle* handle) :
        mFileHandle(handle)
    {
    }

    ~FileWriter()
    {
        mFileHandle->close();
    }

    FileWriter(const FileWriter&) = delete;
    FileWriter& operator=(const FileWriter&) = delete;
    FileWriter(FileWriter&&) = delete;
    FileWriter& operator=(FileWriter&&) = delete;

private:
    FileHandle* mFileHandle;

};

S3624 is satisfied. But S4963 is still triggered.

Hence, I have no way of satisfying both rules in this case. I would expect S4963 not to be triggered if we explicitly adhere to the rule-of-five.

How can we, in the above example, modify the code so that it violates neither S3624 nor S4963?

@Tohnmeister,

It depends on you and the nature of your codebase. If you want to satisfy either the rule of zero or the rule of five, you should enable S3624 and disable S4963.

If you want to systematically follow the rule of zero, then you should enable S4963. It is your decision, since you know best whether the rule of zero suits your codebase.

Most of the time, you should be able to follow the rule of zero by using a RAII object (like unique_ptr) as a field. Even in complex cases, you should be able to move the code that you wrote in the destructor into a private function and call that function, for example, from the unique_ptr deleter.
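Applied to the FileWriter example from this thread, that suggestion could look like the sketch below. `FileHandle` here is a hypothetical stand-in mirroring the earlier posts, and the deleter only closes the handle without deleting it, to keep the original non-owning semantics:

```cpp
#include <memory>

// Hypothetical FileHandle, mirroring the example in this thread.
struct FileHandle {
    bool open = true;
    void close() { open = false; }
};

// Custom deleter: runs the cleanup that used to live in ~FileWriter().
// It closes the handle but deliberately does NOT delete it, because
// FileWriter never owned the handle's memory in the original example.
struct FileHandleCloser {
    void operator()(FileHandle* handle) const {
        if (handle) {
            handle->close();
        }
    }
};

// With the cleanup moved into the deleter, FileWriter declares no
// special member functions at all: it now follows the rule of zero.
class FileWriter {
public:
    explicit FileWriter(FileHandle* handle) : mFileHandle(handle) {}

private:
    std::unique_ptr<FileHandle, FileHandleCloser> mFileHandle;
};
```

As a side effect, FileWriter becomes move-only instead of accidentally copyable, which also removes the risk of two writers closing the same handle.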

Even if you want to follow the rule of zero in your codebase, there may always be exceptions, as pointed out by the rule description:

There are several cases where this rule should not be followed: for instance, if your class is manually handling a resource, logging when being constructed/copied, maintaining some kind of counter, or having non-transient data that should not be copied (like capacity for std::vector)… In that case, it should still follow the rule of 5 (S3624). And you should consider whether you can isolate this specific behavior in a base class or a dedicated data member, which would allow you to still follow the rule of 0.

In those rare cases, you should just close the issue by “Resolve as won't fix”.

Thanks,


@Abbas, thanks again for your detailed explanation.