How to fix a 'The "Rule-of-Zero" should be followed' issue?


I have a code smell identified by SonarCFamily that I don’t know how to deal with.
Initially, I have the following code:

class A {
	A() {}
	~A() { /* Important things to do (no resource management) */ }
};

One bug (S3624) and one code smell (S4963) are then identified. I understand I have a choice to make for performance reasons (according to About sonar cloud C++11 constructor rule Is it BUG?).
So, I tried to solve them with the following code:

class A {
	A() {}
	~A() { /* Important things to do (no resource management) */ }

	A(const A&) = delete;
	A& operator=(const A&) = delete;
	A(A&&) = delete;
	A& operator=(A&&) = delete;
};

Despite this modification, I still have one code smell (S4963).

Is there a way or a general strategy to solve this kind of code smell? Or is it just a warning to draw my attention to a performance issue that I need to think about and accept?

Developer Edition Version 8.0 (build 29455)
Code Analyzer for C, C++, Objective-C : SonarCFamily 6.7.0 (build 15300)

Hello @Ferris,

I believe S3624 is a good rule to always follow. I don’t believe that is the case for S4963.

S4963 isn’t always about performance. It can be about simplicity and maintainability: always using RAII objects as class fields (like unique_ptr) instead of defining special members. Not all types of classes should/can follow it; there are exceptions. This rule helps you find violations and make sure that they are all valid exceptions. In short, this rule should be enabled in codebases where the default is to always follow the Rule-of-Zero.
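For illustration, here is a minimal sketch (the names are my own, not from the original question) of what following the Rule-of-Zero with an RAII field looks like: the buffer is held in a `std::unique_ptr`, so the class declares no special member functions at all.

```cpp
#include <cassert>
#include <cstddef>
#include <memory>

// Hypothetical example: `Widget` owns a buffer. With a raw pointer it would
// need a destructor (and then all five special members, per S3624). Holding
// the resource in a std::unique_ptr instead lets the class follow the
// Rule-of-Zero: no special members are declared at all.
class Widget {
public:
    explicit Widget(std::size_t n)
        : buffer_(std::make_unique<int[]>(n)), size_(n) {}

    std::size_t size() const { return size_; }

private:
    std::unique_ptr<int[]> buffer_;  // RAII member releases memory automatically
    std::size_t size_;
};
```

As a side effect, `Widget` is movable but not copyable, exactly as its `unique_ptr` member dictates, without any hand-written special member functions.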

We have already added this exception section to the rule description that should show up in the next release:


There are several cases where this rule should not be followed: for instance, if your class manually handles a resource, logs when being constructed/copied, maintains some kind of counter, or has non-transient data that should not be copied (like capacity for std::vector)… In that case, it should still follow the Rule-of-Five (S3624). And you should consider whether you can isolate this specific behavior in a base class or a dedicated data member, which would allow you to still follow the Rule-of-Zero.
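A hedged sketch of the "isolate it in a base class" idea mentioned above, with assumed names (`ConstructionLogger`, `Session`): the base class owns the logging behavior and follows the Rule-of-Five itself, so the enclosing class declares no special members at all.

```cpp
#include <iostream>

// Assumed base class: it concentrates the "log on construction/destruction"
// behavior. Because its destructor is user-defined, it follows the
// Rule-of-Five by explicitly defaulting the remaining special members.
struct ConstructionLogger {
    ConstructionLogger()  { std::clog << "constructed\n"; }
    ~ConstructionLogger() { std::clog << "destroyed\n"; }
    ConstructionLogger(const ConstructionLogger&) = default;
    ConstructionLogger& operator=(const ConstructionLogger&) = default;
    ConstructionLogger(ConstructionLogger&&) = default;
    ConstructionLogger& operator=(ConstructionLogger&&) = default;
};

// The actual class stays Rule-of-Zero: no special members declared.
class Session : private ConstructionLogger {
public:
    int id() const { return id_; }
    void set_id(int id) { id_ = id; }

private:
    int id_ = 0;
};
```

With this split, S4963 has nothing to report on `Session`, and the exceptional behavior is reviewed in exactly one place.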


In addition to the answer from @Abbas_Sabra, I’d like to know, out of curiosity, the nature of these important things you need to do. Because apart from logging and resource management, I can’t think of a really good thing to do in a destructor, so I’d like to understand your case better.

Thank you!

Thanks a lot for these clarifications.

In my case, to keep it simple, I have an object that notifies another object that it is being destroyed. And indeed, the other object manages resources and provides logs. So it can be seen as an implicit resource management and logging system. But I was wondering what happens in general cases, including the examples given as exceptions.
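For what it’s worth, a speculative sketch of how such notify-on-destruction behavior could be isolated in a dedicated member, so that the enclosing class still follows the Rule-of-Zero (all names, `DestructionNotifier` included, are hypothetical, not from the actual codebase):

```cpp
#include <cassert>
#include <functional>
#include <utility>

// Hypothetical RAII member that fires a callback when destroyed. It follows
// the Rule-of-Five itself: moving transfers the notification duty, copying
// is disabled because "destroyed" should only be reported once.
class DestructionNotifier {
public:
    explicit DestructionNotifier(std::function<void()> on_destroy)
        : on_destroy_(std::move(on_destroy)) {}
    ~DestructionNotifier() { if (on_destroy_) on_destroy_(); }

    DestructionNotifier(const DestructionNotifier&) = delete;
    DestructionNotifier& operator=(const DestructionNotifier&) = delete;
    DestructionNotifier(DestructionNotifier&& other) noexcept
        : on_destroy_(std::move(other.on_destroy_)) {
        other.on_destroy_ = nullptr;  // the moved-from object stays silent
    }
    DestructionNotifier& operator=(DestructionNotifier&&) = delete;

private:
    std::function<void()> on_destroy_;
};

// The notifying class itself is now Rule-of-Zero: no special members.
class A {
public:
    explicit A(std::function<void()> on_destroy)
        : notifier_(std::move(on_destroy)) {}

private:
    DestructionNotifier notifier_;
};
```

The manual special members are confined to one small, reusable wrapper, which is exactly the kind of design change S4963 is pushing toward.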

Actually, I was thinking that a class should be defined with either the Rule-of-Zero or the Rule-of-Five. As soon as that is not the case, I understand why the two issues are raised (S3624 & S4963).
In my case, I chose the Rule-of-Five by explicitly deleting the copy and move operations, which is not the default behavior. So I don’t understand why the Rule-of-Zero is still proposed.


That is what S3624 is doing on its own:
When the "Rule-of-Zero" is not applicable, the "Rule-of-Five" should be followed

S4963 is more like “Always follow the Rule-of-Zero”: if you don’t want to actively push to always use the Rule-of-Zero in your codebase (even when the Rule-of-Five is followed), S4963 should not be enabled. The value of S4963 is to push toward design changes that avoid any sort of manual resource management like the one you mentioned:

an object that notifies another object that it is being destroyed. And indeed, the other object manages resources and provides logs


There are many reasons for people to use the rule of five instead of the rule of 0:

  • The good reasons (being a RAII wrapper around some resource, for instance)
  • The bad reasons
    • because we are used to writing those functions without really understanding the consequences,
    • because the class used to manually handle resources in the past, has been improved since, but still contains leftover code,
    • because our code editor has a badly designed wizard that automatically adds those functions when writing a new class.

From my experience, the cases with bad reasons vastly outnumber the cases with good reasons. And we don’t know how to differentiate them automatically (we have some ideas for future improvements, but we are waiting for user feedback before going forward). Therefore, we decided to still report a violation of the Rule-of-Zero each time the Rule-of-Five is used. It does not mean that the code is always bad. Just that it often is, and as such deserves more attention.

If after reviewing the code, you consider that you have a good reason not to use the rule of 0, you can just mark the violation as “won’t fix” (maybe with a comment).

Hope this clarifies our intent with those rules, as well as a course of action you can take when encountering a violation.


Thanks a lot for all this information. It helped us decide how to deal with this code smell. :ok_hand:

Found this topic after I’ve had a similar issue.

General question:

When a destructor is defined, and all other special member functions are explicitly deleted with = delete, shouldn’t this be considered adhering to the rule-of-five? I.e., isn’t SonarQube wrong in stating that only the destructor is defined?

IMO, the rule-of-five is about giving explicit thought to the five special member functions when you have defined at least one of them, which is exactly what you’re doing when marking them as = delete.

Note that, for example, clang-tidy considers special member functions that are explicitly deleted as defined, rather than not defined.

See also:

C.21: If you define or =delete any copy, move, or destructor function, define or =delete them all

Hence, =delete is considered ‘defined’.
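To illustrate the point (a minimal sketch, the class name is mine): the class below only gives a body to its destructor, yet it satisfies C.21 because every other special member function is explicitly deleted, i.e. user-declared rather than implicitly provided.

```cpp
#include <cassert>
#include <type_traits>

// `Pinned` (hypothetical name) follows C.21 / the rule-of-five even though
// only the destructor has a body: deleting the copy/move operations still
// counts as declaring ("defining") them.
class Pinned {
public:
    Pinned() = default;
    ~Pinned() { /* important cleanup */ }

    Pinned(const Pinned&) = delete;
    Pinned& operator=(const Pinned&) = delete;
    Pinned(Pinned&&) = delete;
    Pinned& operator=(Pinned&&) = delete;
};

// The deleted members are user-declared, so nothing is implicitly provided:
static_assert(!std::is_copy_constructible<Pinned>::value, "copy is deleted");
static_assert(!std::is_move_constructible<Pinned>::value, "move is deleted");
```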

Hello @Tohnmeister,

Your reasoning is valid. We consider deleted special member functions as defined. I tried to reproduce what you are describing, but I failed. For example, the C++ analyzer doesn’t raise an issue on this:

class A {
	A(const A&) = delete;
	A& operator=(const A&) = delete;
	void f(); // …
};

In principle, this rule should raise an issue when at least one special member function is user-defined and another one is implicitly provided/defined.
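For reference, a minimal sketch (the class name is an assumption) of the situation the rule targets as just described: a user-defined destructor while the copy operations remain implicitly provided.

```cpp
#include <type_traits>

// `B` (hypothetical name): the destructor is user-defined, but the copy
// constructor and copy assignment operator are still implicitly provided by
// the compiler (the user-declared destructor only suppresses the implicit
// move operations). This mismatch is what the rule is meant to report.
class B {
public:
    ~B() { /* some cleanup */ }
};

static_assert(std::is_copy_constructible<B>::value,
              "copy is implicitly provided despite the user-defined destructor");
```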
Can you share the example where the analyzer is raising the unexpected issue so we can investigate what is happening?