Cognitive complexity calculation for a file/project


How do you calculate the cognitive complexity for a file/project?

We know how this is calculated for a function/method. But we can see a huge number for this metric when looking at a single file or a whole project. What does this mean?

E.g. here you see the number for a file:

But there is no function in this file where the complexity is 63.
Do you add all the complexity for all included functions in this file?
If yes, does it make sense?

SQ Data Center Edition
Version 9.9.3


File-level complexity is the sum of complexity for all the methods/functions in the file, plus complexity for any “loose” code in the file.
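To make that concrete, here is a hedged sketch (not Sonar's implementation, just the Cognitive Complexity increments from the spec annotated by hand): a file with two functions, where the file-level measure would be the sum of the per-function values, 2 + 3 = 5.

```cpp
// Cognitive Complexity: 2
int clamp(int v) {
    if (v > 100) {        // +1: `if`
        return 100;
    } else if (v < 0) {   // +1: `else if`
        return 0;
    }
    return v;
}

// Cognitive Complexity: 3
int sum_positive(const int* xs, int n) {
    int total = 0;
    for (int i = 0; i < n; ++i) {  // +1: loop
        if (xs[i] > 0) {           // +2: `if` (+1) plus nesting penalty (+1)
            total += xs[i];
        }
    }
    return total;
}
```

Note the nesting penalty: the inner `if` costs more than the top-level one, which is exactly why the same total complexity spread over small flat functions is cheaper than one deeply nested function.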


Hi Ann,

Please, if you can, tell me more about the calculation of cognitive complexity, I really don’t understand this:
I have a project on SQ. If I go to the Measures tab, I can see the cognitive complexity of a specific file (which, as you said, is the sum of the complexity of all the methods/functions in that file). So I was expecting to get the same number by going to the Issues tab, searching for that exact same file, and selecting all the rules specific to cognitive complexity in the rule filter. But I do not get the same value; in my case I don’t get anything at all, even though in Measures this file has a cognitive complexity of 4.
So, I have these questions:
1) The number that I can see on the Measures tab, is it derived from the unresolved issues related to cognitive complexity in that file?
2) How can I find out all the rules related to cognitive complexity? Is it enough to go to ‘Rules’, select Code Smell, and type “cognitive complexity” in the search bar? And also, how can I get the rules that are for the “loose code”?

Thank you! :slightly_smiling_face:


The rule is “Cognitive Complexity of functions should not be too high”. That “too high” means that this is a threshold rule. We’ve provided a default value (25 for C++), but you can adjust the threshold to whatever “too high” means for you.

That means that an issue will only be raised when the Cognitive Complexity of a function is higher than the threshold in the rule.
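A toy model of that behavior (a hedged sketch, not SonarQube’s actual code): the measure is always computed and recorded, but an issue is only created when it exceeds the rule’s threshold.

```cpp
// Issue raised only when the function's measured complexity
// exceeds the rule's threshold; the measure itself always exists.
bool raisesIssue(int cognitiveComplexity, int threshold) {
    return cognitiveComplexity > threshold;
}
```

So a file whose functions each measure well under the default C++ threshold of 25 will show a non-zero complexity measure but zero cognitive-complexity issues.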

If you really want to see the complexity of every function, then set the threshold to 0 and re-analyze.

Perfect! Since I’m guessing there are no Cognitive Complexity issues in the file at all, that means that the complexity in the file is broken up into manageable chunks/functions, rather than all being lumped together.

Yes! That’s exactly where that 63 comes from.

For more on how Cognitive Complexity is measured, here’s the original paper. (Don’t worry; it’s not long.)