Cognitive Complexity in Go

Hello,

I wanted to elaborate a bit on that topic. Error checking in Go is a problem for us with Sonar, as it raises the complexity score very quickly even though most of those statements simply stop the function right there with a return and add little real complexity. Most of them follow this pattern:

func ValidateUUID(s string) error {
	err := uuid.Validate(s)
	if err != nil {
		return err
	}

	return nil
}

This is such a common Go idiom that IDEs like JetBrains GoLand fold these blocks by default.

If many statements involve error checking, the SonarLint complexity score rises very fast even if the function is completely flat, with no other form of branching. And it compounds quickly as soon as error checking is nested inside other branches such as loops.
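
To make the flat case concrete, here is a minimal sketch (save is a placeholder helper I am inventing for illustration) where four sequential checks alone already score 4:

func StoreUUID(s string) error {
	err := uuid.Validate(s)
	if err != nil { // +1
		return err
	}

	u, err := uuid.Parse(s)
	if err != nil { // +1
		return err
	}

	b, err := u.MarshalBinary()
	if err != nil { // +1
		return err
	}

	err = save(b) // hypothetical persistence helper
	if err != nil { // +1
		return err
	}

	return nil
}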

Ironically, Sonar can then lead to an increase in complexity, as the typical response will be either to raise the complexity threshold for the whole project to scores like 25 or more (thus also raising the ceiling for functions whose complexity comes from something other than error checking), or to multiply functions with no other goal than to please the linter (thus raising complexity through call depth and context switching).

To illustrate a bit further, even if the example is a bit artificial, this function already scores 18 in complexity, which raises an issue with the default threshold of 15:

func ValidateUUID(sl []string) error {
	if len(sl) > 0 {
		for _, s := range sl {
			if s != "" {
				err := uuid.Validate(s)
				if err != nil {
					return err
				}

				_, err = uuid.Parse(s)
				if err != nil {
					return err
				}

				_, err = uuid.NewDCEGroup()
				if err != nil {
					return err
				}
			}
		}
	}

	return nil
}

Just ignoring the err != nil pattern altogether is maybe a tad too extreme, but what about ignoring only the branches that contain nothing but a return? Or giving those a lower score than 1, like 0.2? Or adding a flat score for all such patterns in a whole function or full-fledged branch?
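
To put numbers on the 0.2 idea (under one possible reading, where such a branch contributes a flat 0.2 regardless of nesting): the three nested err checks in the example above would count 3 × 0.2 = 0.6 instead of 12, bringing the function from 18 down to 6.6, while every other construct would keep its current score.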

In case you’re unfamiliar with Go, I’m not sure a straight comparison with the Java try/catch mechanism really works. It can if you do something other than returning, but returning an error in Go is an explicit way of reproducing the built-in Java/C#/Python/etc. exception behaviour, which cancels the function and propagates the error up the call stack.
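
A sketch of the distinction, with fetch and cached as placeholders I am inventing for illustration:

func load(url string) ([]byte, error) {
	// Propagation: the Go analogue of letting an exception bubble up.
	data, err := fetch(url)
	if err != nil {
		return nil, err // cancels the function; the caller deals with it
	}
	return data, nil
}

func loadWithFallback(url string) []byte {
	// Actual handling: the analogue of a catch block that recovers.
	data, err := fetch(url)
	if err != nil {
		log.Printf("fetch failed, falling back to cache: %v", err)
		return cached()
	}
	return data
}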

Hi,

Welcome to the community!

I’ve moved your post to a new topic because it seemed a bit tangential to the post you replied to. Note that the user you replied to had resurrected a year-old thread to ask their question - something the FAQ asks people not to do. :slight_smile:

Regarding your post, I’m a little at a loss. You argue first that the common Go error-checking idiom causes Cognitive Complexity scores to “rise very fast”, and then seem to argue that raising the rule threshold for Go to compensate is a bad thing.

In fact, when we were initially implementing the rule across languages, we started with a threshold of 10, and then looked at the impact across multiple projects. Based on a “gut feel” measurement, it was tuned up to 15 for most languages, but gut feel took it up to 25 for C and C++. It may be that Go (or perhaps just your projects?) needs a higher threshold.

The fact is, any threshold in a rule is going to be arbitrary. It’s just that some are more arbitrary than others.

But let me turn to your assertion that measuring Cognitive Complexity can itself lead to an increase in complexity.

I’d like to address that with your example, which can easily be rearranged a bit to drop its Cognitive Complexity significantly. First, let’s look at where the complexity comes from:

func ValidateUUID(sl []string) error {
	if len(sl) > 0 {  // +1 
		for _, s := range sl {  // +2 - incl 1 for nesting
			if s != "" {  // +3 - incl 2 for nesting
				err := uuid.Validate(s)
				if err != nil {  // +4 - incl 3 for nesting
					return err
				}

				_, err = uuid.Parse(s)
				if err != nil {  // +4 - incl 3 for nesting
					return err
				}

				_, err = uuid.NewDCEGroup()
				if err != nil {  // +4 - incl 3 for nesting
					return err
				}
			}
		}
	}

	return nil
}

As we see, the vast majority of the count here comes from nesting. So dropping even a single level of nesting brings this handily into compliance, not incidentally making the whole thing a bit easier to read:

func ValidateUUID(sl []string) error {
	if len(sl) == 0 {  // +1 
		return nil
	}

	for _, s := range sl {  // +1
		if s != "" {  // +2 - incl 1 for nesting
			err := uuid.Validate(s)
			if err != nil {  // +3 - incl 2 for nesting
				return err
			}

			_, err = uuid.Parse(s)
			if err != nil {  // +3 - incl 2 for nesting
				return err
			}

			_, err = uuid.NewDCEGroup()
			if err != nil {  // +3 - incl 2 for nesting
				return err
			}
		}
	}
	return nil
}

With a single early return, we’ve dropped Cognitive Complexity from 18 to 13, without creating a single extra function.

Of course, we can go even further:

func ValidateUUID(sl []string) error {
	if len(sl) == 0 {  // +1 
		return nil
	}

	for _, s := range sl {  // +1
		if s == "" {  // +2 - incl 1 for nesting
			continue
		}

		err := uuid.Validate(s)
		if err != nil {  // +2 - incl 1 for nesting
			return err
		}

		_, err = uuid.Parse(s)
		if err != nil {  // +2 - incl 1 for nesting
			return err
		}

		_, err = uuid.NewDCEGroup()
		if err != nil {  // +2 - incl 1 for nesting
			return err
		}
	}
	return nil
}

And now we’re down to 10 - nearly half the original 18 - with no additional functions and, IMO, a much more readable function.

The point of Cognitive Complexity is not to make you jump through hoops or proliferate extra functions throughout your code (although there are times when a helper function is the best answer) but to incentivize, inspire, and guide you to write more readable code.
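
For completeness, here’s what that helper route could look like here (validateOne is just a name I’m making up). The len(sl) == 0 guard can go entirely, since ranging over an empty slice is already a no-op, and each function lands comfortably under the threshold:

func ValidateUUID(sl []string) error {
	for _, s := range sl { // +1
		if s == "" { // +2 - incl 1 for nesting
			continue
		}

		if err := validateOne(s); err != nil { // +2 - incl 1 for nesting
			return err
		}
	}
	return nil
}

func validateOne(s string) error {
	if err := uuid.Validate(s); err != nil { // +1
		return err
	}

	if _, err := uuid.Parse(s); err != nil { // +1
		return err
	}

	if _, err := uuid.NewDCEGroup(); err != nil { // +1
		return err
	}
	return nil
}

That’s 5 for the outer function and 3 for the helper.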

 
Ann


Hello Ann and thanks for your answer.

I agree that doing away with complexity is always the priority when possible, but it is not always possible. This quick dummy code sample was mostly there to illustrate the compounding effect of nested error checking, not to pretend to be an exhaustive example where no other option was available.

I also agree that a reasonable complexity threshold will change depending on the language at hand. Nevertheless, raising this threshold for the whole project lacks granularity, as functions are not equal with regard to error checking: some will have a lot, others will have none, and for the latter the ceiling on actual complexity becomes much higher.

Hi,

I have a function which inserts a record and some child records into a database using the Go standard library, and this function has a high score because of the number of tiny error-checking if blocks, some of which are inside a loop.

Given that the language encourages developers to handle errors where they occur, which results in these extra if blocks, I wonder if there is a case for either increasing the threshold or, as suggested above, scoring if blocks that match the if err != nil { pattern lower.
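
For illustration, here’s a stripped-down sketch of the kind of function I mean, using database/sql (the schema and names are made up):

type Item struct{ SKU string }

type Order struct {
	Ref   string
	Items []Item
}

func insertOrder(db *sql.DB, o Order) error {
	tx, err := db.Begin()
	if err != nil { // +1
		return err
	}
	defer tx.Rollback()

	res, err := tx.Exec(`INSERT INTO orders (ref) VALUES (?)`, o.Ref)
	if err != nil { // +1
		return err
	}

	id, err := res.LastInsertId()
	if err != nil { // +1
		return err
	}

	for _, it := range o.Items { // +1
		_, err := tx.Exec(`INSERT INTO order_items (order_id, sku) VALUES (?, ?)`, id, it.SKU)
		if err != nil { // +2 - incl 1 for nesting
			return err
		}
	}

	return tx.Commit()
}

Even this toy version scores 6; the real function, with more child tables and checks, goes well past the threshold despite reading as a straight sequence of inserts.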

How do we request a review of the Go threshold?

Thanks
Dan

Hi Dan,

Welcome to the community!

Are you asking for the default threshold to be raised? Because you know you can edit what’s applied to your project (assuming you have Quality Profile edit rights), right?

 
Ann

Hi,

Yes, I was asking about raising the default. The more I think about it, the more I like the idea of excluding these particular if statements, or scoring them lower, rather than just increasing the threshold, as that targets the specific issue. I’ve seen you have some language-specific compensations already, and I wonder if this would qualify as one of those.

Dan

Thanks for bringing up this error-handling pattern in Go.

Both suggestions - increasing the default threshold and scoring the pattern lower - are being considered as improvements to the rule. All the details are available at [SONARSLANG-643] - Jira.

Cheers,
Angelo