Git fetch-depth implications

Hi there,

We read that it’s recommended to disable shallow clones completely by using a value of 0; however, this can get pretty slow for large repositories with many commits. For new code detection alone, a fetch-depth of 2 should be sufficient.
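
For context, a CI `fetch-depth` setting generally maps to git’s `--depth` option (that is how e.g. `actions/checkout` behaves, with 0 meaning full history). A quick throwaway-repo sketch of the difference (all paths here are illustrative):

```shell
# Throwaway demo repository; all paths are illustrative.
tmp=$(mktemp -d)
git init -q "$tmp/origin"
git -C "$tmp/origin" config user.email ci@example.com
git -C "$tmp/origin" config user.name CI
for i in 1 2 3; do
  echo "$i" > "$tmp/origin/file.txt"
  git -C "$tmp/origin" add file.txt
  git -C "$tmp/origin" commit -qm "commit $i"
done

# fetch-depth: 0 behaves like a full clone: all three commits are present.
git clone -q "$tmp/origin" "$tmp/full"
git -C "$tmp/full" rev-list --count HEAD      # prints 3

# fetch-depth: 2 behaves like a shallow clone truncated to two commits.
git clone -q --depth 2 "file://$tmp/origin" "$tmp/shallow"
git -C "$tmp/shallow" rev-list --count HEAD   # prints 2
```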

Could you please tell me what the implications are / which SonarCloud features will stop working when changing the fetch-depth from 0 to 2?


Hi @flobernd,

That’s a very good question.
We are well aware that cloning big repositories can take a long time, and we give that configuration advice because we believe it is very important for the value you’ll get out of SonarCloud.

SonarCloud uses the full Git history for several features.
We can mention:

  • new code detection: on pull requests, we don’t consider just the last commit but all the commits that are not on the target branch. For that, we need a history long enough to find the common ancestor commit. On long-lived branches, the new code period can be configured in different ways, but we always need a longer history
  • the blame information, displayed to the left of the code on the Code tab, and the automatic issue assignment that relies on it
  • issue backdating: this feature keeps an issue from being flagged as new when the code is old and we have merely added a new rule to our analyzers. If the code has been there since 2012, we use 2012 as the issue date, so the quality gate’s new code conditions aren’t broken by something that has existed for a long time
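
The common-ancestor point in the first bullet is easy to reproduce with plain git. A throwaway sketch (paths and branch names are illustrative; `git init -b` needs git 2.28+):

```shell
# Throwaway repos; paths and branch names are illustrative.
tmp=$(mktemp -d)
o="$tmp/origin"
git init -q -b main "$o"
git -C "$o" config user.email ci@example.com
git -C "$o" config user.name CI
echo base > "$o/f" && git -C "$o" add f && git -C "$o" commit -qm base
git -C "$o" checkout -qb feature
echo feature >> "$o/f" && git -C "$o" commit -aqm feature
git -C "$o" checkout -q main
echo main > "$o/g" && git -C "$o" add g && git -C "$o" commit -qm main

# A depth-1 checkout of the PR branch, as fetch-depth: 1 would produce.
git clone -q --depth 1 -b feature "file://$o" "$tmp/ci"
git -C "$tmp/ci" fetch -q --depth=1 origin main

# The common ancestor ("base") was never fetched, so git cannot find it:
git -C "$tmp/ci" merge-base FETCH_HEAD HEAD || echo "no common ancestor"

# Deepening both branches' histories restores it:
git -C "$tmp/ci" fetch -q --deepen=10 origin main feature
git -C "$tmp/ci" merge-base FETCH_HEAD HEAD   # prints the "base" commit
```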

In a nutshell, reducing the clone or fetch depth can cause issues to appear as new when they should not, which impacts the quality gate and creates noise.
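
Note also that if a pipeline has already checked out shallowly, the history can be completed in place rather than re-cloning. A sketch with plain git (throwaway repo, paths illustrative):

```shell
# Throwaway demo repo; paths are illustrative.
tmp=$(mktemp -d)
git init -q "$tmp/origin"
git -C "$tmp/origin" config user.email ci@example.com
git -C "$tmp/origin" config user.name CI
for i in 1 2; do
  echo "$i" >> "$tmp/origin/f"
  git -C "$tmp/origin" add f
  git -C "$tmp/origin" commit -qm "c$i"
done

# A shallow checkout, as produced by fetch-depth: 1.
git clone -q --depth 1 "file://$tmp/origin" "$tmp/clone"
git -C "$tmp/clone" rev-parse --is-shallow-repository   # prints "true"

# Complete the history in place instead of re-cloning.
git -C "$tmp/clone" fetch -q --unshallow
git -C "$tmp/clone" rev-parse --is-shallow-repository   # prints "false"
```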

Does that clarify things?