We’re running SonarQube scans as part of our CI pipeline and we want the entire codebase to be treated as “new code” — essentially, we want all files and lines to appear under the “new code” scope in the dashboard and trigger quality gate conditions.
We’ve tried the following:
Set sonar.projectVersion to a new value (2.0) during each run.
Changed the “New Code” definition to “Previous Version” in the UI.
Also tried the “Reference branch” option, pointing it to a dummy branch (baseline) that contains just the initial commit (see the sketch after this list).
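For reference, a minimal sketch of the scan step behind these attempts — the project key, host URL, and token are placeholders, and passing sonar.newCode.referenceBranch from the scanner side assumes an edition/version that supports branch analysis (it simply mirrors the “Reference branch” setting mentioned above):

# Sketch of the CI scan step (placeholder project key, host URL, and token).
# sonar.projectVersion is bumped each run; sonar.newCode.referenceBranch points the
# "Reference branch" new code definition at the dummy baseline branch.
sonar-scanner \
  -Dsonar.projectKey=my-microservice \
  -Dsonar.host.url=https://sonarqube.example.com \
  -Dsonar.token="$SONAR_TOKEN" \
  -Dsonar.projectVersion=2.0 \
  -Dsonar.newCode.referenceBranch=baseline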
Despite this, not all files show up as new — and quality gate conditions on “new code” (such as the test coverage and reliability rating conditions) aren’t always triggered.
Is there a reliable way to force SonarQube to analyze the full project under the “new code” lens, regardless of history?
Our setup:
SonarQube version: SonarQube Server 2025.2
The project is a newly onboarded microservice, so we wanted the SonarQube quality gate to apply only to new code from the start - not retroactively to legacy code or shared libraries. The goal is to ensure greenfield code meets our quality standards as it evolves, without being blocked by unrelated past code.
We expected that setting sonar.projectVersion, tweaking the “new code” definition, and using a dummy baseline branch would cause all files to be treated as “new code”, but not all files are being picked up as expected.
We’d appreciate any recommended config or workaround that forces SonarQube to treat everything in the repo as new code, ideally without relying on a past baseline.
I’m still confused. You want SonarQube’s quality gate to start enforcing rules only on new code going forward, while ignoring issues in all existing code (including legacy and shared libraries), so your team is not blocked by pre-existing problems.
However, forcing the entire codebase to count as new code would make all existing/legacy files subject to the quality gate immediately, which seems contrary to the “don’t be blocked by unrelated past code” objective you described.