Default Branch Has Not Been Analyzed Yet

* CI system used: Azure DevOps
* Languages of the repository: Node.js, JavaScript
* Long-lived branches pattern: (master|feature|develop).*

We have a SonarCloud scan issue at the federal agency I work for. We use Azure DevOps, and a simple pull request into our main branch (develop) only shows analysis for the short-lived branch (feature), not for develop. What does it take to get this product to analyze the main branch? Here is my entire build pipeline in azure-pipelines.yml. The pull request source branch (feature/pfa-sonarcloud) is analyzed, but develop (the long-lived branch) is never analyzed. New Code is set to 30 days.

variables:
  # Agent VM image name
  vmImageName: 'ubuntu-18.04'
  npm_config_cache: $(Pipeline.Workspace)/.npm

trigger:
  batch: true
  branches:
    include:
      - master
      - develop
      - releases/*
      - preview-space

jobs:
  - job: build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
      - template: templates/npm-steps.yml
      - task: Npm@1
        displayName: Prune
        inputs:
          command: custom
          customCommand: run prune
      - task: SonarCloudPrepare@1
        displayName: SonarCloud Prepare
        inputs:
          SonarCloud: 'macro-sonar-connection'
          organization: 'macro-pfa-org-key'
          scannerMode: 'CLI'
          configMode: 'manual'
          cliProjectKey: 'macro-pfa_pfa'
          cliProjectName: 'pfa'
      - task: Npm@1
        displayName: Create Cloudfoundry Build
        inputs:
          command: custom
          customCommand: run cf:build
      - task: SonarCloudAnalyze@1
        displayName: 'SonarCloud Run Code Analysis'
      - task: SonarCloudPublish@1
        displayName: 'SonarCloud Publish Quality Gate Result'
      - task: ArchiveFiles@2
        inputs:
          rootFolderOrFile: ./dist/
          archiveFile: '$(Build.ArtifactStagingDirectory)/packages.zip'
      - publish: $(Build.ArtifactStagingDirectory)/packages.zip
        artifact: drop

  - job: unit_tests
    displayName: Unit Tests
    pool:
      vmImage: $(vmImageName)
    steps:
      - template: templates/npm-steps.yml
      # - task: Npm@1
      #   condition: or(
      #       eq(variables['Build.SourceBranch'], 'refs/heads/develop'),
      #       startsWith(variables['Build.SourceBranch'], 'refs/heads/releases')
      #     )
      #   displayName: Prepare for Audit
      #   inputs:
      #     command: custom
      #     customCommand: run audit:prep
      # - task: Npm@1
      #   displayName: Run Audit
      #   condition: or(
      #       eq(variables['Build.SourceBranch'], 'refs/heads/develop'),
      #       startsWith(variables['Build.SourceBranch'], 'refs/heads/releases')
      #     )
      #   inputs:
      #     command: custom
      #     customCommand: run audit
      - task: Npm@1
        displayName: Linting
        inputs:
          command: custom
          customCommand: run lint
      - task: Npm@1
        displayName: Run Unit Tests
        condition: succeededOrFailed()
        inputs:
          command: custom
          customCommand: run test
      - task: Npm@1
        displayName: Run Unit Tests With Coverage
        condition: succeededOrFailed()
        inputs:
          command: custom
          customCommand: run test:cov
      - template: templates/publish-test-report.yml
        parameters:
          package: bank-api
      - template: templates/publish-test-report.yml
        parameters:
          package: datamanager
      - template: templates/publish-test-report.yml
        parameters:
          package: bankfind-ui
      - template: templates/publish-test-report.yml
        parameters:
          package: common-modules

  - job: e2e_tests
    displayName: E2E Tests
    pool:
      vmImage: $(vmImageName)
    steps:
      - template: templates/npm-steps.yml
      - bash: |
          sudo sysctl -w vm.max_map_count=262144
          echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf && sudo sysctl -p
        displayName: Increase Max Virtual Memory
      - task: Npm@1
        displayName: Start Docker
        inputs:
          command: custom
          customCommand: run docker-compose:up
      - template: templates/datamanager-e2e.yml
      - task: Npm@1
        displayName: Run Bank API E2E Tests
        condition: succeededOrFailed()
        inputs:
          command: custom
          customCommand: run test:ci:e2e
          workingDir: ./packages/bank-api
      - task: PublishTestResults@2
        displayName: Bank API - Publish E2E Test Results
        condition: succeededOrFailed()
        inputs:
          testResultsFiles: 'packages/bank-api/test-report-e2e.xml'
          testRunTitle: 'E2E Tests - Bank API'
      - task: Npm@1
        displayName: Run BankFind UI E2E Tests
        condition: succeededOrFailed()
        inputs:
          command: custom
          customCommand: run test:e2e
          workingDir: ./packages/bankfind-ui
      - task: PublishCucumberReport@1
        displayName: 'Publish CucumberReport'
        inputs:
          jsonDir: ./packages/bankfind-ui/features/
          outputPath: ./packages/bankfind-ui/features/
          theme: 'bootstrap'
          reportSuiteAsScenarios: true
          name: 'Regression Tests'
          title: BankFind UI - E2E
  - deployment: deploy
    displayName: Deploy
    dependsOn:
      - build
      - unit_tests
      - e2e_tests
    pool:
      vmImage: $(vmImageName)
    environment: 'fdic-nongfe-dev'
    condition: |
      and(
        succeeded(),
        or(
          eq(variables['Build.SourceBranch'], 'refs/heads/develop'),
          startsWith(variables['Build.SourceBranch'], 'refs/heads/releases'),
          eq(variables['Build.SourceBranch'], 'refs/heads/preview-space')
        )
      )
    strategy:
      runOnce:
        deploy:
          steps:
            - download: current
              artifact: drop
            - task: ExtractFiles@1
              inputs:
                archiveFilePatterns: '$(Pipeline.Workspace)/drop/packages.zip'
                destinationFolder: drop
                cleanDestinationFolder: true
            - bash: curl -L "https://packages.cloudfoundry.org/stable?release=linux64-binary&source=github" | tar -zx
              displayName: Install CF CLI
            - bash: ./cf api api.fr.cloud.gov
              displayName: Set CF API URL
            - bash: ./cf auth
              displayName: Authenticate CF CLI
              env:
                CF_PASSWORD: $(CF_PASSWORD)
                CF_USERNAME: $(CF_USERNAME)
            - bash: ./cf t -o fdic-nongfe -s pfa-dev
              displayName: CF Target Non-GFE
            - bash: ./cf push -f $(Build.SourcesDirectory)/drop/dist/manifest.yml --vars-file $(Build.SourcesDirectory)/drop/dist/environments/vars/nongfe.yml
              displayName: CF Push
            # - bash: ./cf terminate-task pfadatamanagerapi $(./cf tasks pfadatamanagerapi | grep -m 1 'work-manager' | cut -d' ' -f1)
            #   # gets the ID number of the most recent "work-manager" task
            #   displayName: Cancel any previous work-manager task
            #   continueOnError: true
            # - bash: ./cf run-task pfadatamanagerapi "npm run task:dist:work-manager" -m 2G --name work-manager
            #   displayName: Run all indexing tasks

Hey there.

When the pipeline runs on the develop branch, is the SonarCloudAnalyze task run? If so, where do the pipeline logs point you when the analysis finishes?

For example:

INFO: Analysis report generated in 2387ms, dir size=196 KB
INFO: Analysis report compressed in 27ms, zip size=40 KB
INFO: Analysis report uploaded in 364ms
INFO: ANALYSIS SUCCESSFUL, you can find the results at: https://sonarcloud.io/dashboard?id=test-dot-folder
INFO: Note that you will be able to access the updated dashboard once the server has processed the submitted analysis report
INFO: More about the report processing at https://sonarcloud.io/api/ce/task?id=AYCYkVOIhkHfatiQXPSa
INFO: Analysis total time: 50.101 s

INFO: Analysis report generated in 875ms, dir size=1 MB
INFO: Analysis report compressed in 530ms, zip size=699 KB
INFO: Analysis report uploaded in 1784ms
INFO: ANALYSIS SUCCESSFUL, you can find the results at: https://sonarcloud.io/dashboard?id=macro-pfa_pfa&pullRequest=1743
INFO: Note that you will be able to access the updated dashboard once the server has processed the submitted analysis report
INFO: More about the report processing at https://sonarcloud.io/api/ce/task?id=AYE_eeAkvf3bE_eBbQhJ
INFO: Analysis total time: 4:43.180 s
INFO: ------------------------------------------------------------------------
INFO: EXECUTION SUCCESS
INFO: ------------------------------------------------------------------------
INFO: Total time: 5:23.318s
INFO: Final Memory: 154M/517M
INFO: ------------------------------------------------------------------------
Finishing: SonarCloud Run Code Analysis

It looks like in this case, a pull request was analyzed.

Was this build run in the context of a pull request?
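One way to check is to drop a temporary debug step into the pipeline. The SonarCloud tasks pick between branch analysis and pull request analysis based on the build context, which the predefined Azure DevOps variables below expose. A minimal sketch (the step name is just a suggestion):

```yaml
# Temporary debug step: print the variables that determine the analysis mode.
# Build.Reason is 'PullRequest' for PR validation builds (=> PR analysis);
# Build.SourceBranch is e.g. 'refs/heads/develop' for CI builds (=> branch analysis).
- bash: |
    echo "Build.Reason:       $(Build.Reason)"
    echo "Build.SourceBranch: $(Build.SourceBranch)"
  displayName: Debug SonarCloud Analysis Mode
```

If Build.Reason reads PullRequest, the scanner will report to the pull request, not to develop.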

You would need to manually run the pipeline (including the SonarCloudAnalyze step) on the develop branch, or merge your pull request into develop so that a build is triggered on your main branch.
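If you do not want to wait for a merge, the Azure DevOps CLI can queue a run on develop directly. A minimal sketch, assuming the `azure-devops` CLI extension is installed and an authenticated session exists; `pfa-ci` is a hypothetical placeholder for your actual pipeline name:

```shell
#!/bin/sh
# Queue a CI run of the pipeline on the develop branch so SonarCloud
# performs a branch analysis of develop.
# 'pfa-ci' is a hypothetical pipeline name -- substitute your own.
PIPELINE_NAME="pfa-ci"
BRANCH="develop"
CMD="az pipelines run --name ${PIPELINE_NAME} --branch ${BRANCH}"
# Echo instead of executing so the command can be reviewed first;
# running it for real requires 'az devops login' (or a PAT) and
# a configured default organization/project.
echo "${CMD}"
```

You can also queue the run from the Azure DevOps web UI ("Run pipeline") and pick develop as the branch.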

I will update this thread when we merge to develop at the end of our Sprint.

The default branch has now been scanned. I had to merge into the develop branch. Issue resolved.
