GitLab - SonarCloud Stopped Analysis (Last Analysis Was Quite a While Ago)

I have followed the steps as outlined here for the GitLab integration and can confirm that:

  • gitlab-ci.yml reflects what is written in the documentation
  • sonar-project.properties is also configured (a rough sketch is included right after this list)
  • the token has been configured in the CI/CD variables
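
For reference, the sonar-project.properties is along these lines (the project key and organization shown here are placeholders, not the real values):

sonar.projectKey=my-org_myapplicationname
sonar.organization=my-org
sonar.sources=.
sonar.host.url=https://sonarcloud.io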

I can also see from the following logs that the section that runs Sonar is being executed:

Running Sonar.....
Saving cache for successful job
00:01
Creating cache sonarcloud-mr-###-non_protected...
WARNING: .sonar/cache: no matching files. Ensure that the artifact path is relative to the working directory (/builds/project/appname) 
Uploading cache.zip to https://storage.googleapis.com/gitlab-com-runners-cache/project/#########/sonarcloud-mr-###-non_protected 
Created cache

I am running out of ideas as to why the SonarCloud project is not reflecting the analysis that should have been performed by the GitLab job.

Is there any area I need to look at?

It doesn’t quite look like SonarCloud analysis is running.

Can you post your gitlab-ci.yml file here?

Below is the gitlab-ci.yml file:

workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH == "dev"'
      variables:
        ENV: "dev"
        ENVIRONMENT_NAME: "MyApplicationName"
        APPLICATION_NAME: "MyApplicationName"
        ENVIRONMENT_URL: "https://myapplicationname.applicationnamedev.com/"
        BACKEND_URL: "https://api.applicationnamedev.com/"
        SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar" # Defines the location of the sonar analysis task cache
        GIT_DEPTH: "0" # Tells git to fetch all the branches of the project, required by the sonar analysis task
      when: always
    - if: '$CI_COMMIT_BRANCH == "qa"'
      variables:
        ENV: "qa"
        ENVIRONMENT_NAME: "MyApplicationName"
        APPLICATION_NAME: "MyApplicationName"
        ENVIRONMENT_URL: "https://myapplicationname.applicationnameqa.com/"
        BACKEND_URL: "https://api.applicationnameqa.com/"
      when: always
    - if: '$CI_COMMIT_BRANCH == "master"'
      variables:
        ENV: "prod"
        ENVIRONMENT_NAME: "MyApplicationName"
        APPLICATION_NAME: "MyApplicationName"
      when: always
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      variables:
        ENV: "mr"
        SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar" # Defines the location of the sonar analysis task cache
      when: always

# Ordered stages (ran sequentially)
# Note: backend must be deployed before frontend
stages:
  - install_deps
  - test
  - build
  - deploy-frontend
  - updateCache
  - backupProductionDB

#######################
#### JOB TEMPLATES ####
#######################

.test_codequality:
  stage: test
  allow_failure: true
  cache:
    key: "${CI_JOB_NAME}"
    paths:
      - .sonar/cache
  image:
    name: sonarsource/sonar-scanner-cli:latest
    entrypoint: [""]
  script:
    - cd frontend
    - ls ./
    - sonar-scanner

.test_e2e-template:
  stage: test
  image: cypress/browsers:node14.7.0-chrome84
  script:
    # run Cypress tests
    - cd frontend
    - SECONDS=0
    - npm install
    - echo "Completed installing dependencies in $SECONDS seconds"
    - SECONDS=0
    - npx cypress run --env password=$CYPRESS_USER_PASS,customer_admin_user=$CYPRESS_CUSTOMER_ADMIN_USER,customer_staff_user=$CYPRESS_CUSTOMER_STAFF_USER,customer_manager_user=$CYPRESS_CUSTOMER_MANAGER_USER,BACKEND_URL=$BACKEND_URL --config baseUrl=$ENVIRONMENT_URL
    - echo "Completed e2e in $SECONDS seconds"
  artifacts:
    when: always
    paths:
      - /builds/applicationdir/myapplicationname/frontend/cypress/videos/**/*.mp4
      - /builds/applicationdir/myapplicationname/frontend/cypress/screenshots/**/*.png
    expire_in: 1 day

#######################
######## JOBS  ########
#######################

# To test -> sudo gitlab-runner exec docker <section_name>

#test_gitlab_runner:
#  image: node:14
#  stage: build
#  script:
#    - echo hello

# Job to install dependencies (node_modules for the frontend)
install_dependencies_frontend:
  image: node:14
  stage: install_deps
  cache:
    key:
      files:
        - frontend/package-lock.json
    paths:
      - frontend/node_modules
    policy: pull-push
  script:
    - echo "Installing FRONTEND Dependencies....."
    - cd frontend
    # npm install runs to update package-lock.json
    - npm install
    # npm ci runs a clean install of the dependencies pinned in package-lock.json
    - npm ci

test_unit:
  image: node:14
  stage: test
  cache:
    key:
      files:
        - frontend/package-lock.json
    paths:
      - frontend/node_modules
    policy: pull
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "mr"'
      when: on_success
    - if: '$ENV == "dev" && $CI_PIPELINE_SOURCE == "schedule"'
      when: on_success
    - if: '$ENV == "qa" && $CI_PIPELINE_SOURCE == "schedule"'
      when: on_success
  script:
    - echo "Running UNIT TESTS....."
    - cd frontend
    - npm run test:unit
  artifacts:
    expire_in: 1 week
    when: on_success
    paths:
      - frontend/unitTestResults.json
      - frontend/coverage

test_branch:
  image: node:14
  stage: test
  cache:
    key:
      files:
        - frontend/package-lock.json
    paths:
      - frontend/node_modules
    policy: pull
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "mr" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "qa"'
      when: on_success
    - if: '$ENV == "mr" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"'
      when: on_success
  script:
    - echo "Running BRANCH TESTS....."
    - if [ "$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME" == "dev" ] && [ "$CI_MERGE_REQUEST_TARGET_BRANCH_NAME" == "qa" ]; then exit 0; fi
    - if [ "$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME" == "qa" ] && [ "$CI_MERGE_REQUEST_TARGET_BRANCH_NAME" == "master" ]; then exit 0; else exit 1; fi

test_e2e:
  extends: .test_e2e-template
  environment:
    name: $ENV
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'

test_codequality:
  extends: .test_codequality
  rules:
    - if: '$ENV == "mr"'
      when: on_success
    - if: '$ENV == "dev" && $CI_PIPELINE_SOURCE == "schedule"'
      when: on_success
    - if: '$ENV == "qa" && $CI_PIPELINE_SOURCE == "schedule"'
      when: on_success
  script:
    - echo "Running CODE QUALITY TEST....."

build_frontend:
  image: node:14
  stage: build
  cache:
    key:
      files:
        - frontend/package-lock.json
    paths:
      - frontend/node_modules
    policy: pull
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "mr"'
      when: on_success
    - if: '$ENV == "dev"'
      when: on_success
    - if: '$ENV == "qa"'
      when: on_success
    - if: '$ENV == "prod"'
      when: on_success
  script:
    - echo "Running Build FRONTEND....."
    - cd frontend
    - npm install
    - if [ "$ENV" == "dev" ]; then npm run build-pipe-dev; fi
    - if [ "$ENV" == "qa" ]  || [ "$CI_MERGE_REQUEST_TARGET_BRANCH_NAME" == "qa" ] || [ "$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME" == "dev" ]; then npm run build-pipe-qa; fi
    - if [ "$ENV" == "prod" ] || [ "$CI_MERGE_REQUEST_TARGET_BRANCH_NAME" == "master" ] || [ "$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME" == "qa" ] || [ "$CI_MERGE_REQUEST_TARGET_BRANCH_NAME" == "dev" ]; then npm run build-pipe-prod; fi
  # Artifacts make the folders listed available in the deployment stage (good for prod-frontend because it is not checked in)
  artifacts:
    expire_in: 1 week
    paths:
      - frontend/build

# job for backing up the backend.
backup_backend:
  image: ruby:2.5
  before_script:
    - apt-get update -qy
    - apt-get -y install zip unzip
  stage: build
  cache:
    key:
      files:
        - backend/package-lock.json
    paths:
      - backend/node_modules
    policy: pull
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "mr"'
      when: on_success
    - if: '$ENV == "dev" && $CI_PIPELINE_SOURCE != "schedule"'
      when: on_success
    - if: '$ENV == "qa"'
      when: on_success
    - if: '$ENV == "prod"'
      when: on_success
  script:
    - echo "Running Backup BACKEND....."
    - cd backend; zip -r ../backend.zip .
  # Artifacts make backend.zip available in later stages (it is not checked in)
  artifacts:
    expire_in: 1 week
    paths:
      - backend.zip

deploy_frontend:
  image: python:latest
  stage: deploy-frontend
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "dev" && $CI_PIPELINE_SOURCE != "schedule"'
      when: on_success
    - if: '$ENV == "qa" && $CI_PIPELINE_SOURCE != "schedule"'
      when: on_success
    - if: '$ENV == "prod"'
      when: on_success
  script:
    - echo "Running Deploy FRONTEND....."
    - pip install awscli
    - echo deploying frontend
    # Empty previous files from frontend bucket
    - aws s3 rm s3://$S3_FRONT_BUCKET_NAME1/ --recursive
    - aws s3 rm s3://$S3_FRONT_BUCKET_NAME2/ --recursive
    # Add new built files to frontend bucket
    - aws s3 cp ./frontend/build s3://$S3_FRONT_BUCKET_NAME1/ --recursive --include "*"
    - aws s3 cp ./frontend/build s3://$S3_FRONT_BUCKET_NAME2/ --recursive --include "*"
    - if [ "$ENV" == "prod" ]; then aws cloudfront create-invalidation --distribution-id *************** --paths "/*"; fi
  needs: ["build_frontend"]

backup_production_db:
  image: mongo:4.2.10
  stage: backupProductionDB
  environment:
    name: $ENV
  rules:
    - if: '$ENV == "prod"'
      when: on_success
  script:
    - label=`date +%Y_%m_%d_%H_%M_%S`
    - label="Backup_$label"
    - echo "${label}"
    - mongodump --forceTableScan --uri mongodb+srv://$MONGO_USER:$MONGO_PASS@myapplicationname.driff.mongodb.net/test
    - mongorestore --uri mongodb+srv://$MONGO_USER:$MONGO_PASS@myapplicationname.driff.mongodb.net --db $label dump/test

invalidate-files:
  stage: updateCache
  image:
    name: amazon/aws-cli
    entrypoint: [""]
  environment:
    name: $ENV
    url: myapplicationname.com
  rules:
    - if: '$ENV == "prod"'
      when: on_success
  script:
    - echo "This job invalidates the files so that the cache can update"
    - aws --version
    - aws cloudfront create-invalidation --distribution-id *************** --paths "/*"

I can also verify through the pipeline logs that the test_codequality job ran successfully, as shown below:

Pulling docker image sonarsource/sonar-scanner-cli:latest ...
Using docker image sha256:c5**********************************************************f1 for sonarsource/sonar-scanner-cli:latest with digest sonarsource/sonar-scanner-cli@sha256:4b***********************************************************dda ...
Preparing environment
00:03
Running on runner-x******o-project-########-concurrent-0 via runner-x******o-shared-1686144781-8ef056b5...
Getting source from Git repository
00:03
Fetching changes with git depth set to 50...
Initialized empty Git repository in /builds/applicationdir/applicationname/.git/
Created fresh repository.
Checking out 7463434f as detached HEAD (ref is refs/merge-requests/779/head)...
Skipping Git submodules setup
$ git remote set-url origin "${CI_REPOSITORY_URL}"
Restoring cache
01:05
Checking cache for test_codequality-6-non_protected...
Downloading cache.zip from https://storage.googleapis.com/gitlab-com-runners-cache/project/########/test_codequality-6-non_protected 
Successfully extracted cache
Executing "step_script" stage of the job script
00:01
Using docker image sha256:c5**********************************************************f1 for sonarsource/sonar-scanner-cli:latest with digest sonarsource/sonar-scanner-cli@sha256:4b***********************************************************dda ...
$ echo "Running CODE QUALITY TEST....."
Running CODE QUALITY TEST.....
Saving cache for successful job
00:01
Creating cache test_codequality-6-non_protected...
.sonar/cache: found 1082 matching artifact files and directories 
Archive is up to date!                             
Created cache
Cleaning up project directory and file based variables
00:00
Job succeeded

Let’s focus on this:

Do you see the output from ls ./ in your GitLab job output?

I was able to figure things out.

Essentially, the script in the job itself was overriding the script in the job template and preventing sonar-scanner from running, so removing the script from the job resolved the issue.
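
In other words, the job effectively looked like this (simplified from the file above):

test_codequality:
  extends: .test_codequality
  rules:
    - if: '$ENV == "mr"'
      when: on_success
  script:
    - echo "Running CODE QUALITY TEST....."   # replaces the template's script, so sonar-scanner never runs

Removing the script: key from the job lets the template's script (cd frontend, ls ./, sonar-scanner) run instead.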

Thanks @Colin for looking into this initially. Your latest response paved the way to discovering this.

