Analyse embedded SQL statements within COBOL code using the DB2 catalog

  • Using SonarQube Enterprise Edition Version 9.9 (build 65466)
  • SonarQube deployed via zip file

Hi everyone,

I’m trying to use a DB2 catalog to analyse embedded SQL statements within COBOL code by adding the following configuration to my Sonar properties file:

sonar.cobol.sql.catalog.csv.path=path_to_my_csv_files
sonar.cobol.sql.catalog.defaultSchema=name_of_my_default_schema

As a result, I can see in the log that the catalog is being loaded:

(…)
INFO: Load metrics repository
INFO: Load metrics repository (done) | time=19ms
INFO: Loading database catalog from projectBaseDir/repo/db2
(…)

I intentionally added a SQL issue to one of my programs: a WHERE clause that does not use the table’s primary key. Nevertheless, it doesn’t seem that my embedded SQL statements are actually being analysed, since no additional issues were reported by SonarQube.
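
For illustration, here is a minimal sketch of the kind of statement I mean (table, column, and host-variable names are made up and simplified; in my real program the WHERE clause filters on a column that is not the primary key and has no index):

      *> CUSTOMER_NAME is not part of the primary key or any index on ORDERS
       EXEC SQL
           SELECT ORDER_DATE, ORDER_AMOUNT
             INTO :WS-ORDER-DATE, :WS-ORDER-AMOUNT
             FROM ORDERS
            WHERE CUSTOMER_NAME = :WS-CUSTOMER-NAME
       END-EXEC.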

I’m trying to understand precisely how this kind of analysis works so I can make effective use of it.

Has anyone ever managed to use it? What hints would you give so I can configure it correctly and validate that it works?

Here are the main lines of the scanner script that is executed:

#!/bin/bash 

# Append the authentication token to the scanner properties file
echo -e "\n sonar.login=$SONARQUBE_KEY " >> /root/sonar-project.properties

# Directory containing the checked-out sources to analyse
DIRECTORY="repo"

# Run the analysis from inside that directory
cd "$DIRECTORY"

# Launch the scanner with the COBOL and SQL catalog settings
sonar-scanner \
-Dsonar.projectBaseDir=projectBaseDir \
-Dsonar.host.url=$SONAR_HOST_URL \
-Dsonar.login="$SONARQUBE_KEY" \
-Dsonar.projectKey=com.desjardins.$COMPONENT_NAME:$REPO_NAME \
-Dsonar.links.scm=https://github.com/Repository_Web_Page.git \
-Dsonar.branch.name=$BRANCH_NAME \
-Dsonar.cobol.file.suffixes=cbl \
-Dsonar.cobol.copy.suffixes=cpy \
-Dsonar.c.file.suffixes=- \
-Dsonar.cpp.file.suffixes=- \
-Dsonar.objc.file.suffixes=- \
-Dsonar.pli.file.suffixes=- \
-Dsonar.lang.patterns.tsql="**/*.tsql" \
-Dsonar.lang.patterns.plsql="**/*.pks,**/*.pkb" \
-Dsonar.lang.patterns.sql="**/*.sql" \
-Dsonar.cobol.copy.directories=repo/.../COBOLC/ \
-Dsonar.cobol.sql.catalog.csv.path=path_to_my_csv_files \
-Dsonar.cobol.sql.catalog.defaultSchema=name_of_my_default_schema

Thank you very much for any solution or clarification.

Hi,

We don’t have any rule about a WHERE clause not using the primary key, but we do have one about index usage: “SELECT” statements should not lead to full table scans
Is that what you’re looking for?

There are multiple reasons that could explain why the rule doesn’t raise an issue where you expect it:

  • The analyzer doesn’t manage to parse the SQL query.
  • The analyzer doesn’t manage to resolve the table that is used in the query to a table in the catalog.
  • The table contains fewer than 1000 rows (that threshold can be changed with a parameter of the rule in the quality profile; see the query sketch after this list).
  • The rule is not activated in the quality profile.
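
For the second and third points, it can help to double-check what your catalog export actually contains for the table in question. As a sketch only (assuming a DB2 for z/OS catalog and made-up schema/table names), queries like these, run against the database you exported the CSV files from, show the row-count statistic and the indexes defined on the table:

-- Row-count statistic (CARDF stays at -1 until RUNSTATS has been run)
SELECT NAME, CARDF
  FROM SYSIBM.SYSTABLES
 WHERE CREATOR = 'MYSCHEMA'
   AND NAME = 'MYTABLE';

-- Indexes defined on the table
SELECT NAME, UNIQUERULE
  FROM SYSIBM.SYSINDEXES
 WHERE TBCREATOR = 'MYSCHEMA'
   AND TBNAME = 'MYTABLE';

If the statistics were never collected, or the row count is below the rule’s threshold, that alone could explain why no issue is raised.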

It would be easier to help you if you could share the code where you expect an issue.