Must-share information (formatted with Markdown):
- which versions are you using (SonarQube, Scanner, Plugin, and any relevant extension)
SonarQube Developer Edition 8.3.1 with the Python scanner
- what are you trying to achieve
Code quality analysis of PySpark code (running in Azure Databricks notebooks).
- what have you tried so far to achieve this
We set up a code quality check in SonarQube for a project written in PySpark, but we are getting errors such as:

`spark` is not defined

The same error is reported for constants defined in other files. Our PySpark code runs in Databricks, which provides the Spark library as a built-in (for example the `spark` session object), and our code relies on that built-in. SonarQube is not able to detect it and reports these errors. The attached screenshot shows one example. Please guide us on how to measure the code quality of PySpark code.
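To make the first issue concrete, here is a minimal sketch of the pattern (the table name and the fallback are illustrative, not our real code). In a Databricks notebook, `spark` is injected by the runtime, so the notebook itself never defines the name, which appears to be what SonarQube flags:

```python
# In a Databricks notebook, `spark` (a SparkSession) is injected by the
# runtime, so the notebook code never assigns it -- a static analyzer
# therefore sees an undefined name.
try:
    spark  # provided by the Databricks runtime at execution time
except NameError:
    # Illustrative fallback so the name is always defined; in a real
    # non-Databricks environment this would typically be
    # SparkSession.builder.getOrCreate() instead of None.
    spark = None

if spark is not None:
    # `sales` is a made-up table name for illustration only.
    df = spark.sql("SELECT * FROM sales")
```

Outside Databricks the `except` branch runs, so the snippet stays importable even where no Spark runtime exists.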
There is one more issue: we define some constants in a separate file, which is pulled into the main Python file, and those constants are used only in the main file.
For example, we have a constant called SQL that is defined in a separate file and referenced in the main file, and we get the same error as in the image above.
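Here is a minimal sketch of that layout (file names and the query are made up). Our understanding is that an explicit `import` lets SonarQube resolve the name, whereas pulling the file in with Databricks' `%run` only injects the name at runtime, leaving it invisible to the analyzer:

```python
# --- constants.py (illustrative) ---
SQL = "SELECT id, amount FROM sales"

# --- main.py (illustrative) ---
# With an explicit import, SonarQube can resolve the name:
#   from constants import SQL
# With Databricks' `%run ./constants`, SQL is injected at runtime only,
# so the analyzer reports "SQL is not defined".
print(SQL)
```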
Please help and let us know if you have any further questions.