Hi team,
We are planning to integrate SonarQube into our Azure DevOps pipelines to analyze code written and executed in Azure Databricks notebooks. We managed to complete the initial setup. The problem is that functions imported from other notebooks are not recognized as such. Other notebooks are called via the %run magic command, and every imported function is flagged as a bug (saying the function is not defined).
Is there a way to configure the analysis so that functions imported from other notebooks via the %run command are recognized?
We tried Python Databricks notebooks with the Python Sonar scanner. The code smell detection worked pretty well.
The problem appeared when a function_x was defined in notebook_A but used in notebook_B, with %run notebook_A executed first in notebook_B. In that case, using function_x leads to a bug in notebook_B saying it is not defined (although it is defined and imported via notebook_A).
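For illustration, this is roughly what the analyzer sees when the notebooks are synced to the repository in Databricks' source (.py) format, where magic commands are stored as `# MAGIC` comments (notebook and function names here are placeholders):

```python
# notebook_A.py -- Databricks notebook source
def function_x(df):
    # placeholder body; the real helper is defined in notebook_A
    return df


# notebook_B.py -- Databricks notebook source
# MAGIC %run ./notebook_A

# From the Python analyzer's point of view, the %run line above is just a comment,
# so function_x looks undefined here and SonarQube raises a bug, even though
# Databricks executes notebook_A at runtime and the call actually succeeds.
result = function_x([1, 2, 3])
```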
Just for additional context:
The %run command is one of Databricks’ “magic commands”. The Databricks documentation says it is roughly equivalent to a Python import statement.
Databricks magic commands are currently not supported by our Python analyzer. I recommend simply disabling the corresponding rules in your Quality Profile.
Another solution would be to package the shared code as a Databricks library; the documentation says that a standard import works in that case (see the sketch below).
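A rough sketch of that second option, assuming the shared code is moved into a plain Python module that is installed on the cluster as a library (or kept alongside the notebooks in a Repo); module and function names are placeholders:

```python
# shared_helpers.py -- a regular Python module packaged as a library or kept in a Repo
def function_x(value):
    """Placeholder for the shared helper that used to live in notebook_A."""
    return value


# notebook_B
from shared_helpers import function_x  # standard Python import

result = function_x(42)
```

Because this is a normal import rather than a magic command, the analyzer can resolve function_x, so the "not defined" issues should no longer be raised.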
Yes, it is roughly equivalent to a Python import statement. I guess we will try your first suggestion and will have to live with the fact that truly undefined functions will not be detected by the automated code check.