Want to create a Python script to analyze third-party GitHub repos

I am currently working on a project where I want to analyze third-party GitHub repos using SonarQube.

For this purpose I am trying to create a Python script which is passed a list of URLs for the repos; one by one, the script passes each URL to the SonarQube API and fetches the analysis metrics, all of which are then pushed to a table.

I have been able to analyze code which is on my local machine through the dashboard at localhost:9000, but that is a manual process and the code has to be on my local machine. Even if I go for the GitHub integration, the code has to be on my own GitHub.

Is it possible to analyze third-party code located on GitHub using SonarQube? I came across the following module:

But I am still a bit unclear as to how I can analyze third-party code.

Any leads would be greatly appreciated!


You’ll need to clone a repository locally (which surely you could also automate), and then initiate the scan.
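As a rough sketch, that clone-then-scan step could be automated along these lines. This assumes `git` and the `sonar-scanner` CLI are on the PATH and a SonarQube server is running at localhost:9000; the token is a placeholder you would generate in the SonarQube UI, and deriving the project key from the URL is just one possible convention:

```python
import subprocess
from pathlib import Path
from urllib.parse import urlparse

SONAR_HOST = "http://localhost:9000"  # assumed local SonarQube server
SONAR_TOKEN = "your-token-here"       # placeholder: generate a token in the SonarQube UI

def project_key_from_url(repo_url: str) -> str:
    """Derive a project key from the repo URL, e.g. '.../psf/requests.git' -> 'requests'."""
    return Path(urlparse(repo_url).path).stem

def clone_and_scan(repo_url: str, workdir: str = "repos") -> str:
    """Clone a GitHub repo (shallow) and run sonar-scanner against it.

    Returns the project key used for the scan, so the caller can later
    query the web API for this project's metrics.
    """
    key = project_key_from_url(repo_url)
    dest = Path(workdir) / key
    if not dest.exists():
        subprocess.run(
            ["git", "clone", "--depth", "1", repo_url, str(dest)],
            check=True,
        )
    subprocess.run(
        [
            "sonar-scanner",
            f"-Dsonar.projectKey={key}",
            "-Dsonar.sources=.",
            f"-Dsonar.host.url={SONAR_HOST}",
            f"-Dsonar.login={SONAR_TOKEN}",
        ],
        cwd=dest,
        check=True,
    )
    return key
```

Looping over your list of repo URLs and calling `clone_and_scan` on each one would give you the automated pipeline you described.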

Thanks Colin.

Just a follow up question.

If I use SonarCloud to analyze a repo and then get the metrics through the web API, I first have to push a YAML file and a configuration file into the repo on GitHub.

Is there any way I can get that file, or the contents of that file, through the SonarCloud web API? If so, I can then use the GitHub API to push those files to the repo.
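For the GitHub side of that idea, pushing a file into a repo can be done with GitHub's Contents API (`PUT /repos/{owner}/{repo}/contents/{path}`), which expects the file body base64-encoded. A minimal sketch using the `requests` library, where the token, owner/repo, and file contents are all placeholders:

```python
import base64
import requests

GITHUB_TOKEN = "your-github-token"  # placeholder: a personal access token with repo scope

def encode_content(text: str) -> str:
    """The Contents API requires the file body to be base64-encoded."""
    return base64.b64encode(text.encode()).decode()

def push_file(owner: str, repo: str, path: str, content: str, message: str) -> dict:
    """Create a file in a GitHub repo via the Contents API."""
    resp = requests.put(
        f"https://api.github.com/repos/{owner}/{repo}/contents/{path}",
        headers={
            "Authorization": f"token {GITHUB_TOKEN}",
            "Accept": "application/vnd.github+json",
        },
        json={"message": message, "content": encode_content(content)},
    )
    resp.raise_for_status()
    return resp.json()
```

Note that updating an existing file with this endpoint additionally requires the file's current blob SHA in the request body; the sketch above only covers creating a new file.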

Which YAML file are you talking about?

So if I have cloned a repo locally, what is the process for automatically scanning the code files and then obtaining a scan report as CSV or JSON?
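For the export step, one approach (assuming the project has already been scanned and the results are on a local SonarQube server at localhost:9000) is to query the `api/measures/component` web API endpoint for each project key and dump the rows yourself; SonarQube does not produce a CSV directly. The metric keys below are examples, not an exhaustive list:

```python
import csv
import json
import requests

SONAR_HOST = "http://localhost:9000"  # assumed local SonarQube server
METRICS = "ncloc,bugs,vulnerabilities,code_smells,coverage"  # example metric keys

def fetch_measures(project_key: str, auth=("admin", "admin")) -> dict:
    """Fetch the requested metrics for one already-scanned project
    from the SonarQube web API and flatten them into a single row."""
    resp = requests.get(
        f"{SONAR_HOST}/api/measures/component",
        params={"component": project_key, "metricKeys": METRICS},
        auth=auth,
    )
    resp.raise_for_status()
    component = resp.json()["component"]
    row = {"project": component["key"]}
    row.update({m["metric"]: m.get("value") for m in component["measures"]})
    return row

def write_reports(rows: list, stem: str = "sonar_report") -> None:
    """Dump the collected rows to both JSON and CSV files."""
    with open(f"{stem}.json", "w") as f:
        json.dump(rows, f, indent=2)
    # Union of keys across rows, since not every metric exists for every project.
    fieldnames = sorted({key for row in rows for key in row})
    with open(f"{stem}.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```

Calling `fetch_measures` for each project key from your scan loop, collecting the rows into a list, and finishing with `write_reports(rows)` would give you both the JSON and the CSV report in one pass.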