Does SonarCloud support a pattern that would allow submitting an arbitrary codebase together with externally identified issues (e.g. in CheckStyle, Code Climate, SARIF, or generic issue format)?
To elaborate on my use case, I have a repo containing OpenAPI specs (YAML) and user documentation (in MD/MDX format). I’m linting these with external tools: in this case redocly for the OpenAPI specs and Vale for the Markdown prose.
I would really like to use SonarCloud to track these issues and gate merges, but even with sonar.sources set to ., none of the files are visible, and submitting issues via sonar.java.checkstyle.reportPaths resulted in no visible Code Smells (maybe that’s to be expected, since I’m not analyzing Java code, just using CheckStyle as a generic output format).
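For reference, my sonar-project.properties currently looks roughly like this (the project key, organization, and report path are placeholders):

```properties
# Placeholders for the real key/organization
sonar.projectKey=my-org_api-docs
sonar.organization=my-org
# Pick up everything in the repo
sonar.sources=.
# My (apparently ineffective) attempt at importing the external issues
sonar.java.checkstyle.reportPaths=reports/checkstyle-issues.xml
```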
For now I’m using GitLab’s Code Quality functionality as a workaround, but it is significantly less capable when it comes to gating merges, and I’d love to be able to report on issues across our codebase from one place.
Ehm… not an arbitrary one. SonarCloud supports 30-something languages. If your codebase is not in one of them, then you’re out of luck.
We don’t offer direct support for md and mdx files, and I suspect the JS/TS parser is going to barf pretty hard on mdx files, but it might be worth trying. Edit the project’s file extensions (Administration → General Settings → Languages → [language]) to add your extensions to the target language of choice, and then the files should be picked up by the analysis.
From there, you can import the third-party issue reports that are recognized for that language. SARIF and the generic format are recognized for all languages, but CheckStyle is only supported for Go and Java.
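For the generic format, you point sonar.externalIssuesReportPaths at a JSON report. A minimal example using your Vale case (engineId and ruleId are free-form, and filePath must point at a file that is part of the analysis):

```json
{
  "issues": [
    {
      "engineId": "vale",
      "ruleId": "Microsoft.Passive",
      "severity": "MINOR",
      "type": "CODE_SMELL",
      "primaryLocation": {
        "message": "Avoid passive voice.",
        "filePath": "docs/getting-started.md",
        "textRange": { "startLine": 12 }
      }
    }
  ]
}
```

Then add sonar.externalIssuesReportPaths=vale-report.json to your analysis parameters.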
Thanks for the reply; it does help clarify things, although I think it means I won’t be able to use SonarCloud to track quality the way I’d hoped for this particular project. In particular, prose linting would require being able to associate Markdown/text files with a language, and I don’t see an appropriate choice.
It’s odd that my OpenAPI specs aren’t being scanned, as they are pure .yaml files: despite running a branch analysis (not just a PR), the only YAML file that has been imported is a GitLab CI template in a cicd subdirectory.
That said, I’m not sure debugging this much further will be of much value, as I’d still be missing any Markdown issues and wouldn’t get a unified view. What I was looking for seems pretty far from Sonar’s current direction, i.e.:
- Import “generic” or “text” sources
- Import issues from an external analyzer, using a generic format
  - Ideally even formats like CheckStyle could be used here, since the format isn’t in itself language-dependent: it just ties a leveled warning to a location within a file
- Use these imported issues in Sonar to block CI/CD pipelines
We’re working with GitLab as our SCM provider, so it sounds like I’ll have to handle a lot of this on that side; I just liked the idea of submitting the analyses to Sonar to centralize quality gates and reporting.
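For completeness, the kind of gating I have in mind on the GitLab side is a job that runs the scanner and fails on the quality gate; a sketch, where the job name and image tag are illustrative:

```yaml
sonarcloud-check:
  image:
    name: sonarsource/sonar-scanner-cli:latest
    entrypoint: [""]
  # SONAR_TOKEN and SONAR_HOST_URL come from CI/CD variables
  script:
    # Wait for the quality gate result and fail the job
    # (and thereby block the MR) when the gate fails
    - sonar-scanner -Dsonar.qualitygate.wait=true
```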
We are considering OpenAPI specs as something we could check natively.
What do you care about most when linting OpenAPI specs? Only correctness and common errors? More?
Also, what is your motivation for linting your Markdown? Do you have specific constraints, or is it just because you and your team care about well-written prose?
Perhaps you can talk a bit about how this fits into your team workflow?
FYI, we scan YAML files since we support several Infrastructure as Code tools that use them. We also support scanning any JavaScript code that might be embedded in those files. If you want to stop Sonar from looking at those files, you can add them to sonar.exclusions in your configuration.
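For example, something like this would keep the CI template you mentioned out of the analysis scope (adjust the glob to your layout):

```properties
# Don't analyze the GitLab CI templates
sonar.exclusions=cicd/**
```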
Yep, happy to give any more information here that would be helpful. Our linting of OpenAPI specs is twofold:

- We use redocly to check for schema violations and the like. Their recommended rules are documented here: Recommended ruleset
- In addition to that, I run prose linting (Vale) on the title, description and summary fields; our rules are pretty closely based on the Microsoft Style Guide. I have to do some jiggery-pokery to extract these fields into separate Markdown files to feed into Vale, then transform its output to correlate the warnings back to the original code locations (rough sketch below).
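To give a flavour of that jiggery-pokery, the extraction step looks something like this; the field names match our specs, but the paths and the location bookkeeping are simplified:

```python
# Sketch: pull prose-bearing fields out of an OpenAPI spec for Vale.
# Simplified: the real script also records line numbers so that Vale
# warnings can be correlated back to locations in the YAML source.
import pathlib
import yaml

PROSE_FIELDS = {"title", "summary", "description"}

def extract_prose(node, path=()):
    """Yield (location, text) pairs for every prose-bearing field."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key in PROSE_FIELDS and isinstance(value, str):
                yield ".".join((*path, key)), value
            else:
                yield from extract_prose(value, (*path, key))
    elif isinstance(node, list):
        for i, item in enumerate(node):
            yield from extract_prose(item, (*path, str(i)))

spec = yaml.safe_load(pathlib.Path("openapi.yaml").read_text())
out_dir = pathlib.Path("vale-input")
out_dir.mkdir(exist_ok=True)
for location, text in extract_prose(spec):
    # Escape "/" in path keys (e.g. "/pets/{id}") so the location
    # can double as a flat filename for the per-field Markdown file
    safe_name = location.replace("/", "~1")
    (out_dir / f"{safe_name}.md").write_text(text + "\n")
```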
This hopefully answers your Markdown question too: the process is the same there, but simpler, as I can just lint the files directly.
It also gives context to my original request: I can see you saying that the first bullet above fits squarely into SonarCloud’s wheelhouse, and that you’d want to implement versions of those rules yourselves.
Prose linting feels like less your concern, and it would be really hard to determine which files should be subjected to it. My sweet spot, however, would be to run it externally and use Sonar to track the warnings against their file locations, make them visible everywhere, and block merges based on them.
In short, Sonar becomes my central enforcer for any quality concerns, even if they’re being diagnosed by external tools.
I’m doing this in GitLab via its Code Quality functionality at the moment (which can import issues from any tool, e.g. in Code Climate format; an example entry is below), but it sucks for blocking based on the specific issues introduced in a branch/MR.
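For reference, the report entries GitLab’s Code Quality ingests look like this (a single Vale-derived warning, with an illustrative fingerprint):

```json
[
  {
    "description": "Avoid passive voice.",
    "check_name": "Microsoft.Passive",
    "fingerprint": "d41d8cd98f00b204",
    "severity": "minor",
    "location": {
      "path": "docs/getting-started.md",
      "lines": { "begin": 12 }
    }
  }
]
```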
If you want any more info on our use cases, let me know! Excited to see what you come up with.