[NEW RELEASE] 42Crunch REST API Static Security Testing 1.0.0

Hi!

We’ve released the first version of our SonarQube plugin “42Crunch REST API Static Security Testing” and would like to have it included in the Marketplace.

The plugin is powered by 42Crunch API Contract Security Audit. Security Audit performs a static analysis of the API definition that includes more than 200 checks on best practices and potential vulnerabilities on how the API defines authentication, authorization, transport, and data coming in and going out.

SonarQube Versions: [7.9, 8.5*]
Download URL: https://github.com/42Crunch/sonarqube-plugin/releases/download/v1.0.0/sonar-42crunch-plugin-1.0.jar
Changelog: https://github.com/42Crunch/sonarqube-plugin/releases/tag/v1.0.0
SonarCloud: https://sonarcloud.io/dashboard?id=42Crunch_sonarqube-plugin
PR: https://github.com/SonarSource/sonar-update-center-properties/pull/158

Please let me know if any more information regarding the plugin is required!

Best Regards,
Anton


Hi Anton,

Welcome to the community!

I’ll need some guidance on how to test this. From reading the link to the underlying technology, it looks like you produce an API definition file for your project and the API Contract Security Audit processes that & produces a report. What’s not clear to me is whether your plugin runs the tool to produce and consume that report or whether you pass the report into analysis. Either way, I’ll need you to point me to a project with all the necessary files in place for testing.

 
Ann

Hi Ann!

Please see this repository with OpenAPI examples, which you can use for testing: https://github.com/42Crunch/openapi-examples

When our plugin runs, it finds all OpenAPI files in the project (OpenAPI files are JSON or YAML files in a particular format) and sends them to our servers for analysis. Then all issues found during the analysis are reported to SonarQube.

To run the test, a free account on our platform is required. You can get one at https://platform.42crunch.com/login

Once you’ve signed up for the account, please follow these steps to configure the plugin in SonarQube https://docs.42crunch.com/latest/content/tasks/integrate_sonarqube.htm

Regards,
Anton

Hi Anton,

It may be a few days before I come back to this.

 
Ann

Thanks a lot! In any case please let me know if there is anything else you might require.

Anton

Hi Ann!

Did you have a chance to have a look at the plugin yet?

Regards,
Anton

Hi Anton,

Sorry. Not yet.

 
Ann

Can I help somehow to speed it up? Perhaps I could send you an authentication token to skip a few steps where you’d need to register, etc.?

Anton

Hi Anton,

Any advance setup you can provide me will be much appreciated (I’d probably have ended up asking you for it anyway).

The thing is that

  • this isn’t my primary responsibility (it’s almost like a work-hobby for me)
  • I have a few other things that are top of mind right now.

I’m sorry it’s taking a while to get to this. Hopefully I can get to this in the next few days.

 
Ann

I appreciate your help Ann! Is there an email address I could send the auth token to? I’d rather not post it on the public forum.

Anton

Hi Anton,

I sent you a private message last week. Did you get it?

Also, could you expand a little on why I need this? And will you provide the same to any other community member who wants to test?

 
Ann

Hi Ann,

I didn’t notice there was a private message! Thanks for following up, I’ve emailed you the token.

So, regarding the test: you don’t absolutely need this token to do it. You could have followed the steps outlined in the documentation, which means creating a free account on our platform and then creating the token yourself, but I hoped to save you a bit of time by creating one for you.

Now, the steps to configure the plugin in SonarQube are described in the docs as well, but the gist of it is as follows:

  1. Copy the plugin jar to extensions/plugins, and restart SonarQube

  2. Go to the Administration -> OpenAPI tab in the SonarQube Web UI and configure the API token

  3. Clone https://github.com/42Crunch/openapi-examples, create a SonarQube project for it, and run the analysis (I’ve been using sonar-scanner for it):

    sonar-scanner \
      -Dsonar.projectKey=openapi-examples \
      -Dsonar.sources=. \
      -Dsonar.host.url=http://localhost:9000 \
      -Dsonar.login=<…>

Regards,
Anton

Hi Ann!

Did you have a chance to look at it yet? I guess you’re busy, but I thought I’d ping you just in case!

Regards,
Anton

Hi Anton,

Sorry. I’ve been super busy, but I haven’t forgotten you.

 
Ann

Thanks Ann!

Anton

Hi Anton,

I’ve finally had a moment to look at this, and to some degree I don’t understand what I’m seeing.

  • It appears that the same issues are raised in both the json & yaml files…?
  • You’ve added metrics but I suspect not aggregated them correctly. For the test project you provided, I see an Audit Score of 13 for petstore.json and the same for petstore.yaml. Aaaand a total of 13 for the project. That’s not how it’s supposed to work.
  • What’s the difference supposed to be between Audit Score and Audit Score (data)?
  • And for that matter, what is Audit Score (security) supposed to be? I get a 0 on that.
  • I’m seeing Lines of Code reported for the json and yaml files, but no rules for them. This makes me wonder several things.
    • It seems that you’ve declared an “OpenAPI” language…?
    • Without giving the user the ability to manage its file extensions…? …Okay, I finally read the rest of the docs and found this under OpenAPI -> OpenAPI file suffixes. As you may have guessed by now this is not where I expected to find them. I’m assuming it will be the same for other users
    • How do users control what rules are applied?
  • In fact, on closer examination, I see that step 3 in your docs under “Fine-tune the plugin” is to exclude json & yaml files you don’t want uploaded. Aaand it’s not super clear from the docs that this is a specific configuration under OpenAPI rather than a garden-variety exclusion.
  • How will your plugin interact if the YAML analyzer already in the Marketplace is loaded?

Closing the topic of “things I don’t understand” and moving on…

  • I guess 42crunch is a commercial enterprise. Cool. So is SonarSource. But your plugin readme doesn’t indicate that. It only talks about creating a free account.

  • IMO your README should also be upfront about the fact that project files are uploaded to, processed on, and subsequently stored on an external server

  • The documentation linked to from the readme lists as step 5: “Run SonarQube”. Since SonarQube has to be running to do steps 2-4, I guess you mean “Run analysis”?

  • I see this in my analysis log:

    INFO: Uploading file for security audit: v3.0/petstore.json
    INFO: Uploading file for security audit: v3.0/petstore.yaml
    INFO: Retrieving audit results for: v3.0/petstore.json
    INFO: Retrieving audit results for: v3.0/petstore.yaml

    What happens to analysis if the network is slow or the server unavailable?

  • The documentation says

    By default, the plugin is automatically applied to each SonarQube project on the server.

    That implies

    • there’s a way to make it not apply…? (Or is this the exclusions thing?)
    • this will happen immediately/automatically (the way it’s currently written) when what I think you mean is that each project’s files will be automatically uploaded and processed during analysis
  • I see that analysis is not impeded if the json/yaml files don’t exist. :+1:

  • TBH, I’m a bit confounded by your approach. You’re picking up one or two specific files for parsing in order to raise issues based on their contents. This is the general description of a report-parsing plugin, yet you have structured this like a language analyzer. IMO, you should reconsider your approach: instead of grabbing everything that’s not excluded, set a default “report path” and allow the user to override that with configs, similar to how the JaCoCo plugin works. You can still report issues on files you don’t “own” by extension. Java analysis does that for pom.xml files.
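To make the suggestion concrete, a report-path configuration could look something like this sketch (Python for illustration only; the property name and default location are hypothetical, in the spirit of JaCoCo’s sonar.coverage.jacoco.xmlReportPaths):

```python
from pathlib import Path

# Hypothetical default, overridable by the user via an analysis property
# such as -Dsonar.openapi.reportPaths=reports/*.json (the name is made up).
DEFAULT_REPORT_PATHS = "audit-reports/*.json"

def resolve_report_paths(project_root, prop_value=None):
    """Resolve a comma-separated list of glob patterns against the project
    root, falling back to the default location when no override is given."""
    patterns = (prop_value or DEFAULT_REPORT_PATHS).split(",")
    found = []
    for pattern in patterns:
        found.extend(sorted(Path(project_root).glob(pattern.strip())))
    return found
```

The point is that the user opts files in via an explicit path, rather than the plugin claiming every .json/.yaml file by extension.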

 
Ann

Hi Ann!

Thanks a lot for taking the time to write a detailed response!

I’ll go and request changes to the docs as per the points you raised, but please let me know if anything else is required besides that.

It appears that the same issues are raised in both the json & yaml files…?

Indeed, the content of the files is the same, although one is in JSON format and the other in YAML. My intention in providing samples in two different formats is to demonstrate the plugin handling both, while keeping the test repo small. I’ve added a new OpenAPI example, “pixi.json”, to the repo.

You’ve added metrics but I suspect not aggregated them correctly. For the test project you provided, I see an Audit Score of 13 for petstore.json and the same for petstore.yaml. Aaaand a total of 13 for the project. That’s not how it’s supposed to work.

What’s the difference supposed to be between Audit Score and Audit Score (data)?

And for that matter, what is Audit Score (security) supposed to be? I get a 0 on that.

So, our Audit Score is a metric which can reach a maximum of 100 points, and is reduced for every issue encountered in the OpenAPI file. It is split into two sections, “Security analysis” and “Data validation”, which are independent and are summed to calculate the total Audit Score. More details

These are the scores that our platform provides, and when implementing the SonarQube plugin, we decided to report each one as its own metric for every scanned file, and then pick the lowest one for the project.

The reason for doing that is to allow the user to configure a Quality Gate to fail if any OpenAPI file with a score below a certain threshold is encountered.

The “pixi.json” sample I’ve added should get 26 points for the security score, 69 points for data, and a total score of 95.
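As a concrete sketch of that aggregation (illustrative Python, not the actual plugin code; the per-file numbers mirror the samples discussed in this thread):

```python
def project_metrics(per_file):
    """Project-level value for each metric is the *lowest* per-file value,
    so a Quality Gate can fail as soon as any single OpenAPI file scores
    below the configured threshold."""
    metrics = ("security", "data", "total")
    return {m: min(scores[m] for scores in per_file.values())
            for m in metrics}

# Per-file scores: total = security + data, with a maximum of 100.
per_file = {
    "petstore.json": {"security": 0, "data": 13, "total": 13},
    "petstore.yaml": {"security": 0, "data": 13, "total": 13},
    "pixi.json":     {"security": 26, "data": 69, "total": 95},
}

print(project_metrics(per_file))  # the lowest value of each metric wins
```

With these samples the project-level Audit Score comes out as 13, which matches what you observed.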

I’m seeing Lines of Code reported for the json and yaml files, but no rules for them. This makes me wonder several things.

It seems that you’ve declared an “OpenAPI” language…?

Without giving the user the ability to manage its file extensions…? …Okay, I finally read the rest of the docs and found this under OpenAPI → OpenAPI file suffixes. As you may have guessed by now this is not where I expected to find them. I’m assuming it will be the same for other users

How do users control what rules are applied?

So, there are a few points to address: OpenAPI files typically come as JSON or YAML files with .json, .yaml, or .yml extensions. However, many other file types share these extensions, so in order to find OpenAPI files in the project we have to examine every file with one of these extensions.

Each file found to have OpenAPI content is uploaded to the 42Crunch platform for analysis, and the results are reported to SonarQube.
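To illustrate the idea, that detection step could be sketched roughly like this (an illustrative Python sketch, not the plugin’s actual implementation; all names here are made up):

```python
import json
import re
from pathlib import Path

CANDIDATE_SUFFIXES = {".json", ".yaml", ".yml"}

# A top-level "openapi:" or "swagger:" key at the start of a line is a
# strong hint that a YAML file is an OpenAPI definition.
YAML_MARKER = re.compile(r'^(["\']?)(openapi|swagger)\1\s*:', re.MULTILINE)

def looks_like_openapi(path: Path) -> bool:
    """Cheap content sniff: real OpenAPI files declare a top-level
    'openapi' (3.x) or 'swagger' (2.0) version field."""
    text = path.read_text(encoding="utf-8", errors="ignore")
    if path.suffix == ".json":
        try:
            doc = json.loads(text)
        except json.JSONDecodeError:
            return False
        return isinstance(doc, dict) and ("openapi" in doc or "swagger" in doc)
    return bool(YAML_MARKER.search(text))

def find_openapi_files(root: Path):
    """Examine every .json/.yaml/.yml file, keep only OpenAPI content."""
    return [p for p in sorted(root.rglob("*"))
            if p.suffix in CANDIDATE_SUFFIXES and looks_like_openapi(p)]
```

Only the files that pass the sniff would then be uploaded for the audit; a package.json or CI config with the same extension is skipped.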

The extensions and the exclusions for the search are managed under the “OpenAPI” tab, as you’ve found. You mentioned that it’s not where you expected to find them. Could you be more specific? Originally I wanted to place it under Languages → OpenAPI, but it seemed that the languages list was hardcoded in the UI (please correct me if I’m wrong), so it went into its own tab.

In fact, on closer examination, I see that step 3 in your docs under “Fine-tune the plugin” is to exclude json & yaml files you don’t want uploaded. Aaand it’s not super clear from the docs that this is a specific configuration under OpenAPI rather than a garden-variety exclusion.

Yeah, I agree that the docs focus mainly on the OpenAPI tab controls. I’ll update them to talk more specifically about OpenAPI → Excluded filepaths.

How will your plugin interact if the YAML analyzer already in the Marketplace is loaded?

It should not interact with it: the YAML analyzer provides generic YAML checks (for syntax errors or styling issues) and our plugin focuses specifically on OpenAPI, so there is hardly any overlap.

I guess 42crunch is a commercial enterprise. Cool. So is SonarSource. But your plugin readme doesn’t indicate that. It only talks about creating a free account.

IMO your README should also be upfront about the fact that project files are uploaded to, processed on, and subsequently stored on an external server

Is there anything in particular you would like me to add there? A prerequisite to using the plugin is to sign up for an account, which shows the T&Cs, data privacy policy, etc.

The documentation linked to from the readme lists as step 5: “Run SonarQube”. Since SonarQube has to be running to do steps 2-4, I guess you mean “Run analysis”?

Yes, thanks for spotting this! We’ll update the docs.

What happens to analysis if the network is slow or the server unavailable?

If the network is slow, the analysis will take longer to complete. In case of network failure, the analysis will fail as well.

The documentation says: By default, the plugin is automatically applied to each SonarQube project on the server.

there’s a way to make it not apply…? (Or is this the exclusions thing?)

this will happen immediately/automatically (the way it’s currently written) when what I think you mean is that each project’s files will be automatically uploaded and processed during analysis

The intention of this part of the documentation is to communicate that there are no specific steps needed to enable the plugin for any SonarQube project (besides copying the plugin jar to extensions/plugins).

I guess this is normal behaviour for any SonarQube plugin, and there is no real need to stress it.

As with the “Run analysis” step, I’ll correct the documentation to reflect this.

TBH, I’m a bit confounded by your approach. You’re picking up one or two specific files for parsing in order to raise issues based on their contents. This is the general description of a report-parsing plugin, yet you have structured this like a language analyzer. IMO, you should reconsider your approach: instead of grabbing everything that’s not excluded, set a default “report path” and allow the user to override that with configs, similar to how the JaCoCo plugin works. You can still report issues on files you don’t “own” by extension. Java analysis does that for pom.xml files.

I’m not sure I agree with that. My understanding is that report-parsing plugins consume a file produced by some external tool (code coverage, a linter, whatnot), typically saved in a well-known location, and communicate its contents to SonarQube.

In our case the location and the number of OpenAPI files in the project are not known, and in some cases there are projects with tens or hundreds of OpenAPI files in them.

Regards,
Anton

Hi,

When I looked at this yesterday, I didn’t understand what I was dealing with. I think I’ve struggled through to a better understanding today. On that basis, some points from yesterday I’m just going to omit going forward.


In the SonarQube context, I personally find the way these metrics interact terribly confusing. The two sub-metrics add up to a ratings-style worst-wins value. As someone steeped in the way metrics work in SonarQube this seems very counter-intuitive.

There are a couple other things I’m struggling with here. I’ll go in order of increasing importance.

First, as a minor point I don’t understand why the Collection name is set globally and not also at project level.

Also, from what I can see at the moment, your rule titles and rule messages are nearly identical, which is sub-optimal; they should build.

I expected the configuration languages dropdown to be populated with the languages declared in the instance, but it does seem to be hardcoded. I deduce that because I’ve just loaded the YAML plugin, and it doesn’t show up in the list of languages either.

Now that I have YAML loaded I get an error during analysis:

ERROR: Error during SonarQube Scanner execution
ERROR: Language of file 'v3.0/petstore.yaml' can not be decided as the file matches patterns of both sonar.lang.patterns.openapi : **/*.json,**/*.yaml,**/*.yml and sonar.lang.patterns.yaml : **/*.yaml,**/*.yml
ERROR: 
ERROR: Re-run SonarQube Scanner using the -X switch to enable full debug logging.

So it appears that you did declare a language and that you’re claiming the .yaml, .yml, and .json extensions for it.

That means I have a problem with your exclusion configs. You’ve claimed the extensions, so no one else is going to analyze the files if you don’t. Files that shouldn’t be analyzed by you shouldn’t be analyzed at all - i.e. a normal exclusion can be used for this. You don’t need to declare your own configs.

Further, I’m struggling with the fact that you declare a language that claims extensions already claimed by another actively-managed analyzer in the Marketplace. But okay. I guess users can manage that. (Altho you could just as easily declare a dependency on the YAML plugin.)

The really hard thing for me is that you’re uploading the files in question, perhaps tens or hundreds you say, to a 3rd-party server. I’m really thinking that analysis needs to take place locally.

 
Ann

Hi Ann!

In the SonarQube context, I personally find the way these metrics interact terribly confusing. The two sub-metrics add up to a ratings-style worst-wins value. As someone steeped in the way metrics work in SonarQube this seems very counter-intuitive.

I guess the closest in SonarQube terms would be a percent-based metric? “Audit Score” is basically a percentage indicating the quality of an OpenAPI file, i.e. a score of 100 indicates a very good OpenAPI file, while lesser scores indicate a not-so-good one.

When implementing the plugin we didn’t use percentages, because we don’t describe our scores in this manner, and also there is an issue with the Data/Security scores, which don’t individually go up to 100.

First, as a minor point I don’t understand why the Collection name is set globally and not also at project level.

Consider it a quirk of the implementation: we need to specify a collection name to upload APIs to, but the results of the analysis are reported to SonarQube in full, so there is rarely a need to view the collection itself. We’re changing our APIs at the moment to eliminate the need to specify a collection name at all.

Also, from what I can see at the moment, your rule titles and rule messages are nearly identical, which is sub-optimal; they should build.

What do you mean by “they should build”? In general our rule titles give a generic description of the issue, and the messages tend to provide a more precise description.

For example, there is one with the title:

"Array schema has no maximum number of items defined"

And the message when analysis is run on a particular OpenAPI file:

"Array 'deleteOptions.dryRun' has no maximum number of items defined"

Although, there are many cases where the title is the same as the message.

So it appears that you did declare a language and that you’re claiming the .yaml, .yml, and .json extensions for it.

That means I have a problem with your exclusion configs. You’ve claimed the extensions, so no one else is going to analyze the files if you don’t. Files that shouldn’t be analyzed by you shouldn’t be analyzed at all - i.e. a normal exclusion can be used for this. You don’t need to declare your own configs.

I hadn’t tested with the YAML plugin before, but I’ve now tried it and I see that analysis fails when both plugins are installed. My assumption was that SonarQube would execute both plugins on .yaml files, but that’s not the case.

Unfortunately, JSON and YAML are both very generic data formats which can be used for nearly anything.

For example, YAML is used for Kubernetes configs, GitHub Actions workflows, MS Azure declarative pipelines and many others, and most of these use .yaml or .yml file extensions.

It would be great if SonarQube had a way to determine the language of a file by some additional means, not only by inspecting the file extension, but I don’t think anything like that is implemented in SonarQube now?

I think this sort of conflict can only be resolved by the user choosing to install one of the plugins and uninstall the other.

Further, I’m struggling with the fact that you declare a language that claims extensions already claimed by another actively-managed analyzer in the Marketplace. But okay. I guess users can manage that. (Altho you could just as easily declare a dependency on the YAML plugin.)

I haven’t seen any examples of SonarQube plugins depending on other plugins? I don’t really like the situation where installing one plugin or the other can fail the analysis, but I haven’t seen a way to work around it.

The really hard thing for me is that you’re uploading the files in question, perhaps tens or hundreds you say, to a 3rd-party server. I’m really thinking that analysis needs to take place locally.

Well, this is the way our analysis is implemented - we run it remotely so we won’t have to re-implement it for every platform we integrate with, and we do have quite a few integrations: Jenkins, Bitbucket, Bamboo, GitHub Actions, IntelliJ and VS Code, to name a few.

I’m the author of the plugin, and I would very much like to see it published :slight_smile: I’m not a SonarQube expert though, and I’m very willing to tweak the plugin to get it approved. However, there are a few things which are not under my control - the need to upload files for analysis and the scoring mechanism.

Regards,
Anton

Hi Anton,

After some internal discussions, I’m coming back to you with a refusal based on the upload of files to your 3rd-party server. If you’d like to come back with a version of the plugin that imports a report generated by your server, we’ll be happy to consider it.

 
Ann
