Hi everyone!
I’m excited to announce a significant update: SonarQube MCP Server v1.2!
We’ve been listening to your feedback and focusing on making the integration between your LLMs and SonarQube smoother, faster, and more reliable. This release brings major improvements to how the server handles data, connects to remote instances, and manages resources.
Here is what you can expect in this version:
Performance & Architecture
- Faster Startup Times: We have optimized the initialization process. The server now avoids downloading analyzers unnecessarily, so it gets up and running much faster. We will keep improving startup time in future releases.
- Streamable HTTP Support: We added support for Streamable HTTP with remote authentication, enabling more flexible and secure connectivity options for your deployments. This also makes it possible for companies to host the MCP server internally.
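As a rough illustration, a remotely hosted Streamable HTTP server might be wired into an MCP client configuration along these lines. The URL, header, and exact key names here are placeholders for illustration only; consult your MCP client's documentation and the project README for the real values:

```json
{
  "mcpServers": {
    "sonarqube": {
      "type": "http",
      "url": "https://mcp.example-corp.internal/mcp",
      "headers": {
        "Authorization": "Bearer <your-token>"
      }
    }
  }
}
```

With a setup like this, the client talks to a centrally hosted server over HTTP instead of spawning a local process, which is what makes internal company-wide hosting practical.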
Better LLM Interaction
- Structured Output & Schema: We have implemented structured outputs and defined schemas. This is a game-changer for reliability: AI agents consuming our tools now know exactly what shape of data to expect, reducing hallucinations and parsing errors.
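To make that concrete, here is a made-up example of what a structured issue payload could look like (illustrative only; the field names and values are not the server's actual schema). Because the shape is declared up front, an agent can parse fields like severity and line deterministically instead of scraping free-form text:

```json
{
  "issues": [
    {
      "rule": "java:S2095",
      "severity": "BLOCKER",
      "component": "src/main/java/App.java",
      "line": 42,
      "message": "Use try-with-resources or close this resource."
    }
  ],
  "total": 1
}
```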
- Improved Tool Responses: The responses generated by the tools are now more concise and informative, helping the LLM maintain better context.
Fixes & Stability
- Enhanced Logging: We’ve improved logging across the board to help you debug connectivity or configuration issues more easily.
- Severities Parameter Fix: We resolved an issue where the severities tool parameter was not behaving as expected.
- General Bug Fixes: Various under-the-hood bug fixes to improve overall stability.
Getting Started
You can get the latest version via your preferred package manager or by pulling the updated Docker image. We recommend updating immediately to take advantage of the performance boosts.
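For a Docker-based local setup, an MCP client configuration typically looks something like the sketch below. The image name and environment variable names are assumptions for illustration; check the project README for the exact values your version expects:

```json
{
  "mcpServers": {
    "sonarqube": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "SONARQUBE_TOKEN",
        "-e", "SONARQUBE_URL",
        "mcp/sonarqube"
      ],
      "env": {
        "SONARQUBE_TOKEN": "<your-token>",
        "SONARQUBE_URL": "https://sonarqube.example.com"
      }
    }
  }
}
```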
If you find this project useful, please consider giving us a star on GitHub; it really helps support the team!
Happy coding!
Alexander