WARNING: Illegal reflective access by net.sf.cglib.core.ReflectUtils$1 on sonar-tsql-plugin.jar


I’m getting the below warnings when a T-SQL scan is triggered in SonarQube through Jenkins. Do these warnings cause any issues? Out of 9303 files, only 335 are getting analysed. Can anyone explain what the problem might be?

INFO: 8755 files indexed… (last one was Database/Procedures/usp_MoveIntervalFiles_SQLV2K.sql)
INFO: 9303 files indexed
INFO: Quality profile for tsql: Sonar way
INFO: ------------- Run sensors on module Phar_Sql
INFO: Load metrics repository
INFO: Load metrics repository (done) | time=106ms
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by net.sf.cglib.core.ReflectUtils$1 (file:/C:/windows/system32/config/systemprofile/.sonar/cache/a89f1943fc75b65becd9fb4ecab8d913/sonar-tsql-plugin.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte,int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of net.sf.cglib.core.ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
All illegal access operations will be denied in a future release
INFO: Sensor JavaXmlSensor [java]
INFO: Sensor JavaXmlSensor [java] (done) | time=50ms
INFO: Sensor HTML [web]
INFO: Sensor HTML [web] (done) | time=31ms
INFO: Sensor cs-vf Checks [codescanlang]
INFO: Running CodeScanLang
INFO: Sensor cs-vf Checks [codescanlang] (done) | time=23ms
INFO: Sensor cs-js Checks [codescanlang]
INFO: Running CodeScanLang
INFO: Sensor cs-js Checks [codescanlang] (done) | time=0ms
INFO: Sensor JaCoCo XML Report Importer [jacoco]
INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=30ms
INFO: Sensor T-SQL Sensor [tsql]
INFO: 335 source files to be analyzed
INFO: Load project repositories
INFO: Load project repositories (done) | time=158ms

A total of 9303 files is indexed, but only 335 files are analyzed. Did something go wrong? Could anyone help me with this?

Hi Srinivas,

This is just a warning thrown by a dependent library when running our analyzers on a newer JDK, and nothing that should affect analysis results.

Are you truly concerned that you have more source code than what is getting analyzed, or are you reacting only to the scanner output? I ask because the earlier log statement “9303 files indexed” merely reflects all the files found under the configured source location for the project: ALL the files, including things that may or may not be source code. The later logging about files analyzed by the individual sensors then tells you which of those files are determined to be relevant and within that particular sensor’s scope (e.g. likely all the files ending in .sql in this particular example).

Hope this helps clarify.

Hi Jeff,

All 9303 files are source script files and need to be analysed, but the sonar scanner is not reading them. I just want to know if anything is missing on my end. All the files are .sql files and they are supposed to be scanned.

Can you post a full log of your analysis with DEBUG activated?

Hello Jeff,

I can’t share the complete logs here. Could I have your email address?

Hi Srinivas,

Thanks for supplying your logs privately. I noticed you’re passing the parameter


I think you confused the syntax normally used for inclusions/exclusions with the syntax for file suffixes. You should set this property to just the extensions themselves, like so


Also, the parameter -Dsonar.language=tsql is no longer in use and can be omitted.
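For reference, a sketch of what a corrected invocation might look like. The property name `sonar.tsql.file.suffixes` is assumed here from SonarQube’s usual `sonar.<lang>.file.suffixes` convention; check it against the T-SQL plugin’s documentation, and note that the project key and source path are placeholders:

```shell
# Hypothetical scanner invocation: file suffixes are plain extensions,
# not the **/*.sql glob patterns used by sonar.inclusions/sonar.exclusions.
sonar-scanner \
  -Dsonar.projectKey=Phar_Sql \
  -Dsonar.sources=Database \
  -Dsonar.tsql.file.suffixes=.sql,.prc
```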

Give the adjustment a try.

Hi Jeff,

I have tried passing the parameter as above, and now all 9303 files are getting scanned, but the scan is failing due to the huge report.

INFO: CPD Executor CPD calculation finished (done) | time=13144ms
INFO: Analysis report generated in 16623ms, dir size=139 MB
INFO: Analysis report compressed in 39036ms, zip size=51 MB
INFO: ------------------------------------------------------------------------
INFO: ------------------------------------------------------------------------
INFO: Total time: 19:36.582s
INFO: Final Memory: 9M/37M
INFO: ------------------------------------------------------------------------
ERROR: Error during SonarScanner execution
ERROR: Failed to upload report - HTTP code 413:

413 Request Entity Too Large <script type="text/javascript" src="/ruxitagentjs_ICA2SVfqru_1019720071

Also, for every file I’m getting the below warning message:

WARN: Invalid character encountered in file C:/Program Files (x86)/Jenkins/workspace/Sonarqube_Sql/Database/Procedures/sp_SendRequest.PRC at line 93 for encoding UTF-8. Please fix file content or configure the encoding to be used using property ‘sonar.sourceEncoding’.

I have already configured the parameter in sonar.properties and sonar.project.properties:
sonar.sourceEncoding=UTF-8

Could you please help with this?

Hi Srinivas,

Usually the root cause of such an HTTP 413 error lies in the settings for the proxy between the scanner and SonarQube. Have you set up something like nginx/apache to serve as a reverse proxy for SonarQube? It will need to be configured to allow for sufficiently large requests. SonarQube has no such limit of its own.
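As an illustration, if nginx were fronting SonarQube, the request-size cap would be raised with something like the following (hostname and values are hypothetical; nginx’s default `client_max_body_size` of 1m is far below the ~51 MB report zip and would produce exactly this HTTP 413):

```nginx
# Sketch of an nginx reverse proxy in front of SonarQube.
server {
    listen 443 ssl;
    server_name sonarqube.example.com;    # hypothetical host

    location / {
        proxy_pass http://localhost:9000; # SonarQube default port
        client_max_body_size 100m;        # allow large analysis report uploads
    }
}
```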

Hi Jeff,

We don’t have nginx or apache on our server; we have IIS configured, and
I have tried the below solutions but am still seeing the issue.

Set the IIS setting uploadReadAheadSize to its maximum allowed value.

Enabled “Negotiate Client Certificate” for the certificate bindings.

Set maxAllowedContentLength to its maximum value, 4294967295.

Changed maxRequestEntityAllowed to its maximum (4 GB).

I have also uninstalled IIS and tried to scan the files, but I still get the HTTP 413 error.
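For reference, the IIS settings mentioned above are typically applied in web.config (or applicationHost.config); a sketch with the values described, for anyone comparing configurations:

```xml
<!-- Illustrative IIS fragment: request-size limits tried above.
     maxAllowedContentLength and maxRequestEntityAllowed are in bytes. -->
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="4294967295" />
      </requestFiltering>
    </security>
    <serverRuntime uploadReadAheadSize="2147483647"
                   maxRequestEntityAllowed="4294967295" />
  </system.webServer>
</configuration>
```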

@Jeff_Zapotoczny Would the below approach be possible?

  1. Create multiple Jenkins jobs and divide the code into smaller projects for scanning, but configure a single SonarQube project for all Jenkins jobs.

Is this possible? Can multiple Jenkins jobs report to a single SonarQube project?

I still think you’re trying to solve a problem caused by something in your network infrastructure. When you pasted the error earlier, it included a reference to a “ruxitagentjs” javascript file. Googling that suggests Ruxit Agent is part of Dynatrace’s monitoring tool suite, which may be sitting between the scanner and the server. I suggest you talk to your network/system admin team and find out what’s going on.