SonarQube processor crashes when the ESLint report contains errors

When certain ESLint rules report an error, the SonarQube processor fails while reading (or saving, we are not sure which) the reported file:

The error is:

java.lang.IllegalArgumentException: -1 is not a valid line offset for a file

This happens when a lint error points to position 0 on a line: the processor crashes because it always subtracts 1 from the “column” value (the “column” key in the JSON report file), producing -1.

In our report, the rule "eslint-comments/no-unlimited-disable" marks the error position as 0 in the “column” key, and SonarQube then tries to access range -1, presumably to grab the full stretch of affected text, which is where the problem occurs. The same happened with other rules in our case, such as "eslint-comments/no-unused-disable".

Since the SonarQube logic does not check whether the position is 0 before subtracting, it ends up asking for position -1 and this condition appears.
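For illustration, an ESLint JSON report entry of the shape below would trigger the crash (the file path and message text are made up; the relevant part is "column": 0):

```json
[
  {
    "filePath": "/home/user/TCapp/src/index.js",
    "messages": [
      {
        "ruleId": "eslint-comments/no-unlimited-disable",
        "severity": 2,
        "message": "Unexpected unlimited 'eslint-disable' comment.",
        "line": 1,
        "column": 0,
        "endLine": 1,
        "endColumn": 1
      }
    ]
  }
]
```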

Below we present one key assumption about why this is happening. It remains unverified due to time constraints on our end:

  • File formatting: maybe some difference in file formatting (e.g. UTF-8 handling) - even though our files share the same encoding - results in slightly different line positions/lengths, or maybe SonarQube always looks one position behind, which makes a position at the very beginning or end of a line always invalid for it.

The field used to mark the position within a line in the ESLint report is “column”. The workaround devised by one of our developers was:

  • Reduce all columns by 1, so line-end positions match what SonarQube “sees” instead of an invalid position. Not ideal, but the only feasible solution for now;
  • Change all columns from 0 to 1, so it never searches at position -1;
  • Apply relative paths for the files.

This approach is what worked for us in the end. Below is the script, which must be run with Node after yarn lint:

const fs = require('fs');

const fileName = 'lintReport.json';

// Absolute paths don't go well with Sonar, so we transform them into local references.
const absolutePathRegexp = /(^|")(\/[a-z0-9_@-]+)+\/TCapp\//gmi;

// Matches every "column" key/value pair in the JSON report.
const column0Regexp = /"column":(\s)?([0-9]{1,})/gmi;

const data = fs.readFileSync(fileName, 'utf-8');

function feedBack(passed, subject) {
  // eslint-disable-next-line no-console
  console[passed ? 'log' : 'error'](passed ? `Successfully removed ${subject}!` : `Something went wrong :(. File still has ${subject}.`);
}

// Sonar accesses one position before the declared column, hence on column 0 it
// breaks trying to access position -1. Shift every column down by one, never
// going below 1.
const fixedData = data.replace(column0Regexp, (subString) => {
  const splitData = subString.split(':');
  const columnVal = parseInt(splitData[1], 10);
  return `${splitData[0]}:${splitData[1][0] === ' ' ? ' ' : ''}${columnVal - 1 >= 1 ? columnVal - 1 : 1}`;
}).replace(absolutePathRegexp, '$1');

feedBack(!fixedData.includes('"column":0'), 'column 0s');
feedBack(!absolutePathRegexp.test(fixedData), 'root paths');

// fs.writeFileSync is synchronous and takes no callback.
fs.writeFileSync(fileName, fixedData, 'utf-8');
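To make the transform easy to verify in isolation, here is a small self-contained sketch of the column-shifting logic (shiftColumns is a hypothetical helper name we use for illustration; the regex and replacement are the same as in the script above):

```javascript
// Matches every "column" key/value pair in the ESLint JSON report.
const column0Regexp = /"column":(\s)?([0-9]{1,})/gmi;

// Shift each "column" value down by one, flooring at 1, and preserve
// any whitespace after the colon. "endColumn" keys are left untouched
// because the regex requires a quote immediately before "column".
function shiftColumns(json) {
  return json.replace(column0Regexp, (subString) => {
    const splitData = subString.split(':');
    const columnVal = parseInt(splitData[1], 10);
    return `${splitData[0]}:${splitData[1][0] === ' ' ? ' ' : ''}${columnVal - 1 >= 1 ? columnVal - 1 : 1}`;
  });
}

console.log(shiftColumns('{"column": 0}'));               // → {"column": 1}
console.log(shiftColumns('{"column":7,"endColumn":9}'));  // → {"column":6,"endColumn":9}
```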


Thanks in advance for considering a fix for this issue in a future release. We are on SonarQube Developer Edition Version 9.8 (build 63668).


Hey Douglas,

Thank you for bringing this issue to us. Could you please provide us with a reproducer? You can paste the piece of code that provokes the crash directly into your post.

The issue seems to be that the eslint-comments/no-unused-disable rule returns issues with 0 as column which makes the JS analyzer crash. Is that correct?

How are the file paths related to this?

Best Regards,