java.lang.ClassCastException thrown after making squid:S1612 changes

squid:S1612 (Lambdas should be replaced with method references)

I have a lambda (value -> (value != null)) in my project, but SonarQube recommended changing it to Objects::nonNull.

After making the above change, the application stopped working, throwing a java.lang.ClassCastException.

Example:

DataStream configurations = env
  .addSource(createConfigurations(kafkaProps))
  .uid(Constants.KAFKA_SOURCE)
  .filter(value -> (value != null));

I made the change to

DataStream configurations = env
   .addSource(createConfigurations(kafkaProps))
   .uid(Constants.KAFKA_SOURCE)
   .filter(Objects::nonNull);

Exception: java.lang.ClassCastException

From which package did you import Objects class?

import java.util.Objects;

Could you provide a bit more of the exception context? (The full stacktrace would be nice.) At first look, the two pieces of code you are showing should really be equivalent, so I suspect the issue comes from elsewhere.
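As an aside, the equivalence claimed by the rule can be checked outside of Flink: with plain java.util.stream, the lambda and the method reference behave identically. A minimal standalone sketch (class and variable names are illustrative):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class NonNullEquivalence {
    public static void main(String[] args) {
        // The lambda flagged by squid:S1612 and its method-reference form.
        Predicate<String> lambda = value -> (value != null);
        Predicate<String> methodRef = Objects::nonNull;

        List<String> input = Arrays.asList("a", null, "b");

        // Both forms drop the null element and keep the rest.
        List<String> viaLambda = input.stream()
                .filter(lambda)
                .collect(Collectors.toList());
        List<String> viaMethodRef = input.stream()
                .filter(methodRef)
                .collect(Collectors.toList());

        System.out.println(viaLambda.equals(viaMethodRef)); // prints "true"
        System.out.println(viaLambda);                      // prints "[a, b]"
    }
}
```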

Exception :

java.lang.ClassCastException: com.*****.*****.flink.aggregates.Configurations cannot be cast to com.*****.*****.core.entity.Event
	at org.apache.flink.streaming.api.operators.StreamFilter.processElement(StreamFilter.java:39)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:718)
	at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:696)
	at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
	at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecord(AbstractFetcher.java:362)
	at org.apache.flink.streaming.connectors.kafka.internals.SimpleConsumerThread.run(SimpleConsumerThread.java:382)

How does this stacktrace relate to the piece of code you showed us in the first post? I.e., where is this exception raised in your code? I see in the stacktrace that the Kafka streaming connector is trying to do an invalid cast, and I don't see from the information in your post how this can relate to the change suggested by the rule.

Is this exception really linked to that sole change in your code ?

DataStream and filter are both part of the flink-cep library.

SonarQube suggests changing the expression inside the filter to filter(Objects::nonNull), but the problem is that the code fails after the replacement, even though java.util.Objects is imported (it is part of Java 8).

I am sorry, but I am still struggling to see what the issue could be and how you are linking it to rewriting this lambda as a method reference.

Would you be able to share a file diff, based on the change suggested by SonarQube, that leads to the exact stacktrace you indicated?
I am also interested in the imports in your file.

And finally : which version of Java is used to compile your project ?

Thanks

@avinash.tripathy I’m having the same issue. Were you able to get it fixed? If yes, what was the cause?

Based on the thread (DEPRECATED) Apache Flink User Mailing List archive. - Discarding bad data in Stream, it looks like we need to use a filter class and do it the Flink way.
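For reference, the "filter class" approach from that mailing list thread can be sketched as a named filter implementation. This is a standalone sketch: the FilterFunction interface declared below is a stand-in mirroring org.apache.flink.api.common.functions.FilterFunction so the example compiles without Flink on the classpath; in an actual job you would import the Flink interface instead.

```java
public class NonNullFilterExample {

    // Stand-in for org.apache.flink.api.common.functions.FilterFunction;
    // in a real Flink job, import the Flink interface instead of declaring this.
    interface FilterFunction<T> {
        boolean filter(T value) throws Exception;
    }

    // A named filter class, as suggested on the mailing list, used instead of
    // a lambda or method reference passed directly to DataStream.filter(...).
    static class NonNullFilter<T> implements FilterFunction<T> {
        @Override
        public boolean filter(T value) {
            return value != null;
        }
    }

    public static void main(String[] args) throws Exception {
        NonNullFilter<String> nonNull = new NonNullFilter<>();
        System.out.println(nonNull.filter("event")); // prints "true"
        System.out.println(nonNull.filter(null));    // prints "false"
    }
}
```

In the original snippet this would be wired in as .filter(new NonNullFilter<>()) in place of the lambda or method reference.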

Hey @abhitej_boorla ,
This thread is almost 2 years old. Please open a new thread with a valid reproducer.

Thanks.
Michael