Improve compile-error feedback messages #998

Open
monoti opened this issue Oct 27, 2021 · 3 comments
Labels
enhancement New feature or request

Comments

@monoti

monoti commented Oct 27, 2021

Summary

When I run sbt stryker, compilation fails because Stryker4s tries to mutate a >= or <= operation to ==.
The correct Spark operator for an equality condition is ===.

I have prepared a repo to reproduce the issue: stryker-spark-example.

Running sbt stryker throws the error below.

[error] /Users/stryker-spark-example/target/stryker4s-572897741400037258/src/main/scala/SimpleDFHelper.scala:27:32: value or is not a member of Boolean
[error]         df.where($"col1" == 10 or col("col2") <= "B")
[error]                                ^
[error] /Users/stryker-spark-example/target/stryker4s-572897741400037258/src/main/scala/SimpleDFHelper.scala:35:47: type mismatch;
[error]  found   : Boolean
[error]  required: org.apache.spark.sql.Column
[error]         df.where($"col1" >= 10 or col("col2") == "B")
[error]                                               ^
[error] two errors found
[error] (Compile / compileIncremental) Compilation failed
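For context, here is a hypothetical sketch (not the exact contents of SimpleDFHelper.scala in the linked repo) of the shape of code that triggers these errors: Spark's Column comparison operators return a Column, while Scala's universal == returns a Boolean, which has no `or` method.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

object SimpleDFHelper {

  // Original: Column >= Int and Column <= String both return a Column,
  // and Column.or(Column) returns a Column, so df.where accepts it.
  def filterRows(df: DataFrame): DataFrame = {
    val spark = df.sparkSession
    import spark.implicits._ // brings the $"..." column interpolator into scope
    df.where($"col1" >= 10 or col("col2") <= "B")
  }

  // Stryker4s mutant (>= replaced by ==): universal equality returns a
  // Boolean, so `or` is "not a member of Boolean" and the mutant does
  // not compile:
  //   df.where($"col1" == 10 or col("col2") <= "B")
}
```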

Stryker4s config

stryker4s {
  mutate: [ "src/main/scala/*.scala"]
}

Stryker4s environment

stryker4s 0.14.0

Your Environment

| software | version(s) |
| --- | --- |
| Scala version | 2.13.6 |
| Spark version | 3.2.0 |
| Build tool & version | 1.3.13 |
| Operating System | macOS Big Sur |

Note that the issue occurs on Scala 2.12.x and Spark 2.4.x as well.

@hugo-vrijswijk
Member

Hi, does Stryker4s still continue after this error? Since 0.14.0, Stryker4s is able to handle compile errors and mark those mutants in the report. We don't have enough type information to know that Spark uses ===, but the run should still be able to continue.

@monoti
Author

monoti commented Nov 1, 2021

Hi @hugo-vrijswijk,
Yes, it can continue, thanks. Are there any plans to make Stryker's mutations support Spark?

@hugo-vrijswijk
Member

@monoti I don't think that will happen anytime in the near future. If we ever did type analysis for every mutant, we'd have to check possible mutations to both == and ===, which in Scala can come from a lot of different places.

The mutant is marked as a compile error, but I think the sbt plugin should have better output so it doesn't show this as an error (it isn't one; it's just part of the process of removing non-compiling mutants).
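As an illustration of that distinction (an added sketch, not part of the original comment): Spark's Column defines its own === that returns a Column, while Scala's built-in == is universal equality and returns a Boolean, so mutating >= to == changes the expression's type.

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.col

object EqualityOperators {
  val sparkEquality: Column  = col("col1") === 10 // Spark's Column operator, usable in df.where
  val scalaEquality: Boolean = col("col1") == 10  // Scala's universal equality, just a Boolean
}
```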

@hugo-vrijswijk hugo-vrijswijk changed the title Spark: Mutation does not support for Spark filter syntax Improve compile-error feedback messages Jan 3, 2022
@hugo-vrijswijk hugo-vrijswijk added the enhancement New feature or request label Mar 16, 2022