PySpark - Incompatible parameter type & Unsupported operand #798
Comments
From my inspection, part (or possibly all) of this issue is caused by Pyre not recognizing an operator that is defined by assigning a `cast(...)`-wrapped function to `__add__`, which is how PySpark defines it:

```python
__add__ = cast(
    Callable[["Column", Union["Column", "LiteralType", "DecimalLiteral"]], "Column"],
    _bin_op("plus"),
)
```

Here is a more understandable demo of the same pattern:

```python
class MyInt:
    int_val: int = 0

    def real_add_func(self, other_int: int) -> int:
        return self.int_val + other_int

    __add__ = real_add_func

a_int: int = MyInt() + 0
```

Pyre playground: here
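For comparison, here is a minimal runnable sketch of both shapes: a class mirroring PySpark's `cast()`-assignment pattern (which runs fine at runtime even though Pyre rejects the `+` call), and a hypothetical workaround that declares `__add__` as an ordinary method, which type checkers understand. The class names and the workaround are my own illustration, not from the issue:

```python
from typing import Callable, cast


class MyIntCast:
    """Mirrors PySpark's pattern: __add__ is a cast()-wrapped helper.

    This works at runtime, but Pyre does not treat the assignment as
    defining the + operator.
    """
    int_val: int = 0

    def _real_add(self, other_int: int) -> int:
        return self.int_val + other_int

    __add__ = cast(Callable[["MyIntCast", int], int], _real_add)


class MyIntMethod:
    """Workaround sketch: defining __add__ as a plain method, which
    type checkers recognize as the + operator."""
    int_val: int = 0

    def __add__(self, other_int: int) -> int:
        return self.int_val + other_int


# Both behave identically at runtime:
print(MyIntCast() + 5)    # 5
print(MyIntMethod() + 5)  # 5
```

Since the runtime behavior is identical, the difference is purely in what the type checker can see: the assignment form hides the operator definition behind a `cast`, while the method form states it directly.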
Pyre Bug
Bug description
The `pyspark` dataframe functions generate `Unsupported operand [58]` and `Incompatible parameter type [6]`, even though they are valid and even suggested in the Spark documentation.

Reproduction steps
Python snippet `sample.py`:

Expected behavior
Running `pyre check` should not report any issues, as the code is valid. See the docs:
Logs
pyre_rage.log