1011 Running sql using sparkconnect should not print full stack trace #1012
base: master
Conversation
src/sql/util.py
Outdated
@@ -559,6 +559,7 @@ def is_non_sqlalchemy_error(error):
        # Pyspark
        "UNRESOLVED_ROUTINE",
        "PARSE_SYNTAX_ERROR",
        "AnalysisException",
After looking through the code, I think adding AnalysisException here will solve the issue, since PARSE_SYNTAX_ERROR works as expected. AnalysisException covers all these error conditions.
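For context, here is a minimal sketch of the string-matching approach this comment refers to; the function body is abridged to the Pyspark-related entries (the full list lives in src/sql/util.py):

```python
def is_non_sqlalchemy_error(error):
    """Return True if the error message matches a known non-SQLAlchemy driver error."""
    specific_db_errors = [
        # ...other driver-specific markers elided...
        # Pyspark
        "UNRESOLVED_ROUTINE",
        "PARSE_SYNTAX_ERROR",
        "AnalysisException",
    ]
    # Substring match against the rendered error message
    return any(msg in str(error) for msg in specific_db_errors)
```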
I just need to test it somehow. I'll try to package jupysql and install it in a Spark environment.
You can install it like this:
pip install git+https://github.com/b1ackout/jupysql@running-sql-using-sparkconnect-should-not-print-full-stack-trace
I tested it but no luck: the error message doesn't contain AnalysisException, only the SQL error conditions listed here, which are included in AnalysisException. So instead of including all of these in the list (which would also need to be updated regularly), I think checking whether the error is an instance of AnalysisException would be a lot cleaner.
    ]
    return any(msg in str(error) for msg in specific_db_errors)
    is_pyspark_analysis_exception = (
If AnalysisException can be imported, this checks whether the error is an instance of pyspark's AnalysisException and handles it accordingly.
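A minimal sketch of what that guarded check can look like, assuming the import is wrapped so environments without pyspark keep working (the exact import path and fallback are assumptions, not necessarily what this PR ships):

```python
try:
    # AnalysisException is public in pyspark >= 3.4; Spark Connect raises its
    # own AnalysisException, which (in recent pyspark) subclasses this one
    from pyspark.errors import AnalysisException
except ImportError:
    AnalysisException = None


def is_non_sqlalchemy_error(error):
    specific_db_errors = [
        # ...driver-specific markers as in the diff above...
        "pyodbc.ProgrammingError",
        "DB::Exception:",
    ]
    # Only attempt the isinstance check when pyspark is installed
    is_pyspark_analysis_exception = (
        isinstance(error, AnalysisException) if AnalysisException else False
    )
    return (
        any(msg in str(error) for msg in specific_db_errors)
        or is_pyspark_analysis_exception
    )
```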
@@ -556,11 +562,14 @@ def is_non_sqlalchemy_error(error):
        "pyodbc.ProgrammingError",
        # Clickhouse errors
        "DB::Exception:",
        # Pyspark
Removed these, as they are covered by AnalysisException.
Describe your changes
When short_errors is enabled, show only the Spark SQL error and not the full stack trace.
Issue number
Closes #1011
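To illustrate the intended behavior, a hypothetical notebook session (the table name is made up; short_errors is the jupysql option mentioned above):

```python
# Load the magic and connect through Spark Connect (connection setup elided)
%load_ext sql
%config SqlMagic.short_errors = True

# A query that fails analysis, e.g. referencing a table that doesn't exist
%sql SELECT * FROM non_existent_table
# Before this change: pyspark's full stack trace is printed
# After this change: only the AnalysisException message is shown
```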
Checklist before requesting a review
pkgmt format
📚 Documentation preview 📚: https://jupysql--1012.org.readthedocs.build/en/1012/