feat: Add database to read_ #9165
Hey @markdruffel-8451 -- thanks for raising this! I think this makes a bunch of sense for the backends where we have catalog/database support.

As an interim workaround, you can make use of a private context manager to handle setting and unsetting the catalog and database (note that this is a private API and might break without warning, but hopefully won't break before we add proper support):

```python
with ispark._active_catalog_database("comms_media_dev", "dart_extensions"):
    idf = ispark.read_parquet(source="abfss:/my_parquet", table_name="test_table")
```
Hey @markdruffel-8451 -- I'm going to keep this open so we can track adding the `database` argument to the `read_*` functions.
This also applies for
Is your feature request related to a problem?
The `read_*` function family allows the user to name a table, but the table is always created in the default catalog and database.
What is the motivation behind your request?
If I run the code below I get an error:

> [[TEMP_VIEW_NAME_TOO_MANY_NAME_PARTS](https://docs.microsoft.com/azure/databricks/error-messages/error-classes#temp_view_name_too_many_name_parts)] CREATE TEMPORARY VIEW or the corresponding Dataset APIs only accept single-part view names, but got: `comms_media_dev.dart_extensions.test_table`. SQLSTATE: 428EK
I can easily resolve this by setting the current catalog and database before the read. However, this is only a workaround: I'm using ibis in a data pipeline, and I don't want concurrent nodes to set the current catalog and database outside the write operation itself, because they might conflict.
Describe the solution you'd like
Ideally the `read_*` functions would have a `database` parameter, but allowing `table_name` to accept `{catalog}.{database}.{table}` would work as well.

What version of ibis are you running?
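Supporting qualified names in `table_name` would presumably mean splitting the name into its parts somewhere on the backend side. A minimal sketch of what that parsing could look like (a hypothetical helper, not part of the ibis API):

```python
def split_table_name(name: str):
    """Split a possibly qualified table name into (catalog, database, table).

    Missing leading parts are returned as None, so a plain single-part
    table name keeps working unchanged.
    """
    parts = name.split(".")
    if len(parts) > 3:
        raise ValueError(f"too many name parts: {name!r}")
    # Left-pad with None so the bare table name is always the last element.
    catalog, database, table = [None] * (3 - len(parts)) + parts
    return catalog, database, table


print(split_table_name("comms_media_dev.dart_extensions.test_table"))
# -> ('comms_media_dev', 'dart_extensions', 'test_table')
print(split_table_name("test_table"))
# -> (None, None, 'test_table')
```

The backend could then register the temporary view under the bare table name while switching to the requested catalog and database, avoiding the TEMP_VIEW_NAME_TOO_MANY_NAME_PARTS error above.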
10.0.0.dev49
What backend(s) are you using, if any?
pyspark