[Question][DELTA-SHARING-SPARK]: Getting ModuleNotFound Error #2890
Labels: bug (Something isn't working)

Comments
aimtsou changed the title from [BUG][DELTA-SHARING-SPARK]: Getting ModuleNotFound Error to [Question][DELTA-SHARING-SPARK]: Getting ModuleNotFound Error on May 22, 2024
The solution is the following. Since we are using Databricks DBFS to store the profile file, we need to do:

test_data = profile_file.replace("/dbfs", "dbfs:") + "#{Share}.{Database}.{table}"
delta_sharing.load_as_spark(url=test_data)

Furthermore, according to Microsoft, when using Azure Databricks you do not need to install any delta-sharing-spark library, as it is already supported.
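The path rewrite above can be sketched as a small, self-contained example. The profile-file location and the share/schema/table names here are hypothetical placeholders, not values from the issue:

```python
# Minimal sketch of the fix: rewrite the DBFS FUSE path ("/dbfs/...") as a
# "dbfs:" URI, then append the "#<share>.<schema>.<table>" fragment that
# delta_sharing.load_as_spark expects in its table URL.
profile_file = "/dbfs/FileStore/shares/open-datasets.share"  # hypothetical path
table_url = profile_file.replace("/dbfs", "dbfs:") + "#my_share.my_schema.my_table"
print(table_url)

# On a cluster where the delta-sharing Python package is available,
# the table would then be loaded as a Spark DataFrame:
# import delta_sharing
# df = delta_sharing.load_as_spark(url=table_url)
```

The key point is that Spark resolves `dbfs:` URIs, while `/dbfs/...` is the FUSE mount path used by local file APIs; passing the FUSE path directly is what fails.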
Bug
Following the documentation from [1] and [2] results in:
ModuleNotFoundError: No module named 'delta.exceptions.captured'; 'delta.exceptions' is not a package
Which Delta project/connector is this regarding?
Describe the problem
I am using the open dataset from delta.io; I am able to list the shared tables, but reading a table fails with the error above.
Steps to reproduce
Observed results
Without the library installed we are getting:
With the library installed (io.delta:delta-sharing-spark_2.12:3.1.0) we are getting:
ModuleNotFoundError: No module named 'delta.exceptions.captured'; 'delta.exceptions' is not a package
Expected results
To be able to read delta tables.
Further details
If we investigate in another notebook by trying the import directly:
With io.delta:delta-sharing-spark_2.12:3.1.0 installed, the import gives the same ModuleNotFoundError. After uninstalling the library from the cluster, the import succeeds. Some older examples install delta-core instead, which has since become delta-spark 3.1.0, but using it makes no difference.
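The notebook probe described above can be sketched as follows. This is a hypothetical diagnostic, not the author's exact code; which branch runs depends on whether a conflicting delta-sharing-spark jar is on the cluster:

```python
import importlib

# Attempt the submodule import that fails on the affected cluster.
# With io.delta:delta-sharing-spark_2.12:3.1.0 installed this raises
# ModuleNotFoundError ("'delta.exceptions' is not a package"); without
# it, and with delta-spark available, the import succeeds.
try:
    importlib.import_module("delta.exceptions.captured")
    status = "import ok: delta.exceptions.captured is importable"
except ModuleNotFoundError as exc:
    status = f"ModuleNotFoundError: {exc}"
print(status)
```

Running this with and without the library attached to the cluster makes the conflict easy to demonstrate.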
Trying this as well reproduces the aforementioned problem.
Environment information
According to the Databricks Runtime 14.3 LTS release notes page:
Willingness to contribute
I am willing to contribute. In any case, the documentation in the old delta-sharing repository is outdated; the same applies to the delta-sharing server Docker image and the delta-sharing PyPI package. On Maven, both the server and the client are at 1.0.4.