Accessing storage with shared integration Databricks? #207
What is the recommended way to access the raw, enriched/curated, and workspace data lakes from the shared integration Databricks workspace? Using a service principal and mount point?
Answered by marvinbuss, Nov 11, 2021
Hi @baatch,
The recommended approach would be to use Credential Passthrough (OAuth 2.0). For automated workflows, you should use a Service Principal and store its secrets (clientId, clientSecret) in a Key Vault-backed secret scope.
The respective integration and product teams can use their own Key Vault within their resource group (RG) to store the secrets.
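To illustrate the Service Principal approach above, here is a minimal sketch of a Databricks notebook cell that reads the Service Principal credentials from a Key Vault-backed secret scope and configures Spark for OAuth access to ADLS Gen2, without a mount point. The scope name `kv-backed-scope`, the secret key names, the storage account, container, tenant ID, and path are all placeholder assumptions, not values from this discussion:

```python
# Placeholder values -- replace with your own (assumptions, not from this thread).
storage_account = "<storage-account>"
tenant_id = "<tenant-id>"

# Read the Service Principal credentials from a Key Vault-backed secret scope.
client_id = dbutils.secrets.get(scope="kv-backed-scope", key="clientId")
client_secret = dbutils.secrets.get(scope="kv-backed-scope", key="clientSecret")

# Configure this Spark session for OAuth 2.0 (client credentials) against ADLS Gen2.
spark.conf.set(
    f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(
    f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
    client_id)
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
    client_secret)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# Access data directly via the abfss:// URI (hypothetical container/path).
df = spark.read.parquet(
    f"abfss://raw@{storage_account}.dfs.core.windows.net/some/path")
```

This configuration fragment only runs inside a Databricks runtime (where `spark` and `dbutils` are predefined); reading via `abfss://` URIs avoids cluster-wide mount points, so each team's credentials stay scoped to their own sessions.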